From multi-cohort Phase I’s to integrated Phase III’s, FDA eyes efficiency

Two guidance documents, released a month apart this summer, suggest that the FDA is taking up a concern that sponsors and patients feel all too acutely: the need for speed.

That’s not a word anyone involved in clinical research takes lightly, with lives and knowledge on the line. But by issuing guidance on EHR integration in July, followed this month by guidance on expansion cohorts in Phase I oncology trials, the United States’ foremost regulatory body is placing a premium on efficiency. It’s not fair, or accurate, to claim that the policymakers at the FDA have never been interested in getting safe, effective medicines to market quickly. Like the population they serve, these policymakers are moms and dads, brothers and sisters, and patients. But they are also stalwarts of caution. That’s why their recent guidance communicates so much, even beyond its content.

First, what is the content? July’s guidance document, Use of Electronic Health Record Data in Clinical Investigations, “clarifies [the] FDA’s expectations when EHRs are used as a source of data in [prospective] clinical investigations.” Unabashedly, the guidance “encourages sponsors and clinical investigators to work with entities that control EHR systems, such as health care organizations, to use EHR and EDC systems that are interoperable or fully integrated.” (The guidance defines interoperable systems as those capable of piping data from one to the other through a validated process. Think of a well-tested and well-documented API; a sketch of the idea follows the list below. Fully integrated systems, by contrast, “allow clinical investigators to enter research data directly into the EHR.”) Guidance sections with the most specific recommendations include those on:

  • structured and unstructured data (the first being preferred, but the second allowed if additional reliability and quality measures are in place)
  • validation (required for the EDC, but not for the EHR)
  • data from multiple EHR systems (“data from another institution’s EHR system may be used and transmitted to the sponsor’s EDC system provided that data sharing agreements are in place”)
  • ONC Health IT Certification for EHR systems (“FDA encourages the use of such certified EHR systems”)
  • data custody and audit trails
  • maintaining the blind across all relevant systems
  • informed consent that makes the use of EHR data clear
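
As promised above, here is what the interoperable case might look like in practice. This is a minimal sketch, assuming a FHIR-style EHR endpoint and a hypothetical EDC ingestion endpoint; the URLs, query parameters, and record shape are illustrative, not any vendor’s actual interface.

```python
# A minimal sketch of "interoperable" EHR-to-EDC transfer. The endpoints and
# field mapping below are hypothetical; a validated process would wrap this
# in documented testing, error handling, and audit logging.
import requests

EHR_BASE = "https://ehr.example.org/fhir"    # hypothetical FHIR server
EDC_BASE = "https://edc.example.org/api/v1"  # hypothetical EDC endpoint

def transfer_lab_result(patient_id: str, loinc_code: str, subject_id: str) -> None:
    # 1. Query the EHR for the most recent observation of interest.
    resp = requests.get(
        f"{EHR_BASE}/Observation",
        params={"patient": patient_id, "code": loinc_code,
                "_sort": "-date", "_count": 1},
        timeout=30,
    )
    resp.raise_for_status()
    entries = resp.json().get("entry", [])
    if not entries:
        return  # nothing to transfer

    obs = entries[0]["resource"]

    # 2. Map the FHIR Observation onto the EDC's expected record shape.
    record = {
        "subject_id": subject_id,  # study identifier, not the EHR patient ID
        "test_code": loinc_code,
        "value": obs["valueQuantity"]["value"],
        "unit": obs["valueQuantity"]["unit"],
        "collected_at": obs["effectiveDateTime"],
        "source": "EHR",           # provenance for the audit trail
    }

    # 3. Push the mapped record to the EDC through its documented interface.
    requests.post(f"{EDC_BASE}/records", json=record, timeout=30).raise_for_status()
```

The point is not the particulars but the shape: one validated, well-documented path from the system of care to the system of research.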

The recommendations are sound, if unsurprising. But the scope, which includes prospective studies but excludes postmarketing and registry studies, signals a focus on investigations designed to make new therapies available, or existing therapies available for new indications. A need, in other words, for speed.

What about the August guidance? Expansion Cohorts: Use in First-In-Human Clinical Trials to Expedite Development of Oncology Drugs and Biologics aims to bring order, and a seal of acceptability, to trials that are “intended to expedite development by seamlessly proceeding from initial determination of a potentially effective dose to individual cohorts that have trial objectives typical of phase 2 trials (i.e., to estimate anti-tumor activity).” These trials pose particular risks, most salient among them the possibility of exposing more participants than necessary to toxic or suboptimal doses. Because of these risks, sponsors should consider an expansion cohort design only for studies related to indications without curative therapies. What’s more, to mitigate these risks, the guidance states that “it is imperative that sponsors establish an infrastructure to streamline trial logistics, facilitate data collection, and incorporate plans to rapidly assess emerging data in real time and to disseminate interim results to investigators, institutional review boards (IRBs), and regulators.”

Sections V, VI, and VII, dealing with considerations based on cohort objectives, statistical considerations, and safety considerations respectively, don’t lend themselves to summary; they set out study conduct requirements for physicians, pharmacologists, pharmacovigilance officers, and biostatisticians. (To get a sense of the detail, note that the guidances on Food-Effect Bioavailability and Fed Bioequivalence Studies and on Pharmacokinetics in Patients With Impaired Renal Function — Study Design, Data Analysis, and Impact on Dosing and Labeling are both prerequisite reading.) That such specific directives are issued here in a draft guidance hints at an urgency understood: getting quickly to Phase III safety and efficacy findings starts with a Phase I that can transition quickly to a Phase II.

Beyond the text of these documents, the FDA’s response to the industry’s need for speed is audible and commendable. Why did the response (or so explicit a response) come now? We can only speculate. Certainly, the roads to knowledge are getting faster by the day. Just consider the early successes of machine learning. And despite some therapeutic victories, cancer is by no means in retreat. The will and the way are clear. The FDA has just supplied a good deal of the how.


Unsure whether your data capture system is built for speed? Versatile study build features, a robust API, and forms that drive “quality on entry” are all must-haves.


“You’ve Got Findings!”: To the ’90s and Back with GCP E6(R2)

Imagine a time when phones boasted no smarts. When technophiles worshipped noisy gadgets called modems. When The Human Genome Project sounded (to most ears) like the name of a progressive rock band.

Not to your ears, though. Suppose further that you’re an accomplished research professional of the era, medically trained, with degrees and vision to spare. Someday, you’re certain, paper records will give way to more efficient storage and retrieval. Zip disks, maybe? And this digital highway that’s all the rage, the “Web”, well, who knows? When adequate security measures are in place, maybe sites could email a digital CRF to the data manager.


A relic from pre-history (i.e. before Facebook timelines). Scholars are still uncertain of the fate of this “virtual settlement.”

It’s that foresight which, in the mid-1990s, has earned you membership in an international committee of your peers, a committee tasked with devising guidelines to “facilitate the mutual acceptance of clinical data” by regulatory authorities in Europe, Japan, and the United States. At least, that’s the operational goal, and a worthy one, too. You started in this business to push the best research out of silos and ivory towers and into the real world. But that mission is fraught with potential dangers. Not forty years prior, inadequate testing of an immunomodulatory drug led to the births of more than 10,000 children with limb malformations, in Germany and around the world. Just three years ago, in a trial conducted by the NIH, five participants died of liver toxicity following experimental treatment for hepatitis B. Whatever standard you propose for maximizing the benefits of clinical research, it had better provide “public assurance that the rights, safety and well-being of trial subjects are protected.” After all, your committee is bound by the ethical good in its pursuit of the clinical good. You suggest calling the standard Good Clinical Practice.

Welcome to the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use. And welcome to 1996.

But don’t get comfortable. This is the story of GCP, more precisely the Guideline for Good Clinical Practice, and it is headed right back to the present, with its first amendment already in force in Europe. Why did the ICH amend its original guideline? Why twenty years later? There’s no single answer, but all the best answers revolve around one theme. Just as our phones have gone on to earn a postdoc in the last 22 years, the possibilities for research have matured, too. What’s more, their modern histories are linked, with technological breakthroughs inspiring the search for clinical ones, and clinical triumphs spurring ever more capable technology. Behind that push and pull, the need to ensure safety and quality remains constant. The story of the GCP’s amendment is a story of time-honored values upheld in new ways.

Preserving the Past

In a literal sense, E6(R1) hasn’t been replaced at all. The authors of E6(R2) chose an integrated addendum format when drafting the updated guidance: the original language, with all that it mandates, remains, with the addendum text set alongside it.

The addendum text clarifies the scope and meaning of preceding terms. Twenty years ago, the word “storage” brought to mind file cabinets. Early in this millennium, we might have pictured hard drive volumes. Today, our heads are in the cloud. But in all of these contexts, storage that is secure and accessible is a must. It’s how we achieve it that differs.

Embracing the New

But just how do we achieve it, meaning everything from security to safety to data quality, in 2018, when so many tasks once considered part of an “honest day’s work” are now specialties of automation, algorithms, and analytics?

Not even the full sixty pages of the document provide a specific, once-and-forever answer. (As a guideline, it shouldn’t.) What E6(R2) does propose is a 21st-century mantra for maximizing safety and quality. The mantra sounds like this:

“Oversee it all. Take action on what matters.”

There are some loaded terms in this phrase. “Oversee” might translate to “maintain real-time digital access,” particularly in the case of processes occurring daily all over the world, such as eConsent. And “take action” needs to cover both proactive and corrective measures. But with those common-sense glosses in mind, we could do worse than take the directive above as the crux of E6(R2). What evidence do we have for this reading? Before we look at specific clauses in R2, we can gain a strong sense of how the amendment differs from the original by looking at the frequency of key terms.

[Figure: frequency of key terms and contexts, ICH GCP E6(R2) vs. E6(R1)]
Clues that the GCP has grasped an important truth. The sheer volume of data now available to us raises probabilistic, “big data”-inspired thinking to a virtue.
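
The comparison in the figure is easy to reproduce. Here is a quick sketch that tallies key terms in the two guideline texts; it assumes you have extracted each PDF to a plain-text file, and both the filenames and the term list are ours, chosen for illustration.

```python
# Naive term counts across E6(R1) and E6(R2). Substring matching means
# "risk" also counts occurrences inside "risk-based"; good enough for a
# rough comparison like the one charted above.
import re
from collections import Counter
from pathlib import Path

TERMS = ["risk", "risk-based", "electronic", "validation", "quality management"]

def term_frequencies(path: str) -> Counter:
    text = Path(path).read_text(encoding="utf-8").lower()
    return Counter({t: len(re.findall(re.escape(t), text)) for t in TERMS})

r1 = term_frequencies("ich_e6_r1.txt")  # assumed local extract of R1
r2 = term_frequencies("ich_e6_r2.txt")  # assumed local extract of R2

for term in TERMS:
    print(f"{term:>20}: R1={r1[term]:3d}  R2={r2[term]:3d}")
```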

Terms like “risk-based” simply weren’t part of the vernacular in 1996, so it’s no surprise that they should make their first appearance only now. But the concept of risk is as old as modern, statistically informed research itself. So why does the word itself occur twice as frequently in R2 as in R1?

The answer is that risk is omnipresent now, not primarily in the sense of an unintended consequence, but as a factor in decision-making. How did this come to be the case? Along almost any dimension we consider (target enrollment, sources of data, self-reports), research is doing, and producing, more than it ever has. In some cases, like the breadth of genomic factors analyzed, research is doing more than we thought possible back in 1996. On the other hand, the number of hours in a day has remained disconcertingly flat (anyone working on this?). Human cognitive capacities and attentional reserves likewise remain more or less the same. Our technological capacities have grown by orders of magnitude, and that’s all good. But until self-organizing, self-monitoring trials (powered by AI) are the norm, we humans will continue to serve as the chief executives of research.

While the amendment stops shy of saying it explicitly, R2 recognizes that distributing our time and attention equally among all processes works against safety and quality. That’s because some studies are now so complex, or collect so much data, that line-by-line “box checking” not only becomes impractical, it distracts us from those risks that only become apparent on a “big data”-like view of key metrics. In other words, it’s crucial that we see the risk forest for the data element trees. That’s the message behind much of the amendment text:

From the Introduction: “This guideline has been amended to encourage implementation of improved and more efficient approaches to clinical trial design, conduct, oversight, recording and reporting while continuing to ensure human subject protection and reliability of trial results.”

From section 2, “The Principles of GCP”: “Aspects of the trial that are essential to ensure human subject protection and reliability of trial results should be the focus of such [quality] systems.”

From section 5, “Sponsor”: “The sponsor should implement a system to manage quality throughout all stages of the trial process. Sponsors should focus on trial activities essential to ensuring human subject protection and the reliability of trial results.”

Does this mean less rigor in oversight? Just the opposite. The GCP amendment will require more vigilance from all parties, from sponsors and sites to CROs and vendors. It means bringing alertness and analysis to bear in order to find the boxes, not just check them. This isn’t the ICH throwing up its hands now that the “scale, complexity, and cost of clinical trials have increased.” It’s the ICH demanding that we learn, and practice, new survival skills in a new world.

So drop the Nintendo controller. Time to pick up some neural implants.

What is GDPR (And Why Should I Care)?

Some laws govern the use of crosswalks; others, how to clean up after our pets in public spaces.

Then there’s the EU General Data Protection Regulation (EU GDPR).

Now, safety considerations, not to mention lawful conduct, urge us to cross the street along designated paths. (Just as common courtesy persuades most of us to scoop up our terrier’s natural fertilizer from the sidewalk.) But we all let distraction or the rush of the workday get the better of us from time to time. The costs of ignoring the law are small in this case. Rules against crossing a street outside of a crosswalk are rarely enforced here in the United States. If law enforcement does catch us in the act, punishment ranges from a warning to a small fine.

A serious violation of the EU GDPR, on the other hand, could cost the guilty party up to €20,000,000, and potentially more.

So how do you stay between the lines when it comes to this regulation? Knowledge is the best defense, so without further ado, here are the basic facts you need to know.

What does the GDPR aim to accomplish?

To quote the homepage of eugdpr.org: “The EU General Data Protection Regulation (GDPR) … was designed to harmonize data privacy laws across Europe, to protect and empower all EU citizens’ data privacy, and to reshape the way organizations across the region approach data privacy.”

This is vague, but the second clause does highlight the regulation’s main purpose; namely, “to protect and empower all EU citizens’ data privacy.” Harmonization of data privacy laws may be a boon to data-gathering entities operating in multiple EU countries. But given the amount of text proclaiming the rights and freedoms of data subjects (or stating the duties of data controllers and processors in upholding those rights and freedoms), the motivation of the GDPR is clear. This regulation is about individuals’ rights over information about themselves: when it may be obtained, how it must be protected, and what may or may not be done with it.

Chapter 1, Article 1 of the official regulation (“Subject-matter and objectives”) makes this clear in more legal-sounding language:

1. This Regulation lays down rules relating to the protection of natural persons with regard to the processing of personal data and rules relating to the free movement of personal data.

2. This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data.

3. The free movement of personal data within the Union shall be neither restricted nor prohibited for reasons connected with the protection of natural persons with regard to the processing of personal data.

What is “personal data”?

Article 4 opens with “For the purposes of this Regulation:” and sets out 26 definitions. It’s no coincidence that “personal data” is the first:

1. ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

Worth noting is the reference to “an online identifier”. The regulation considers IP addresses and cookie strings personal data.

But a legal-sounding definition doesn’t capture the sanctity with which personal data is regarded in the EU. With the exceptions of sensitive health and financial information, data about a person in the U.S. is subject to the principle of “finders keepers” (de facto if not de jure). Corporations routinely lay claim to personal data through an obscure “terms of use” page on their website, or through the failure of a customer to explicitly deny the corporation the right to collect his or her data. In Europe, personal data is an aspect of personal dignity. The GDPR is, among other things, an insistence on this cultural fact in light of an increasingly global and data-driven economy.

Who is obligated to follow it?

The GDPR casts a wide net. All persons or organizations that are reasonably construed as either a “data controller” or a “data processor” (regardless of whether data control or processing is that entity’s primary function) are subject to the regulation if any one of three conditions applies.

Who or what constitutes a “data controller”?

The “data controller” is the “natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data.” Typically, this is the entity at the top of the “data solicitation chain”; in the area of clinical research, the sponsor, academic institution, or CRO/ARO.

Who or what constitutes a “data processor”?

The data processor is “a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller.” Those who play any role in the administration of a database, including software vendors in those cases where the database is digital, are data processors.

What are the conditions under which a data controller or data processor is bound by the GDPR?

If any one of the following conditions obtain for a data controller or data processor, that entity is bound by the GDPR:

  1. The data controller is based in the European Union (regardless of where the data subject is based)
  2. The data processor operating on behalf of the data controller is based in the European Union (regardless of where the data subject is based)
  3. The data subject is based in the European Union

Practically, the safest default assumption is that your research operations are bound by the GDPR. If any participant in your study resides in the EU, or any link in the chain of data custody passes through the EU, or your organization is based in the EU, the GDPR’s applicability is clear.
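
For the programmatically inclined, the rule reduces to a single disjunction. The function and flag names below are ours, a toy rendering of the three conditions rather than anything found in the regulation:

```python
# Toy encoding of the applicability test: the GDPR binds an entity if any
# one of the three conditions holds.
def gdpr_applies(controller_in_eu: bool, processor_in_eu: bool,
                 subject_in_eu: bool) -> bool:
    return controller_in_eu or processor_in_eu or subject_in_eu

# A US-based sponsor using a US-based vendor is still bound the moment a
# single study participant resides in the EU.
assert gdpr_applies(controller_in_eu=False, processor_in_eu=False,
                    subject_in_eu=True)
```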

What must those persons or entities do?

GDPR mandates are best thought of as duties that data controllers and processors have in upholding the rights of data subjects. Articles 12 through 23 enumerate the rights of the data subject. No summary is adequate to convey all of the particular rights, and for that reason it is incumbent on all data controllers and processors to read, understand, and abide by practices which uphold these rights. But for the purposes of this primer, we can think of these rights as granting persons the following powers and assurances.

Powers

  • To provide only those elements of personal data they agree to provide, having been fully informed of the purposes and risks of the data collection
  • To provide such data only to those entities he or she wishes
  • To rectify or request the erasure of their personal data
  • To access the personal data collected from them in a readable and standardized format (note that this does not necessarily mean in a native spoken language)
  • To contest significant decisions affecting them (e.g., those of employment or legal action) that are computed solely by an algorithm operating on their personal data
  • To seek legal redress for the failure of a data controller or data processor to respect these powers or to maintain the following assurances

Assurances

  • The data subject shall not be identifiable from the personal data, through use of secure pseudonymization protocols (e.g., assigning an alphanumeric identifier to a data subject or an element of their personal data, from which publicly identifying information such as the subject’s name, NHS number, address, or birthday cannot be deduced; a sketch follows this list)
  • The data subject will be immediately informed of any breach of their data privacy
  • The data subject’s personal data shall be consulted and processed only for those purposes disclosed to the data subject as part of obtaining his or her informed consent
  • Data controllers shall request from the data subject only those elements of personal data that are essential to the purposes made explicit during the process of informed consent (i.e., data minimization)
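
What might a pseudonymization protocol of the kind mentioned above look like? Here is a minimal sketch under simple assumptions: a random, meaningless identifier stands in for the subject, and the key linking it back to identifying details lives apart from the research data. A real deployment would add encryption and strict access controls on that key store.

```python
# Minimal pseudonymization sketch. The identifier carries no information
# about the subject; re-identification requires the separately held key.
import secrets

pseudonym_key = {}  # ID -> identifying details; store apart from study data

def pseudonymize(name: str, date_of_birth: str) -> str:
    subject_id = f"SUBJ-{secrets.token_hex(4).upper()}"  # e.g., SUBJ-9F2A11BC
    pseudonym_key[subject_id] = {"name": name, "dob": date_of_birth}
    return subject_id

# Downstream systems see only the identifier, never the name or birthday.
study_record = {"subject": pseudonymize("Jane Doe", "1970-01-01"), "hba1c": 6.1}
```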

What duties do these powers and assurances incur for data controllers and processors? The concept of “data protection by design and by default” is a useful, if general, place to start. Before data collection begins, data controllers and processors must establish and document systems and practices that:

  • make it clear to the data subject which elements of their personal data the controller or processor is requesting, and for what purposes
  • make it clear to the data subject which persons or entities will have access to their data
  • maintain the privacy of personal data, e.g., through pseudonymization, data encryption, physical security, etc.
  • prevent the collection of data that is immaterial to the purpose of data gathering

Which sorts of systems and practices qualify as achieving those aims? The answer the regulation gives is, unfortunately, something of a placeholder. Article 42 offers data controllers and processors the opportunity to “certify” their data protection mechanisms, but the certification bodies and requirements for certification are all unspecified. (Paragraph 3 even states that “the certification shall be voluntary.”)

For better or worse, data controllers and processors seem to bear the burden of selecting – and justifying to unspecified “certification bodies” – the technical criteria by which the GDPR will assess their data protection measures.

This is perhaps both a problem and an opportunity. Better that minimum encryption standards, for instance, go unspecified (for now) than be made to conform to some arbitrary decree. As data controllers and processors, we can take an active role in establishing these and other criteria in a way that serves both data protection and efficient data flow.

When does the regulation go into effect?

The regulation becomes effective on Friday, May 25th, 2018. This is a universal implementation date: national governments do not have authority to legislate a later (or earlier) effective date.

Who is in charge of enforcing it?

The European Parliament, the Council of the European Union, and the European Commission are the EU bodies behind the GDPR; day-to-day enforcement falls to the independent supervisory authority (the national data protection authority) that each member state designates.

What are the penalties for non-compliance?

If you’re looking for concreteness among all the abstraction of the GDPR, look no further than Article 83, “General conditions for imposing administrative fines.” All nine paragraphs set forth factors in arriving at a sanction for negligence or willful violation. But paragraph 6 will continue to attract the most attention: “(6) Non-compliance […] shall […] be subject to administrative fines up to 20 000 000 EUR, or in the case of an undertaking, up to 4 % of the total worldwide annual turnover of the preceding financial year, whichever is higher.”
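
The “whichever is higher” clause is easy to misread, so here it is as a one-line calculation (the function name is ours):

```python
# Article 83's ceiling: the greater of EUR 20 million and 4% of total
# worldwide annual turnover of the preceding financial year.
def max_fine_eur(worldwide_annual_turnover_eur: float) -> float:
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

# A firm with EUR 2 billion in turnover faces a ceiling of EUR 80 million,
# not EUR 20 million.
print(max_fine_eur(2_000_000_000))  # 80000000.0
```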

Is that all there is to GDPR?  

Unfortunately, no. If regulatory compliance were as easy as nodding along to a blog post, we’d never hear of violations. Then again, we’d also hear about far more, and far more severe, privacy breaches. Remaining compliant with all of the regulations that bear on clinical research may be a logistical burden, but it’s the right thing to do. You wouldn’t knowingly expose a patient (or their identity) to harm, but that’s what non-compliance amounts to: thousands of seemingly minor risks that make at least one catastrophe almost inevitable. So continue to educate yourself. We’ll help in that effort with a series of blog posts that begins with this one. And if the moral imperative of compliance doesn’t motivate you, consider the impact that non-compliance could have on your business or organization. You really don’t want to step in this natural fertilizer.