What is GDPR (And Why Should I Care)?

Some laws govern the use of crosswalks; others, how to clean up after our pet in public spaces.

Then there’s the EU General Data Protection Regulation (EU GDPR).

Now, safety considerations, not to mention lawful conduct, urge us to cross the street along designated paths. (Just as common courtesy persuades most of us to scoop up our terrier’s natural fertilizer from the sidewalk.) But we all let distraction or the rush of the workday get the better of us from time to time. The costs of ignoring the law are small in this case. Rules against crossing a street outside of a crosswalk are rarely enforced here in the United States. If law enforcement does catch us in the act, punishment ranges from a warning to a small fine.

A repeat violation of the EU GDPR, on the other hand, could cost the guilty party up to €20,000,000.

So how do you stay in between the lines when it comes to this regulation? Knowledge is the best defense, so without any further ado, here are the basic facts you need to know.

What does the GDPR aim to accomplish?

To quote the homepage of eugdpr.org: “The EU General Data Protection Regulation (GDPR) … was designed to harmonize data privacy laws across Europe, to protect and empower all EU citizens’ data privacy, and to reshape the way organizations across the region approach data privacy.”

This is vague, but the second clause does highlight the regulation’s main purpose: “to protect and empower all EU citizens’ data privacy.” Harmonization of data privacy laws may be a boon to data-gathering entities operating in multiple EU countries. But given the amount of text proclaiming the rights and freedoms of data subjects (or stating the duties of data controllers and processors in upholding those rights and freedoms), the motivation of the GDPR is clear. This regulation is about individuals’ rights over information about themselves: when it may be obtained, how it must be protected, and what may or may not be done with it.

Chapter 1, Article 1 of the official regulation (“Subject-matter and objectives”) makes this clear in more legal-sounding language:

1. This Regulation lays down rules relating to the protection of natural persons with regard to the processing of personal data and rules relating to the free movement of personal data.

2. This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data.

3. The free movement of personal data within the Union shall be neither restricted nor prohibited for reasons connected with the protection of natural persons with regard to the processing of personal data.

What is “personal data”?

Article 4 sets out 26 definitions, and it’s no coincidence that “personal data” is the first. For the purposes of this Regulation:

1. ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

Worth noting is the reference to “an online identifier”. The regulation considers IP addresses and cookie strings personal data.

But a legal-sounding definition doesn’t capture the sanctity with which personal data is regarded in the EU. With the exception of sensitive health and financial information, data about a person in the U.S. is subject to the principle of “finders keepers” (de facto if not de jure). Corporations routinely lay claim to personal data through an obscure “terms of use” page on their website, or through a customer’s failure to explicitly deny the corporation the right to collect his or her data. In Europe, personal data is an aspect of personal dignity. The GDPR is, among other things, an insistence on this cultural fact in light of an increasingly global and data-driven economy.

Who is obligated to follow it?

The GDPR casts a wide net. Any person or organization that is reasonably construed as either a “data controller” or a “data processor” (regardless of whether data control or processing is that entity’s primary function) is subject to the regulation, if any one of three conditions applies.

Who or what constitutes a “data controller”?

The “data controller” is the “natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data.” Typically, this is the entity at the top of the “data solicitation chain”; in the area of clinical research, the sponsor, academic institution, or CRO/ARO.

Who or what constitutes a “data processor”?

The data processor is “a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller.” Those who play any role in the administration of a database, including software vendors in those cases where the database is digital, are data processors.

What are the conditions under which a data controller or data processor is bound by the GDPR?

If any one of the following conditions obtains for a data controller or data processor, that entity is bound by the GDPR:

  1. The data controller is based in the European Union (regardless of where the data subject is based)
  2. The data processor operating on behalf of the data controller is based in the European Union (regardless of where the data subject is based)
  3. The data subject is based in the European Union

Practically, the safest default assumption is that your research operations are bound by the GDPR. If any participant in your study resides in the EU, or any link in the chain of data custody passes through the EU, or your organization is based in the EU, the GDPR’s applicability is clear.

What must those persons or entities do?

GDPR mandates are best thought of as duties that data controllers and processors have in upholding the rights of data subjects. Articles 12 through 23 enumerate the rights of the data subject. No summary is adequate to convey all of the particular rights, and for that reason it is incumbent on all data controllers and processors to read, understand, and abide by practices which uphold these rights. But for the purposes of this primer, we can think of these rights as granting persons the following powers and assurances.

Powers

  • To provide only those elements of personal data they agree to provide, having been fully informed of the purposes and risks of the data collection
  • To provide such data only to those entities he or she wishes
  • To rectify or request the erasure of their personal data
  • To access the personal data collected from them in a readable and standardized format (note that this does not necessarily mean in a native spoken language)
  • To contest significant decisions affecting them (e.g., those of employment or legal action) that are computed solely by an algorithm operating on their personal data
  • To seek legal redress for the failure of a data controller or data processor to respect these powers or to maintain the following assurances

Assurances

  • The data subject shall not be identifiable from the personal data, through use of secure pseudonymization protocols (e.g., assigning an alphanumeric identifier to a data subject and/or an element of their personal data, from which publicly identifying information such as the subject’s name, NHS number, address, or birthday cannot be deduced)
  • The data subject will be immediately informed of any breach of their data privacy
  • The data subject’s personal data shall be consulted and processed only for those purposes disclosed to the data subject as part of obtaining his or her informed consent
  • Data controllers shall request from the data subject only those elements of personal data that are essential to the purposes made explicit during the process of informed consent (i.e. data minimization)
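
The first assurance above hinges on pseudonyms that reveal nothing about the subject. As a rough sketch of the idea (in Python, with illustrative field and function names; nothing here is an OpenClinica feature or a GDPR-mandated protocol), one might generate random identifiers and keep the linking table separate from the research data:

```python
import secrets
import string

def generate_pseudonym(length: int = 12) -> str:
    """Return a random alphanumeric identifier with no derivable
    relationship to the subject's name, NHS number, address, or birthday."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def pseudonymize(subject: dict, link_table: dict) -> dict:
    """Replace identifying fields with a pseudonym; the linkage is stored
    separately (and under stricter access controls) from the study data."""
    identifying = ("name", "address", "birthday")
    pid = generate_pseudonym()
    link_table[pid] = {k: subject[k] for k in identifying if k in subject}
    return {"subject_id": pid,
            **{k: v for k, v in subject.items() if k not in identifying}}

# Example: the research record retains only the pseudonym and study data
links = {}
record = pseudonymize(
    {"name": "A. Subject", "birthday": "1970-01-01", "weight_kg": 72}, links)
```

The key design point is separation: the `links` table never travels with the research dataset, so the dataset alone cannot re-identify anyone.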

What duties do these powers and assurances impose on data controllers and processors? The concept of “data protection by design and default” is a useful, if general, place to start. Before data collection begins, data controllers and processors must establish and document systems and practices that:

  • make it clear to the data subject which elements of their personal data the controller or processor is requesting, and for what purposes
  • make it clear to the data subject which persons or entities will have access to their data
  • maintain the privacy of personal data, e.g., through pseudonymization, data encryption, physical security, etc.
  • prevent the collection of data that is immaterial to the purpose of data gathering
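
That last practice, preventing the collection of immaterial data, can be enforced mechanically rather than left to policy alone. Here is a minimal sketch (Python; the whitelist and field names are purely illustrative assumptions) that drops any submitted field not disclosed during informed consent:

```python
# Hypothetical set of fields disclosed to the subject at consent
CONSENTED_FIELDS = {"age", "weight_kg", "blood_pressure"}

def minimize(submission: dict) -> dict:
    """Keep only fields the subject explicitly consented to provide;
    discard anything immaterial to the stated purpose of collection."""
    return {k: v for k, v in submission.items() if k in CONSENTED_FIELDS}

# "employer" was never part of the consent disclosure, so it is dropped
cleaned = minimize({"age": 54, "weight_kg": 80, "employer": "Acme"})
```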

Which sorts of systems and practices qualify as achieving those aims? The answer the regulation gives is, unfortunately, something of a placeholder. Article 42 offers data controllers and processors the opportunity to “certify” their data protection mechanisms, but the certification bodies and requirements for certification are all unspecified. (Paragraph 3 even states that “the certification shall be voluntary.”)

For better or worse, data controllers and processors seem to bear the burden of selecting – and justifying to unspecified “certification bodies” – the technical criteria by which the GDPR will assess their data protection measures.

This is perhaps both a problem and an opportunity. Better that minimum encryption standards, for instance, go unspecified (for now) than be made to conform to some arbitrary decree. As data controllers and processors, we can take an active role in establishing these and other criteria in a way that serves data protection and efficient data flow.

When does the regulation go into effect?

The regulation becomes effective on Friday, May 25th, 2018. This is a universal implementation date: national governments do not have authority to legislate a later (or earlier) effective date.

Who is in charge of enforcing it?

The European Parliament, the Council of the European Union and the European Commission are the EU bodies behind the GDPR; day-to-day enforcement falls to the independent supervisory authority (data protection authority) designated by each member state.

What are the penalties for non-compliance?

If you’re looking for concreteness among all the abstraction of the GDPR, look no further than Article 83, “General conditions for imposing administrative fines.” All nine paragraphs set forth factors in arriving at a sanction for negligence or willful violation. But paragraph 6 will continue to attract the most attention: “(6) Non-compliance […] shall […] be subject to administrative fines up to 20 000 000 EUR, or in the case of an undertaking, up to 4 % of the total worldwide annual turnover of the preceding financial year, whichever is higher.”

Is that all there is to GDPR?  

Unfortunately, no. If regulatory compliance were as easy as nodding along to a blog post, we’d never hear of violations. Then again, we’d hear about far more, and more severe, privacy breaches. Remaining compliant with all of the regulations that bear on clinical research may be a logistical burden, but it’s the right thing to do. You wouldn’t knowingly expose a patient (or their identity) to harm, but that’s what non-compliance amounts to: thousands of seemingly minor risks that make at least one catastrophe almost inevitable. So continue to educate yourself. We’ll help in that effort with a series of blog posts that begins with this one. And if the moral imperative of compliance doesn’t motivate you, consider the impact that non-compliance could have on your business or organization. You really don’t want to step in this natural fertilizer.

OC17 is just 32 days away! Here’s the detailed program.

If you’ve been waiting for more detailed session information to register, wait no more. You’ll find all the session and workshop abstracts, with speaker bios, on our OC17 resource page.

See the OC17 session abstracts

OC17 is also your chance to see the new OpenClinica up close. Take a “deep dive” into the all-new study build system and form engine.


See more of the new OpenClinica


Do not hesitate to email bfarrow@openclinica.com with any questions regarding OC17.

The OC17 Program is here, and it’s bursting at the seams

Click here to learn more and register!

The OpenClinica community has proven once again its passion for innovation and knowledge sharing. We received a record number of abstracts exemplifying the OC17 theme, Making the Complex Simple. Best of all, the diversity of expertise among our community yielded a set of sessions that covers a wide swath of data management challenges and solutions. To claim that OC17 “has it all” would be a cliché (and, given the breadth of our field, an obvious exaggeration). But with sessions on pseudonymization, patient registries, biobanking, medical imaging integration, and more, this year’s conference will deliver case studies and how-to’s that almost certainly bear on your research. So let me offer another, more defensible, cliché: OC17 has something – and more likely two or three things – for everyone.

Below you’ll find the program as of today, September 26th. (Order is subject to change.)

Monday, December 4: Track 1

Plenary session, “The Story of OC4” | Cal Collins, CEO – OpenClinica

OC4’s ultra-capable forms | Ben Baumann, COO – OpenClinica

Multi-system subject tracking, screening automation, and data exportation | Patrick Murphy – RTI

50,000 subjects, 15 countries, and 1 (multilingual) OpenClinica study | Gerben Rienk Visser – TrialDataSolutions

OC4 architecture | Krikor Krumlian, CTO – OpenClinica

Risk-adapted strategies using OC in an observational trial with a marketed product | Edgar Smeets, PhD, CCRA – Smeets Independent Consultant – SIC

A Plan for Getting Started with Risk-Based Monitoring | Artem Adrianov, PhD, CEO – Cyntegrity

Monday, December 4, Track 2

Plenary session, “The Story of OC4” | Cal Collins, CEO – OpenClinica

Essential reports for the CRC, data manager, monitor and sponsor | Bryan Farrow, eClinical Catalyst – OpenClinica

EUPID services for pseudonymization and privacy-preserving record linkage in OpenClinica | Markus Falgenhauer – Austrian Institute of Technology (AIT)

MIMS-OC – A medical image management system for OpenClinica | Martin Kropf – Charité Berlin

Working with OC Insight – OpenClinica’s Configurable Data Visualization Dashboard | Lindsay Stevens, CTDM Specialist – OpenClinica

The RadPlanBio approach to building biomedical research platforms around OpenClinica | Tomas Skripcak – German Cancer Consortium

How to Implement SDTM in Your Study | Mark Wheeldon, CEO – Formedix

Tuesday, December 5, Track 1

Keynote address | Dr. Andrew Bastawrous – Peek Vision

Late-breaking sponsor session | Announcement forthcoming

eConsent as a validated application of OpenClinica’s Participate forms | Brittney Stark, Project Manager – OpenClinica

WORKSHOP: Designing, Publishing, and Sharing Your Study in OC4 | Laura Keita, Director of Compliance and Training – OpenClinica

WORKSHOP: Inside OpenClinica’s APIs | Krikor Krumlian, CTO – OpenClinica

Tuesday, December 5, Track 2

Keynote address | Dr. Andrew Bastawrous – Peek Vision

Combining a nationwide prospective patient registry and multiple RCTs with OpenClinica | Nora Hallensleben and Remie Bolte – Dutch Pancreatitis Study Group

Open Conversation: A dialogue between OpenClinica and OpenSpecimen | Srikanth Adiga, CEO – OpenSpecimen; Aurélien Macé – Data Manager, FIND Diagnostics

WORKSHOP: A Tour of OpenClinica Modules and Integrations | Mike Sullivan, Senior Account Executive – OpenClinica

WORKSHOP: Making the Jump from OC3 to OC4 | Iryna Galchuk, Solutions Engineer – OpenClinica

Details are still in the works, but we will host a “cocktails, conversations, and demos” reception on Monday evening, and cruise Amsterdam’s canals on Tuesday evening.

We hope you can join us for a pair of memorable days, professionally and culturally. Learn more and register here!


The Change Business

You’re in the change business.

Is that what you thought of as you arrived at work today? It’s true. Working in clinical research, you bring positive change to the world, through the discovery, testing, and dissemination of therapies that improve people’s health.

No doubt your role (and mine) is a lot more specific and focused than that. It has to be, because clinical research is all about the details. To achieve the big changes, we need to implement, control, and communicate many other, smaller changes on a constant basis.

Sometimes, change is initiated from within. Sometimes, it’s imposed from outside. And at times, the whole context shifts. This last kind of change has dominated research for the past few years. Mobile technology, patient-centricity, healthcare upheavals, economic pressures, real-time monitoring, and genomic medicine are changing the context of how we approach research, and the nature of what we’re trying to accomplish.

Responding to this type of change requires clarity, purpose, and vision. A decade ago, a few of us started the OpenClinica project to inject a sorely needed dose of flexibility and accessibility into the clinical trials technology landscape. Now, we’re working to make it easy to adapt to the complexities of the new research ecosystem, while continuing to prioritize our original principles of flexibility and accessibility.

The biggest changes to OpenClinica in years are coming next month. As I’ll illustrate in my next post, closer to release, the new OpenClinica is designed to be both easy and powerful. It’s fast to adopt, simple to learn, and a joy to use, whether you’re a data manager, CRA, investigator, or study subject.

One thing that will never change is our commitment to the success of our customers and our community.

Get Your Queries Under Control (Part 1: Query Aging)

Getting to “no further questions” can seem like an endless task

We may all face death and taxes, as Ben Franklin quipped, but data managers are most certainly guaranteed yet a third inconvenience: queries. As long as human beings play a role in data capture, some fraction of this data will defy explicit validation rules, common sense, or both. These entries, when left unresolved, not only create more work for monitors and coordinators, but they delay interim and final locks. And time added to a trial is cost added to a trial.

As dispiriting as this sounds, data managers employing EDC enjoy tremendous advantages in “query control” over their paper-using peers. Paper doesn’t support real-time edit checks, our most effective vaccine in the primary prevention of queries. Neither does paper allow for real-time reporting on the status of our queries. That reporting is crucial in any effort to close existing queries, reduce the number of queries triggered in the future, and shorten the amount of time queries remain open.

By itself, a long list of open queries won’t signal alarming query trends. For that, metrics are required. In a series of posts starting with this one, I’d like to offer guidance on those metrics.

Metric #1: Query Aging

You are a data manager in a study of adults with diabetes. On Monday, a coordinator at one of your sites submits a form indicating a patient weighed in at 15 lbs. Imagine that a query was opened that same day, whether automatically by the EDC system or manually through your own vigilance. The clock is now ticking: how long will it take for the coordinator to correct the data point?

In actual studies, there are dozens or hundreds or even thousands of such clocks ticking away. Some queries may be just hours old, while others could be months old (help!). Eventually, all of them will need to be closed in order for database lock to occur. The key to locking on time is to effectively manage, from the very start, both the overall number of open queries and the proportion of open queries that are old.

Why does age matter?

The further an open query recedes into the past, the more difficult it becomes to resolve, as new patients and new tasks push the issue further down the coordinator’s to-do list. Clinicians involved in the original assessment may depart or find their memories of the relevant event fading. We let queries linger at our own risk.

Long, complex studies enrolling hundreds of patients can easily generate thousands of queries. While each open query will have its own origin date, defining age brackets can help you characterize the distribution of their ages. Age brackets are ranges set by a lower limit and upper limit, most commonly expressed in days; 0 to 7 days is one example of a bracket, 8 to 14 days is another. Age brackets must cover, without any overlaps, all possible ages of an open query. Therefore, the first bracket always begins with 0 days open, to accommodate the newest queries, while the last bracket will specify no upper limit at all.

Many EDC systems default to a specific set of brackets (usually 4 or 5) in grouping a study’s open queries. For example, an EDC might group all open queries generated less than 8 days ago into the first bracket, and all open queries generated between 8 and 14 days ago into the second bracket. But these dividing lines do not represent universal constants. You are free to set your own lower and upper limits on your brackets, and you should do so based on the event schedule of your study.

Who are you calling “old”?!

What makes study-specific bracket delimitation a best practice? When it comes to open queries, young and old aren’t just relative to one another, but to the amount of time between consecutive visits. If a query from a patient’s first visit isn’t resolved by her second visit, queries generated by that second visit will push the earlier one (psychologically, administratively) into more precarious territory. By way of analogy, you are much more likely to give prompt, accurate answers to questions about your commute from this morning than to questions about your commute yesterday.

You will never completely prevent query “backlogs” from arising in your studies. But you can put in place one preventative measure based on the idea above. First, identify the shortest span of time between any two consecutive visits in your study. (You can dispense with any outliers; e.g. visits that are just two days apart.) Call that shortest span n days. Now, set your query brackets as follows:

  • First (“youngest”) bracket: 0 to n-1 days old
  • Second bracket: n to 2n-2 days old
  • Third bracket: 2n-1 to 3n-3 days old
  • Fourth bracket: 3n-2 to 4n-4 days old
  • Fifth (“oldest”) bracket: 4n-3 days old or older

Suppose, for example, that the shortest interval between visits in your study is 14 days. You would then structure your brackets as follows:

  • First (“youngest”) bracket: 0 to 13 days old
  • Second bracket: 14 to 26 days old
  • Third bracket: 27 to 39 days old
  • Fourth bracket: 40 to 52 days old
  • Fifth (“oldest”) bracket: 53 days old or older
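
The bracket arithmetic above is easy to automate. A small sketch (Python; illustrative only, not a feature of any particular EDC) that derives the five brackets from the shortest inter-visit span n:

```python
def query_age_brackets(n: int):
    """Given the shortest span (in days) between consecutive visits,
    return five age brackets as (lower, upper) tuples in days;
    the last bracket is open-ended (upper = None)."""
    return [(0, n - 1),
            (n, 2 * n - 2),
            (2 * n - 1, 3 * n - 3),
            (3 * n - 2, 4 * n - 4),
            (4 * n - 3, None)]

brackets = query_age_brackets(14)
# With n = 14: [(0, 13), (14, 26), (27, 39), (40, 52), (53, None)]
```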

Now that you’ve set your brackets, announce them to your colleagues and your sites. Let them know that effective data management for your study means keeping the proportion of open queries in each of these brackets steady:

  • First bracket: at least 35% of your open queries
  • Second bracket: no more than 30% of your open queries
  • Third bracket: no more than 20% of open queries
  • Fourth bracket: no more than 10% of open queries
  • Fifth bracket: no more than 5% of open queries

By holding yourself and sites accountable for keeping roughly two-thirds of your open queries in the first two brackets, you limit the number of queries falling into older brackets, where resolution becomes more difficult due to the passage of time.
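
To keep an eye on those proportions, you can bin each open query’s age and compare the result against the caps above. A minimal sketch (Python; the ages are hypothetical, and the caps are copied from the targets listed earlier):

```python
def bracket_distribution(ages, brackets):
    """Return the fraction of open queries falling into each bracket.
    `ages` are days open; `brackets` are (lower, upper) pairs,
    with upper=None meaning open-ended."""
    counts = [0] * len(brackets)
    for age in ages:
        for i, (lo, hi) in enumerate(brackets):
            if age >= lo and (hi is None or age <= hi):
                counts[i] += 1
                break
    total = len(ages) or 1
    return [c / total for c in counts]

# Brackets for a study whose shortest inter-visit span is 14 days
brackets = [(0, 13), (14, 26), (27, 39), (40, 52), (53, None)]
ages = [2, 5, 10, 15, 20, 30, 45, 60]  # hypothetical open-query ages
shares = bracket_distribution(ages, brackets)

# Flag any older bracket exceeding its cap (30%, 20%, 10%, 5%);
# the first bracket has a floor (35%) rather than a cap, so no entry here
caps = [None, 0.30, 0.20, 0.10, 0.05]
flags = [i for i, cap in enumerate(caps) if cap is not None and shares[i] > cap]
```

A report like this, run weekly, turns the bracket targets from a hope into a measurable policy.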

Enforcing the policy

There’s a word to describe a goal that doesn’t have a reward tied to its achievement, or a sanction to its failure: a hope. Below are some methods for igniting a sense of urgency in your sites with open queries.

  • Make a pact. A coordinator will not feel obliged to answer within 3 days a query that it took your team 3 weeks to raise. Your data management team should commit to opening queries within some set period of time following initial data entry. Your discipline will inspire the site’s discipline.
  • Do some good. While you should always check with compliance officers first, consider making a donation to a patient advocacy group based on a study-wide reduction in open queries. For example, you may announce at the start of the month that each percentage point reduction in open queries within brackets 3, 4, and 5 by the end of the month will translate to 10 additional philanthropic dollars, made by the sponsor organization in the name of all participating sites.
  • Recognize the “always prompt” and “much improved” sites. On the last day of each month, email to all your coordinators a query status report. Indicate how many queries are open within each bracket. Mention by name and congratulate:
    • sites with the fewest queries per data field completed (limited to sites that have supplied data for some minimum number of fields),
    • sites with the shortest average time to close for all queries opened in the prior month (limited to sites that have had some minimum number of queries opened), and
    • sites that have closed the highest percentage of their open queries since the beginning of the month.

In this way, you are recognizing the continuously conscientious sites along with those striving to improve.

Staying on Track

Effective data management requires discipline from you, your monitors, and your sites, but the effort is well worth it. Interim and final locks that occur on time save all trial stakeholders from a lot of anxiety and frustration. A single open query may not seem like an impediment to clean study data overall, and indeed queries are inevitable and understandable. But every day a query remains open contributes to a delay that’s felt most acutely by the patients you and your sites are working so hard to help.

Register for OC17 today and secure early bird pricing!

OC17 register today

We are thrilled to announce the venue and keynote speaker for OC17, our 9th Annual Global Conference, and to open registration for the event.

OC17 Theme and Dates
“Making the Complex Simple”
December 4 – 5 | Sessions and Workshops
December 6 – 8 | Super User Training

Venue
Mövenpick Hotel Amsterdam
Piet Heinkade 11
1019 BR  Amsterdam
Netherlands

Keynote Speaker
Dr. Andrew Bastawrous
Co-Founder and CEO
Peek Vision

Further details, pricing, and a registration form are just a click away. Reserve your spot now to lock in early bird pricing.

This year’s theme is “Making the Complex Simple,” and we are proud to offer a space and a speaker exemplifying that theme. The Mövenpick is a short taxi ride from the international airport in Amsterdam. We are confident that your stay there will delight with its simple elegance. We are especially honored to welcome Dr. Bastawrous as our keynote speaker. His story on the impact that innovation can have must be seen to be believed.

We hope you can join us in December as we turn the spotlight on our incredible user community once again. Do not hesitate to email bfarrow@openclinica.com with any questions regarding OC17.

Sincerely,
Ben Baumann, COO

14 Best Practices for ePRO Form Design

Is there any term in data collection more dispiriting than “abandonment rate”? That’s the percentage of respondents who start inputting data into a form but don’t complete the task. Sometimes, it’s hard not to take that kind of rejection personally.

Sadly, it’s just one problem among dozens that hinder the collection of timely, clean and complete data directly from study subjects. Data managers face challenges from the start. A participant’s compliance with study procedures (and data entry) is, in the end, voluntary, and apart from the occasional stipend, these participants rarely receive compensation in dollars. That’s not to say that the care provided in the course of a study isn’t of value to patients. But that “quid pro quo” mindset isn’t easy to maintain outside the clinical site. Paper diaries and even electronic forms ask a participant for their time and attention, commodities that are in short supply these days. As an industry, we can’t stop looking for ways to minimize the burden on participants. Not if we want to maximize their contribution.

In previous posts, we explored BYOD as a preferred approach to collecting ePRO data. But that’s only half the story. What good are friendly, motivational messages to a participant’s mobile device and computer, if the form to which they’re directed is needlessly long or convoluted? That’s a recipe for heartbreak (form abandonment), or at least a host of trust issues (incomplete or poor quality data).

So, what are the keys to getting your participants to enter all their PRO data, accurately and on time? Not surprisingly, they’re not too different from those we rely on in any budding relationship. Below, I bundle them into four categories.

Make a good first impression

Imagine yourself as a job interviewee. The hiring manager asks you to rattle off the top three responsibilities in every role you’ve ever filled. How coherent will your answer be? A practiced interviewer, interested in getting to know you, would start differently. “Tell me, what’s your top priority in your current role?” She’ll need to gather a lot more information before the end of the interview, but she’s set out a comfortable pace for getting it.

The lesson for ePRO form design is clear.

1. Start with a single question (or a very small number of questions) that can be answered simply. Present these one to three questions, and no more, on the first page.

This best practice is one example of a broader rule: always proceed from the simple to the complex. Don’t ask the participant to choose one of sixteen options. Rather, ask two yes-or-no questions that will reduce the options first to eight and then to four, and have the user pick from that list. Yes, the latter scenario involves more clicks, but it involves less cognitive labor and less scrolling.

Dress to impress

The “e” in “ePRO” isn’t usually capitalized, but maybe it ought to be. Leave the paper mindset behind. Spatial and color constraints no longer apply, and you can present information in a way that prevents the participant from looking ahead (and becoming discouraged). In short, you’re free to give respondents a form-reading and form-filling experience they actually enjoy. Here are some pointers on visual cues that work with the eyes and minds of your participants:

2. Black or dark grey text on a white background is always the safest default for instructions and form labels. For section headings and buttons, a vibrant color that contrasts sufficiently with the background can help orient the respondent.

3. If you have a choice of fonts, sans serif is preferable. Why? While studies do not point to a clear winner for readability in all contexts, evidence suggests that lengthy text passages that ask the reader to parse several ideas at once are best served by serif fonts, while short prompts and labels are best conveyed with a sans serif font. (And you already know that short and direct is best, right?) The wide variety of serifs in existence makes character recognition more difficult for non-native readers, while sans serifs are more likely to remain readable when scaled down.

4. Place the field label above, and left justified with, any field requiring the respondent to type. This “stacked” format suits the portrait orientation with which most smartphone users hold their device. Placing the field label inside the field for the user to type over may save space, but it also causes your label to disappear once the user interacts with the field.

5. Avoid grids. There are contexts where a grid layout is preferable; for example, when recording multiple instances of body position, systolic pressure and diastolic pressure for a patient undergoing blood pressure tests. But these are almost always in-clinic scenarios. For the collection of ePRO data, stay within a single column layout.

6. Paginate. One screen containing six questions, or two screens containing three each? All things being equal, fewer questions on more pages is preferable. Why? Clicking next is easier than scrolling. Also, breaking up a large set of questions into two smaller ones reduces the intimidation factor.
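A pagination helper makes the trade-off concrete; this is a minimal sketch (the function name is ours, not any particular form engine’s API) showing six questions becoming two screens of three:

```typescript
// Split a question list into fixed-size pages so each screen stays short.
function paginate<T>(questions: T[], perPage: number): T[][] {
  const pages: T[][] = [];
  for (let i = 0; i < questions.length; i += perPage) {
    pages.push(questions.slice(i, i + perPage));
  }
  return pages;
}

const sixQuestions = ["Q1", "Q2", "Q3", "Q4", "Q5", "Q6"];
const screens = paginate(sixQuestions, 3); // two screens of three questions each
```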

7. Use images as selection options when text could lead to misinterpretation. Not all form engines support this feature, but the ability to present an image in place of (or with) a checkbox or radio button is more than a “nice to have.” Which of the following is more likely to yield quality, computable data?

[Image: a free-text symptom field]

or…

[Image: a head illustration with clickable regions]

I’ve relied on a few best practices in this example, starting with some very simple questions in order to narrow down the remaining ones and present them in a manner that gives us clean, accurate, and computable data. But the images in particular are what rescue the participant from needless typing and potential confusion. By making the regions within the head illustration clickable, I can capture very discrete data without taxing the participant’s time.

Respect their time

“Press C to confirm or X to cancel.” That’s a familiar formula to anyone who’s received a text message before an appointment. Forms like these are easy to appreciate: if I feel I owe the sender any response, I’m more likely to complete this pseudo-form than any other.

Chances are, however, you may need a little more data than this from your participants. Here’s how you can gather it while respecting your participant’s time.

8. Minimize the number of fields. This advice may seem obvious and simple, but it’s neither. So long as a participant’s date of birth has been documented once, having them input age is never necessary. And if your system is capturing metadata precisely (e.g. a date and time stamp for every change to a field’s content), then you don’t need to ask the participant to record this information. In general, before adding a field, it is helpful to ask:

  • Do I really need this information? If yes, then…
  • Can I reliably deduce or calculate it based on prior data? If no, then…
  • Do I need the participant to supply it? If (and only if) yes, then include the field.
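Age is a textbook example of a deducible value. A minimal TypeScript sketch of deriving it from a stored date of birth (the function name is ours, not any specific system’s):

```typescript
// Derive age from a previously recorded date of birth rather than
// asking the participant to type it again.
function ageFromDOB(dob: Date, today: Date): number {
  const age = today.getFullYear() - dob.getFullYear();
  // Subtract one if this year's birthday hasn't occurred yet.
  const birthdayPassed =
    today.getMonth() > dob.getMonth() ||
    (today.getMonth() === dob.getMonth() && today.getDate() >= dob.getDate());
  return birthdayPassed ? age : age - 1;
}
```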

9. Use skip logic. The phrase “if applicable” should never appear in a form, especially forms designed for participant use. If you are asking the question, it had better be applicable. You can ensure that it is by using skip logic. Has the participant eaten that day? Only inquire about nutritional content for that day’s meals if she responds with “yes”.
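Skip logic of this kind can be sketched as a visibility predicate attached to each question; the question IDs and wording below are hypothetical:

```typescript
// Skip logic: only surface the nutrition question when the participant
// answered "yes" to having eaten that day.
interface Question {
  id: string;
  text: string;
  showIf?: (answers: Record<string, string>) => boolean;
}

const questions: Question[] = [
  { id: "ate", text: "Have you eaten today?" },
  { id: "nutrition", text: "Describe what you ate.", showIf: a => a["ate"] === "yes" },
];

function visibleQuestions(answers: Record<string, string>): Question[] {
  // A question with no predicate is always shown.
  return questions.filter(q => !q.showIf || q.showIf(answers));
}
```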

10. Use branching logic. Branching logic can help a participant isolate the response she wishes to provide through a process of elimination. Suppose you need a participant to input a cancer diagnosis she has received. Given the wide variation in health literacy, we can’t assume she recalls the formal diagnosis. It may be more helpful to solicit this information through a series of questions that are easier to answer. Did the cancer involve a solid mass? Depending on the participant’s response, the following question might pertain to an organ system (if “yes” is selected) or blood cells (if “no” is selected). Just five yes-or-no questions can narrow a field of 32 options down to one.

Doesn’t the use of branching logic conflict with the strategy of minimizing the number of fields? These are guidelines, not rules, so trade-offs are sometimes necessary. A drop-down menu showing 32 options may represent just one field, but scrolling through that many selections (not all of which will be visible at once) places an enormous hurdle in front of the participant. The mental effort and time spent scrolling on that one field far outweigh any time savings that might be secured by eliminating three fields. Meanwhile, you’ll have frustrated the participant.
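The halving described above behaves like a binary search over the option list. A minimal sketch, assuming exactly 32 options and five yes/no answers (the diagnosis labels are placeholders):

```typescript
// Five yes/no answers select one of 32 options by repeated halving,
// like a binary search over the option list.
function selectByBranching<T>(options: T[], answers: boolean[]): T {
  let lo = 0;
  let hi = options.length; // half-open range [lo, hi)
  for (const yes of answers) {
    const mid = (lo + hi) / 2; // exact for power-of-two list sizes
    if (yes) hi = mid; else lo = mid; // "yes" keeps the first half
  }
  return options[lo];
}

const diagnoses = Array.from({ length: 32 }, (_, i) => `Diagnosis ${i + 1}`);
const picked = selectByBranching(diagnoses, [true, false, true, false, true]);
```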

11. Use autocomplete. There’s another way of solving the cancer-history problem above. As long as the participant can recall any portion of the diagnosis as it is spelled out, autocomplete can help her retrieve the full term. The best instances of autocomplete reveal all matches as the participant types, so that upon typing “Lym” a choice is immediately presented among:

Acute Lymphoblastic Leukemia (ALL)
Central Nervous System Lymphoma
Hodgkin Lymphoma
etc.

The ubiquity of predictive search among search engines like Google makes autocomplete a familiar experience for your participants.
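A minimal autocomplete filter might look like the following; the matching here is a simple case-insensitive substring check, which is one of several reasonable strategies (a production system might rank prefix matches first):

```typescript
// Substring autocomplete: surface every term containing the typed
// fragment, case-insensitively, as the participant types.
const terms = [
  "Acute Lymphoblastic Leukemia (ALL)",
  "Central Nervous System Lymphoma",
  "Hodgkin Lymphoma",
  "Melanoma",
];

function autocomplete(fragment: string, vocabulary: string[]): string[] {
  const needle = fragment.toLowerCase();
  return vocabulary.filter(term => term.toLowerCase().includes(needle));
}

const matches = autocomplete("Lym", terms); // first three terms match
```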

In an era where summoning a taxi or making a credit card payment is a 20-second task, participants will not (and should not) tolerate inefficiency on the web. You are competing with their other online experiences, not just traditional paper forms. The good news is that you can delight your participants by showing them that even contributing to medical research can be as easy as navigating their favorite web pages.

Show appreciation

You’ve read 85% of this post! Knowing that doesn’t guarantee you’ll read to the end, but it does make it more likely. Regular, responsive feedback is a powerful spur to action. Here are three ways to interact positively with your participant throughout (and even after) the form-filling process.

12. Convey their progress as the participant completes the form. Reflecting back to the participant the portion of the form they have completed and the portion that they have remaining serves two functions. The first is informative. You’ve anticipated the participant’s question (“how much longer?”) and answered it proactively. The second is motivational. Completing even simple tasks triggers the release of dopamine in our brains. We get a neurochemical rush every time we drop a letter in the mailbox or hit send on an email. 

You can reward your participant throughout the form-filling process by incorporating a dynamic progress bar into your ePRO form. Every page advance is an opportunity to “dose” your participant, visually, with a bit of happiness.
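The progress calculation itself is trivial, which is part of its appeal; a sketch of the per-page update:

```typescript
// Report percent complete after each page advance, for driving
// a progress bar in the form UI.
function progressPercent(currentPage: number, totalPages: number): number {
  return Math.round((currentPage / totalPages) * 100);
}
```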

13. Autosave. Batteries die. Smartphones drop to the floor. Thumbs twitch and close screens. None of these scenarios is justification for losing a participant’s work. Your system should capture input on a field-by-field basis; that is, a back-end process should save the participant’s input into a field the moment he or she leaves that field for another. If a participant abandons a form and then returns to it, he or she should be able to resume where they left off. If you can signal back to the participant that their input has been saved with each field transition, all the better, as this leverages the same psychological power as the progress bar.  

14. Show gratitude. Imagine a campaign staffer asking you a series of questions about your views over the phone. You answer the last question and he or she hangs up without so much as a goodbye. Chances are, they’ve lost your vote on account of rudeness alone.

Don’t let this happen to your participants. When they submit a completed form online, they should immediately receive a “thank you” message that is specific to the task they have just completed.

Ensuring the optimal experience for participants supplying ePRO data is more than courtesy: it’s a critical measure for maximizing data quality, quantity, and timeliness. So commit to dazzling the people who matter most in your research. Because, as in all relationships, you get back what you give. Click here to learn more about participant-friendly ePRO from OpenClinica.

 

Resources

https://www.formassembly.com/blog/web-form-design/
https://www.ventureharbour.com/form-design-best-practices/
https://www.nngroup.com/articles/web-form-design/
https://medium.theuxblog.com/10-best-practices-for-designing-user-friendly-forms-fa0ba7c3e01f

Save the Date! OC17, December 4th and 5th, in Amsterdam

OC17, OpenClinica’s 9th Annual Global Conference, will take place in Amsterdam on December 4th and 5th this year. This year’s theme? “Making the Complex Simple.”

We will offer in-person Super User training from December 6th through the 8th.

Exact venue, times, pricing, and official call for presentations all coming soon. In the meantime, please let us know if you’re interested in attending!