By Ed Miseta, Chief Editor, Clinical Leader
Pharmaceutical companies, CROs, and software/application vendors with operations in the European Union (EU) may be at serious risk from the new EU General Data Protection Regulation (GDPR). This includes companies that are already compliant with HIPAA (the Health Insurance Portability and Accountability Act). Violations of the regulation, which is set to go into effect in 2018, can result in penalties of up to €20 million or 4 percent of worldwide annual revenue, whichever is greater.
“I believe this is a critical topic for the pharma industry,” says Susan Shelby, Sr. VP of clinical operations for Biomedical Systems, a clinical research organization. “It will have a significant impact on the industry. I’m convinced companies are not prepared for it, the penalties are steep, and it doesn’t seem enough people are discussing it.”
There was only one session dealing with the topic at DIA’s Annual Meeting in June, where I had the opportunity to speak with Shelby and Peter Alterman, COO of the SAFE-BioPharma Association. The lack of coverage was a concern for both, considering the confusion that seems to exist and the initiation date being less than 11 months away.
Replacing The Data Protection Directive
After four years of preparation, the GDPR was approved by the EU Parliament in April 2016. The heavy fines will be imposed on those companies not in compliance with the regulation on or before the enforcement date of May 25, 2018. The GDPR replaces the Data Protection Directive 95/46/EC and was designed to harmonize data privacy laws across Europe. The goal is to protect EU citizens from privacy and data breaches, as well as reshape the way organizations across the region approach data privacy.
The regulation introduces three key changes:
1. Increased Territorial Scope
GDPR applies to all companies processing the personal data of subjects residing in the EU, regardless of the company’s location. Previously, the territorial applicability of the directive was ambiguous and referred to data processing “in the context of an establishment,” a topic that has arisen in a number of high-profile court cases. GDPR makes the applicability clear by noting the rules apply to the processing of personal data by controllers and processors in the EU, regardless of whether the processing takes place in the EU or not.
2. Increased Penalties
As noted earlier, organizations in breach of GDPR can be fined up to 4 percent of annual global revenue or €20 million, whichever is greater. This is the maximum fine that can be imposed for the most serious infringements, such as not having sufficient customer consent to process data.
3. Improved Patient Consent
The conditions for consent have been strengthened, and companies will no longer be able to use long, illegible terms and conditions full of legal jargon. Under GDPR, the request for consent must be given in an intelligible and easily accessible form, with the purpose of data processing attached to that consent. Consent must be clear and distinguishable from other matters and provided using clear and plain language. Additionally, GDPR stipulates that it must be as easy to withdraw consent as it is to give it.
Reason For Concern
Considering the language contained in the regulation, as well as the penalties for non-compliance, it’s understandable why Shelby believes the industry should be concerned. The criteria for clinical data de-identification and anonymization have been described and well-reviewed for simple cases, such as data collection using an eCRF (electronic case report form). CROs can automatically de-identify the patients’ data at the site, offset dates of birth, and redact sensitive data.
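The de-identification steps described above — removing direct identifiers, offsetting dates, and redacting sensitive free text — can be sketched in code. This is a minimal illustration, not any CRO's actual pipeline; the field names, hashing scheme, and per-subject date offset are all assumptions made for the example:

```python
import hashlib
from datetime import date, timedelta

# Illustrative eCRF record; the field names are assumptions for this sketch.
record = {
    "patient_initials": "JDS",
    "date_of_birth": date(1962, 3, 14),
    "visit_date": date(2017, 6, 2),
    "free_text_notes": "Patient reports dizziness.",
}

def deidentify(rec, study_salt: str, offset_days: int):
    """Replace direct identifiers with a pseudonym, shift dates by a
    per-subject offset, and redact free-text fields."""
    # Salted hash turns initials into a stable, non-identifying subject ID.
    pseudonym = hashlib.sha256(
        (study_salt + rec["patient_initials"]).encode()
    ).hexdigest()[:12]
    shift = timedelta(days=offset_days)
    return {
        "subject_id": pseudonym,
        # Shifting both dates by the same offset hides the true calendar
        # dates while preserving intervals (e.g., age at visit).
        "date_of_birth": rec["date_of_birth"] + shift,
        "visit_date": rec["visit_date"] + shift,
        "free_text_notes": "[REDACTED]",
    }

clean = deidentify(record, study_salt="STUDY-001", offset_days=37)
```

Because both dates move by the same offset, the interval between birth and visit — the scientifically meaningful quantity — survives de-identification intact.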
Of genuine concern, however, is the transmission of so-called “non-CRF data.” These are the data from EEGs (electroencephalograms), pathology slides, ECGs (electrocardiograms), echocardiography data, imaging data, and specialty laboratory data. These data typically are not directly input into the eCRF, but are sent out for expert analysis. They contain sensitive data necessary to the scientific value of the study. Compliance in these situations will require hospitals, medical device companies, and software vendors to identify appropriate de-identification methods; simple redaction of data may not always be the correct solution.
Many hospitals, medical device companies, and software vendors are not in compliance, and the time available to take corrective action is running out.
“Data integrity and data quality are critical issues for any company performing clinical research,” says Shelby. “But at the same time this industry is global in reach and there are no boundaries as far as where data is collected, processed, and transmitted prior to being integrated into a global clinical trial database. Data goes everywhere and with the technology that exists today, it also moves very quickly.”
Moving data has always been a complicated process for Biomedical Systems. Shelby notes data could be collected in France, Germany, or other EU nations, and she was always aware of the local laws in those countries. The data was then transmitted to the company’s Belgian headquarters where it was processed. It was then transmitted to its U.S. headquarters where the data set was assembled and eventually submitted to a sponsor company.
With GDPR, Shelby wonders how companies operating in the EU will continue to conduct trials. Although the regulation is being discussed by consortiums such as CTTI (Clinical Trials Transformation Initiative) and TransCelerate, Shelby believes the companies impacted by it are struggling to understand the requirements. Having worked for over 20 years on the sponsor side of the business, she knows that when designing clinical trials, personnel are aware of what scientific data they need to collect. She sees new laws like this flying in the face of what highly specialized personnel are required to do with the data.
Privacy And Automation Create Problems
Shelby states that in designing clinical trials and their endpoints, researchers are aware of what data needs to be collected. She believes GDPR will make it difficult for companies to get the required data and to share it holistically with the researchers who need to interpret it.
“This new regulation is not simply about an inappropriate release of data, such as a hack or a breach,” explains Alterman. “It’s also about protection for expected and programmed releases of data, such as companies using data in ways that the patient did not realize it could be used.”
Alterman believes a big part of this problem is cultural in nature. In Europe, all personal information is viewed as belonging to the patient, and data privacy is considered a human right. Therefore, patients have the right to control their data. For that reason, patients have to opt into the data collection process; they do not need to opt out of it. But GDPR takes the issue a step further, stating patients also have the right to take that data away from companies by having it deleted.
Therein lies a major issue. According to GCP (Good Clinical Practice) and federal regulations, companies are required to have audit trails. They are also not permitted to delete data they have collected. GDPR does not address what will happen when a company is pinned between federal regulations on one side, and patients wanting to erase their data on the other. “This new regulation will put many companies in conflict with other laws,” states Shelby. “Companies are required to keep a complete audit trail, but now must also delete data from those patients who have a right to be forgotten.”
Culture is not the only issue. Automation is complicating the problem. At one time, a trial master file (TMF) would sit on a shelf in an administrator’s office. It now exists electronically as an eTMF on a network.
“When information existed on a piece of paper, it was easily controllable,” says Alterman. “When information of any kind is in a data file, especially in a system connected to the Internet, you don’t have that same level of control. Backup copies could reside with statisticians located in other countries. Although the data is controlled, it is also decentralized. Researchers generally do not reside in the same country where the trial was conducted. This creates a quandary: Once that data is used and backed up in different locations, pulling it back from every location could be an impossible task.”
Trusted Identity Enters The Picture
When systems are connected and exchanging information, the only way to know who is party to a transaction between an individual and the data is through credentials (verifying users’ identities through passwords and other identifying information).
“When you have personally identifiable data subject to regulatory control, it is absolutely necessary to know who is logging into that information and creating copies of it,” states Alterman. “The only way you can do that is by setting up your system in such a way that it requires high assurance that the person logging in is who the credential says he or she is. Governments in the U.S. and Europe have set up a number of complex systems for certifying the issuers of credentials.”
Today, proving the identity of someone requires gathering a large amount of personal information. That could include taking their photo, recording their thumbprint, performing an iris scan, recording a driver license or passport number, and verifying personal information such as age and address. “It’s all about a trust information structure that starts with the credential issuer and ends with the application relying on the credentials,” states Alterman.
Where Do We Go From Here?
While the protection of data via credentials is important, it will not solve all of the problems that might result from the implementation of GDPR. Shelby notes conducting a clinical trial will always require certain individuals to have access to patient information in a timely fashion. “Going back to paper is not an option, and even if it was, it would not solve the problem of adhering to the requirements of GDPR,” she says. “A patient having the right to review and change their data is simply not acceptable in our current clinical setting.”
Is there a solution? Shelby believes we first need to have a lot more discussions that start with the industry think-tanks and include participation by the FDA, EMA, EU, and stakeholders such as sponsor companies, sites, CROs, and hardware/software vendors.
The first step should be clarifying what it means to have de-identified information. If a test is such that it identifies a patient (e.g., an ECG), then there is really no way to erase that information. It’s possible to remove the patient’s initials, a process known as pseudonymization. But removing other information such as gender or date of birth will impact trial results. For example, the age of a patient is critical in trials involving patients who are growing rapidly. The period from date of birth to the date of a test is how age is most accurately determined.
“The only way we will solve a problem like this is by allowing full disclosure of data among parties that have been vetted and deemed to be allowed to see and use it,” states Alterman. “That is the most logical approach. Right now all we have is a lot of confusion. The awareness of this regulation needs to be ramped up by orders of magnitude. The bureaucrats, lawyers, and regulators have to understand the implications of what their regulation will do to the conduct of clinical trials.”
“Eventually, someone has to explain what’s okay and what isn’t,” insists Shelby. “If we do not take that first step, eventually we will have non-attorneys trying to interpret the regulation. Those folks, in an attempt to comply and avoid fines, may institute procedures that are overly strict and impact scientific results. People who will have to live by this regulation need to know what is required. One solution would be to delay the implementation of the regulation for at least a year. There are issues that have to be ironed out, and I see this discussion going on for a long time.”