Massive Data, More Decentralization Drive Updates To ISPE GAMP Guide
A conversation with Frank Henrichmann, senior executive consultant, Q-FINITY Quality Management, and a lead author of the ISPE GAMP Good Practice Guide: Validation and Compliance of Computerized GCP Systems and Data – Good eClinical Practice (Second Edition)
As digital transformation is rapidly progressing and paper-based processes are becoming rare in clinical research, the International Society for Pharmaceutical Engineering (ISPE) has recognized the need for an updated guide that discusses and makes recommendations for risk-based systems validations related to Good Clinical Practice (GCP).
Lead author Frank Henrichmann of “ISPE GAMP Good Practice Guide: Validation and Compliance of Computerized GCP Systems and Data – Good eClinical Practice (Second Edition)” describes the changing clinical data environment and details updates that will support the work of various clinical trials stakeholders.
The second edition of the “GAMP Good Practice Guide: Computerized GCP Systems & Data” has been published. Why is now the right time for an update?
Frank Henrichmann: The way we design, plan, and conduct clinical trials today differs from how we did it 15, 10, or even five years ago. The pharmaceutical products developed through clinical trials have changed significantly and now include more innovative cell and gene therapies (CGTs). This requires dramatically different designs of the clinical trials, and the tools we use need to adequately support these novel designs and address new requirements.
The challenges faced during the COVID-19 pandemic required the industry to rethink its data collection and data management practices. Pandemic lockdowns limited the ability of study participants to visit clinical sites for treatments and examinations, accelerating the shift to decentralized data collection in participants' homes and via mobile clinics, wearables, sensors, and telemedicine. This created additional data integrity and data management challenges. While in traditional setups participants were typically an anonymous number, in these new setups personal data is collected and processed to ensure the delivery of equipment or investigational products and to enable telemedicine. These are just a few examples of what has changed in the years since we published the first edition of our guide. Of course, the regulatory authorities also reacted to these changes and issued guidance to address these new elements and challenges.
What are some of the new themes or topics covered in the second edition?
The extensive new guide covers more topics than the first edition. While the structure of the first edition remains the same, the content has been expanded to cover topics like decentralized trials, the evolution of data management into data science using AI solutions, and guidance for the generation of real-world evidence (RWE) from real-world data (RWD) that are available from electronic health record (EHR) systems, patient registries, and other unregulated data sources.
Additionally, the guide now covers oversight activities like audits and assessments and includes hands-on guidance and questions to consider when assessing computer systems used at a clinical site. It also provides guidance on data privacy in the context of clinical trials. And while validation of the AI/ML-enabled systems is covered in other ISPE guidance, considerations for AI-enabled systems used in clinical studies are specific to this comprehensive guide.
Today, more users are being brought into the fold of clinical trial processes — from pharma companies, CROs, technology service providers, and clinical research sites to trial participants. How does this new edition incorporate and address this growing body of end users?
With more parties interacting with the systems and the data during the collection and processing of clinical trial data, increased challenges with data integrity arise. For example, decentralized trials (DCTs) often use sensors and wearables to capture data directly from the participant. These data need to be transferred securely and accurately into the systems that manage the data of all trial participants, e.g., an EDC system.
Because these sensors and wearables are taking frequent readings, the volume of data being collected has grown exponentially. For example, in a traditional trial setup, glucose levels or an ECG may have been taken only during visits to the clinical site. With wearables and sensors, these readings may be taken with great accuracy on a much more frequent, almost continuous basis. The massive increase in data volume has transformed traditional data management activities into more of a data science activity, in which specialized tools so far used primarily in the analysis of Big Data are now applied to the study data to look for patterns or inconsistencies. Today, the industry is rapidly adopting AI solutions to manage and analyze these large data volumes faster so that the safety and well-being of the patient can be better protected, and critical decisions can be made faster on reliable and trustworthy data.
Clinical data often also originates from systems owned and operated at a clinical site. This may include instruments and equipment used in the daily care of patients as well as EHR systems in hospitals. If data that have been generated or processed by these systems are to be used within a clinical trial, the adequacy of these systems must be verified up front by the sponsor. This expectation creates some challenges considering that clinical trials often include hundreds of sites around the globe. Efficient yet reliable methods and processes need to be established to assure the necessary control and oversight while not overburdening the limited resources at clinical sites. The guide provides hands-on guidance and a list of aspects to be considered in the assessment of such systems at clinical sites that will help establish the required control.
Concerning the overlap these systems now have with those used at drug manufacturers and laboratories, which typically adhere to GMP and GLP, respectively, how does this guide assist users who operate in those adjacent spaces?
We have tried not to reinvent the wheel and focused our efforts on the systems directly connected to the conduct of the clinical trial. However, there is an interface to drug manufacturing, as the investigational drugs also need to be produced in appropriate but small quantities. The production of the investigational product needs to follow established GMP guidelines and expectations, and it has not been considered in detail within the guide as this is already covered in other ISPE guides. However, the investigational drug may need to be packaged and labeled differently than marketed drugs to establish and maintain the blinding of a clinical trial. This is a critical aspect in many clinical trials, as it ensures that neither the participant nor the investigator can determine whether a given participant is receiving the new investigational product, standard care, or a placebo. We have described the systems involved in this randomization process within our guide, outlined the major risks associated with the process and the supporting systems, and provided potential validation approaches.
Similarly, we have addressed the interface to laboratories that analyze samples taken from the participants. The validation of the systems used within the laboratory itself has not been within the scope of our guide, as it has already been covered in other ISPE guides. However, the specific requirements for transferring and managing these lab data in a GCP system while ensuring data integrity are described in the good clinical laboratory practice section of the guide.
New to this edition is a focus on “ensuring compliance with applicable regulations with a particular emphasis on data integrity and dataflows considering challenges through outsourcing services and technology.” What is the rationale behind including this added purpose?
GCP systems have a broader user base than most other regulated systems, including sponsors, clinical service providers like CROs, clinical sites, and trial participants. Due to that broad user base, it has become standard to outsource almost all the computer systems necessary to collect and manage clinical trial data. Specialized technology service providers have developed zero-footprint, web-based systems that can be accessed easily by all authorized users regardless of the local infrastructure or end-user device. All collected data is analyzed and processed by clinical research service providers like CROs and, of course, the sponsor. These data transfers and activities need to be assessed for potential data integrity risks. This is important, as data must be collected and analyzed accurately and contemporaneously to detect potential adverse events or a lack of efficacy. These data also often form the basis for submissions for approval to regulators. The quality and integrity of the data therefore have a potential impact on the health and well-being of the future users of the medicinal product.
Finally, when and where will the second edition be available?
After 18 months of intense work by a team of nearly 50 industry experts from sponsors, CROs, technology providers, and consultants, the guide was released by ISPE on July 31, 2024. However, this is not the end of our team's activities. We are currently planning a series of webinars on specific aspects covered in the guide, such as decentralized trials and AI solutions used in data management; these will be published at ISPE.org/webinars when available. Additionally, there will be presentations on similar topics at various ISPE conferences and events globally. As this area is evolving, we will address new developments in future articles for ISPE's Pharmaceutical Engineering Magazine and other ISPE publications.
About The Expert:
Frank Henrichmann, senior executive consultant at Q-FINITY Quality Management, is an expert in quality management, computer system validation, and compliance, especially in the context of clinical trials and pharmacovigilance. Over more than 22 years, he has gained extensive experience in strategies, projects, and measures for GxP-regulated environments at a CRO as well as a major pharmaceutical company. In his current position, he helps life sciences companies and supports technology providers to find innovative answers to quality and validation challenges. He is a Qualified Instructor with ISPE, a member of the ISPE Clinical Systems Special Interest Group (SIG), and a coauthor of the “ISPE GAMP® Good Practice Guide: Validation and Compliance of Computerized GCP Systems and Data.” He has been an ISPE member since 2001 and currently is the co-chair of the GAMP Global Steering Committee.