How Regulatory Convergence Is Rewriting Clinical Data Management
By Life Science Connect Editorial Staff

Clinical trial data management is entering a period of accelerated transformation, driven by evolving global regulations and coordinated industry efforts to modernize how trials are designed, executed, and reviewed. Recent regulatory updates, including ICH E6(R3) and ICH M11, signal a decisive move away from document-centric, manually interpreted processes toward approaches that emphasize risk-based oversight, proportionality, and digital-first protocol design.
This shift reflects a growing acknowledgment of a long-standing challenge: Traditional trial execution relies heavily on human interpretation of narrative protocols. That reliance introduces avoidable ambiguity, increases operational risk, and slows timelines. Manual interpretation not only burdens sponsors and sites but also complicates regulators’ ability to clearly assess how study intent translates into execution and data outcomes.
During a Clinical Leader Live, panelists from Regeneron, AbbVie, and CDISC provided insight into the evolving regulatory expectations surrounding data collection and compliance, as well as how to leverage proactive understanding to improve oversight and safety. Speakers for the event included:
- Julie Smiley, vice president of data sciences, Clinical Data Interchange Standards Consortium (CDISC)
- Joe Fitzgerald, head of clinical data processing and reporting, Regeneron
- Anne Hale, head of study risk management and central monitoring, AbbVie
Ultimately, regulators and standards organizations are advancing a shared vision of machine-readable trial design that supports transparency, traceability, and continuous quality oversight across the clinical development lifecycle. Taken together, these developments underscore the importance of expressing clinical intent in structured, computable forms that enable consistent interpretation, stronger oversight, and more predictable trial execution.
Converging Standards: M11, USDM, and HL7 FHIR
At the center of this regulatory transformation is the convergence of ICH M11, the Unified Study Definitions Model (USDM), and HL7 FHIR. Together, these standards form the technical and conceptual backbone of modern clinical trial design. ICH M11 establishes a harmonized, standardized structure for clinical protocols, reducing variability in how study objectives, endpoints, and procedures are documented. “While the USDM is more of a foundational model that [supports ICH M11’s] structure with machine-readable metadata, ICH M11 is more of a template,” Smiley said. “When you layer that with ICH E6(R3), that obviously emphasizes risk-based quality management (RBQM) and then that connection becomes much more powerful. These structured protocol elements really enable proactive identification of quality factors, such as better traceability from the study intent to data collection, tabulation, and analysis.”
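Smiley's point about traceability from study intent through data collection and analysis can be sketched as a chain of linked, machine-readable protocol elements. The structure below is a simplified illustration only — the identifiers and field names are hypothetical stand-ins, not the actual USDM class model:

```python
# Illustrative traceability chain from study intent to data collection.
# A simplified stand-in for structured protocol metadata, not the real
# USDM class model; all names and IDs are hypothetical.

objective = {"id": "OBJ-1", "text": "Evaluate effect on blood pressure"}
endpoint = {"id": "EP-1", "objective_id": "OBJ-1",
            "text": "Change in systolic BP at Week 4"}
activity = {"id": "ACT-1", "endpoint_id": "EP-1",
            "name": "Vital Signs", "data_points": ["systolic_bp"]}

def trace(data_point, activities, endpoints, objectives):
    """Walk from a collected data point back to the objective it serves."""
    for act in activities:
        if data_point in act["data_points"]:
            ep = next(e for e in endpoints if e["id"] == act["endpoint_id"])
            obj = next(o for o in objectives if o["id"] == ep["objective_id"])
            return [act["id"], ep["id"], obj["id"]]
    return None

print(trace("systolic_bp", [activity], [endpoint], [objective]))
# → ['ACT-1', 'EP-1', 'OBJ-1']
```

Because each element carries an explicit link to the one above it, the same machine-readable chain that documents intent can also be queried during conduct, which is what makes proactive identification of quality factors possible.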
HL7 FHIR serves as the interoperability layer, allowing this structured protocol information to move across systems used by sponsors, CROs, sites, and regulators. This alignment represents a fundamental change in how protocols function. Rather than serving solely as static reference documents, protocols become executable blueprints that directly inform downstream processes such as data collection, tabulation, analysis, and submission. “For those involved in data management, you read a protocol, you interpret, you look at your schedule of activities, and then you build specifications for your EDC system, for example,” Smiley said. “With the digital protocols stored in the USDM, you can actually automate all of that. Now, obviously some systems have proprietary elements for which you may have to define certain rules, or even edit checks that you already likely have standards for, but the digital protocol does enable a lot more automation end to end and takes out a lot of that manual interpretation.”
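The read-interpret-build step Smiley describes can be replaced by a derivation once the schedule of activities exists in structured form. The sketch below assumes a simplified, hypothetical digital protocol — the `schedule_of_activities` layout is illustrative, not the actual USDM schema — and derives per-visit EDC form specifications from it:

```python
# Minimal sketch: deriving EDC form specifications from a digital,
# USDM-style protocol. Field names are illustrative, not the real schema.

# A simplified digital protocol: each scheduled activity names the
# visit it occurs at and the data points it collects.
digital_protocol = {
    "study_id": "XYZ-001",
    "schedule_of_activities": [
        {"visit": "Screening", "activity": "Vital Signs",
         "data_points": ["systolic_bp", "diastolic_bp", "heart_rate"]},
        {"visit": "Screening", "activity": "Demographics",
         "data_points": ["birth_date", "sex"]},
        {"visit": "Week 4", "activity": "Vital Signs",
         "data_points": ["systolic_bp", "diastolic_bp", "heart_rate"]},
    ],
}

def build_edc_spec(protocol):
    """Group scheduled activities into per-visit EDC form specifications,
    replacing manual interpretation of the protocol with a derivation."""
    forms = {}
    for entry in protocol["schedule_of_activities"]:
        form = forms.setdefault(entry["visit"],
                                {"visit": entry["visit"], "fields": []})
        for dp in entry["data_points"]:
            field_id = f'{entry["activity"]}.{dp}'
            if field_id not in form["fields"]:
                form["fields"].append(field_id)
    return list(forms.values())

spec = build_edc_spec(digital_protocol)
for form in spec:
    print(form["visit"], form["fields"])
```

As Smiley notes, proprietary system elements and sponsor-standard edit checks would still be layered on top of such a derivation, but the interpretation step itself drops out.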
Supporting Risk-Based Quality by Design
The move toward structured, digital protocols directly supports the objectives of ICH E6(R3), which emphasizes RBQM and proportionate oversight. By embedding clarity and structure at the protocol level, sponsors can more easily trace study intent through execution, making it clearer how critical risks are identified and managed.
For regulators, this transparency improves confidence in how sponsors define critical-to-quality elements, establish quality tolerance limits, and implement corrective actions. For sponsors, it reduces reliance on exhaustive source data verification and routine monitoring activities that may not meaningfully improve quality. Instead, oversight can focus on where it matters most. “I think that's really what the regulations are saying: Don't wait until the study's going on to say, ‘What's your critical-to-quality factors?’ It's really starting up front and setting those parameters up front,” Hale explained. “Really setting the strategy for ‘How are we going to review it? Who’s going to review it?’ And then really if we think about R3, it's really saying ‘How are we communicating to those sites?’ ‘What is critical?’ and ‘Where are we supporting them across this journey?’ I think that is the difference.”
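Setting parameters up front, as Hale describes, is exactly what makes quality tolerance limits checkable during conduct rather than reconstructed afterward. A minimal sketch, assuming hypothetical metric names and thresholds:

```python
# Illustrative sketch of RBQM-style quality tolerance limits (QTLs):
# thresholds defined up front at design time, with observed study
# metrics checked against them during conduct. All metric names and
# limit values here are hypothetical.

quality_tolerance_limits = {
    "premature_discontinuation_rate": 0.15,  # max acceptable proportion
    "missing_primary_endpoint_rate": 0.10,
    "major_protocol_deviation_rate": 0.05,
}

observed_metrics = {
    "premature_discontinuation_rate": 0.12,
    "missing_primary_endpoint_rate": 0.13,   # exceeds its limit
    "major_protocol_deviation_rate": 0.02,
}

def check_qtls(limits, observed):
    """Return the metrics whose observed values breach their predefined limit."""
    return {m: (v, limits[m]) for m, v in observed.items() if v > limits[m]}

breaches = check_qtls(quality_tolerance_limits, observed_metrics)
for metric, (value, limit) in breaches.items():
    print(f"QTL breach: {metric} = {value:.2f} (limit {limit:.2f})")
```

The design choice mirrors the guidance: the limits are a declared artifact of study design, so a breach is a defined event with a defined response, not a judgment call made mid-study.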
This shift reflects a broader regulatory evolution away from procedural box-checking and toward quality by design. Regulators are increasingly concerned not with how much data is collected or how often sites are visited but with whether risks are understood, documented, and proportionately managed.
Enabling End-to-End Digital Data Flows
Parallel to regulatory guidance, initiatives led by organizations such as CDISC are advancing digital, end-to-end data flows that reduce ambiguity and minimize errors throughout the trial lifecycle. By leveraging structured metadata, executable conformance rules, and interoperable standards, these initiatives aim to close long-standing gaps between protocol design, trial execution, and regulatory submission.
When protocol definitions, case report forms, and submission datasets are aligned digitally from the outset, discrepancies are reduced and downstream reconciliation effort is minimized. Continuous quality monitoring becomes feasible, enabling earlier detection of deviations and reducing the need for retrospective remediation. These capabilities are increasingly essential as trials incorporate more data sources beyond traditional EDC systems, including digital health technologies, labs, imaging platforms, and real-world data.
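One way to picture an executable conformance rule of the kind described above: when the protocol's expected data points exist in structured form, collected datasets can be checked against them continuously rather than reconciled retrospectively. The data-point names below are illustrative assumptions:

```python
# Sketch of an executable conformance rule: compare the data points the
# structured protocol defines against the columns a dataset actually
# contains, flagging discrepancies as they arise. Names are illustrative.

protocol_data_points = {"systolic_bp", "diastolic_bp", "heart_rate",
                        "birth_date"}

collected_columns = {"systolic_bp", "diastolic_bp", "pulse", "birth_date"}

def conformance_report(expected, actual):
    """Flag data points the protocol defines but the dataset lacks,
    and columns collected without a protocol definition."""
    return {
        "missing": sorted(expected - actual),     # defined but not collected
        "unexpected": sorted(actual - expected),  # collected but not defined
    }

report = conformance_report(protocol_data_points, collected_columns)
print(report)
# → {'missing': ['heart_rate'], 'unexpected': ['pulse']}
```

The same pattern extends to sources beyond EDC — digital health technologies, labs, imaging, real-world data — wherever the expected structure is machine-readable.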
Beyond Technology: Implications for the Industry
While the technical foundations for digital trials are rapidly maturing, adoption requires more than system upgrades. Organizations must adapt to a regulatory environment that expects intentional design, documented risk rationale, and transparency across the trial lifecycle. “The good thing is, regulators are telling us, especially through the lens of ICH, that it's okay to not do everything,” Fitzgerald said. “It has been [reinforced] for 20-plus years that we have to do everything in order to have high quality or to say that we can fully stand behind the integrity of our data, and you just don't need to do that. It's not necessary from a statistical mathematical standpoint. It's not necessary from a design standpoint. And it's definitely not their expectation any longer.”
Sponsors, CROs, and technology partners face a common challenge: realigning legacy systems and processes to fully realize the benefits of structured protocols and interoperable data. Those that succeed will be better positioned to earn regulator confidence, improve operational efficiency, and support faster development timelines. The digitization of clinical trial design is not simply a modernization effort; it is a redefinition of how evidence is generated, managed, and trusted.