Digital Protocols Are at an Inflection Point: A Conversation With Novartis and TransCelerate Leaders
A conversation with Rob DiCicco, vice president of portfolio management, TransCelerate BioPharma, and Bill Illis, executive director, technology & scientific computing, Novartis

Interest in digital approaches to clinical protocols has grown steadily as sponsors, solution providers, sites, and regulators explore ways to make study planning and execution more efficient. Organizations are beginning to look at how standardized, structured study definitions — supported through industry initiatives, controlled terminology, and data models — help reduce manual effort and enable faster, more consistent downstream study execution, reporting, and submission processes.
In this conversation, Rob DiCicco, vice president of portfolio management at TransCelerate, and Bill Illis, executive director, technology & scientific computing, Advanced Quantitative Sciences at Novartis, and workstream leader of the Digital Data Flow (DDF) initiative at TransCelerate, share their observations on industry momentum, early innovations, and what they expect to see over the next few years.
Clinical Leader: How has industry engagement with digital protocol standards evolved? What signals or feedback make you feel the ecosystem is moving in the right direction?
Bill Illis: Looking back over the past couple of years, we’re seeing steady acceleration and signs of growing momentum. One clear indicator is attendance at our annual DDF events. Three years ago, we had about 50 attendees; the next year, around 120; and this year, roughly 240, plus a waiting list. That alone suggests interest is growing.
We’ve also gone from zero to six sponsors who have publicly presented their adoption stories. Beyond those, we’re aware that the majority of TransCelerate member companies, along with several non-member sponsors, are working on adoption projects of their own.
On the technology provider side, our Solution Collaboration Forum has grown from seven founding members to more than 45 technology companies. That’s important because participation from the tech sector is essential; the forum helps demonstrate to these companies the benefits of making their solutions compatible with digital protocol standards.
Regulators are also moving forward with their own efforts. The ICH M11 protocol guideline, protocol template, and technical specification were formally approved by the ICH Assembly last month. M11 establishes a globally harmonized structure for protocol content, and CDISC provides the controlled terminology used within that structure. From 2022 to 2024, the Unified Study Definitions Model (USDM), one of the critical components of DDF, was developed by CDISC with TransCelerate’s thought leadership and financial support, and with input and testing from sponsors and tech companies. In 2025, it was further revised to be compatible with the ICH M11 technical specification. This compatibility allows sponsors to represent protocol information in a machine-readable format that aligns with the newly approved guideline.
Against that backdrop, the release of version 4 of the USDM earlier this year was also a key milestone. In earlier years, people often said, “This makes sense, but the model is still evolving.” With version 4, we now have a mature, stable model that people feel confident implementing, and that shift is critical for organizations that have been waiting for stability before moving forward.
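To make the idea of a machine-readable protocol concrete, here is a deliberately simplified sketch of a structured study definition in Python. The field names are hypothetical placeholders inspired by USDM concepts, not the actual USDM schema, which CDISC publishes as a full class model with controlled terminology.

```python
# Illustrative only: a toy, USDM-inspired study definition. Every field
# name here is a simplified placeholder, not the real CDISC schema.
study_definition = {
    "study": {
        "name": "XYZ-301",  # hypothetical study identifier
        "versions": [{
            "titles": [{"type": "Official", "text": "A Phase 3 Study of Drug X"}],
            "designs": [{
                "arms": ["Drug X", "Placebo"],
                "eligibilityCriteria": [
                    {"category": "Inclusion", "text": "Age >= 18 years"},
                ],
                "activities": ["Informed consent", "Vital signs", "ECG"],
                "encounters": ["Screening", "Baseline", "Week 4"],
            }],
        }],
    }
}

# Any system that understands the shared model can read the design
# directly, with no re-keying from a document.
print(study_definition["study"]["versions"][0]["designs"][0]["encounters"])
```

The point is not the particular fields but the shift: once the protocol lives as structured data, downstream systems consume it directly instead of re-keying it from a PDF.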
Rob DiCicco: I’d add that we’re not just reaching more companies; we’re reaching more diverse roles within those companies. Early on, most of the interest came from standards experts, data managers, data scientists, and programmers. More recently, at SCOPE EU, medical writers attended specifically to learn about this work. A European medical writing association has also reached out to CDISC to learn more. Engagement has broadened significantly across functions.
What unexpected challenges or lessons learned have organizations encountered as they move from proof-of-concept to operationalizing digital protocols?
Illis: Whether they’re unexpected or not, there are definitely challenges. One that repeatedly comes up is change management. It’s not just about the technology; it’s about business processes, roles, and skills. Historically, clinical development functions have operated with a high degree of independence. A head of data management might implement a new system or processes without extensive cross-functional coordination, and clinical operations, biostatistics, or regulatory affairs could do the same.
Digital protocols change that. They introduce a single source of truth that ties together downstream processes across multiple functions. That requires closer collaboration: sharing study metadata and processes in ways that may not exist today. Depending on whether an organization is centralized or decentralized, that behavioral shift can be significant.
Technology readiness is another challenge. Other industries have demonstrated that a foundation of standards is an important enabler of data interoperability across systems. For a long time, we saw a “chicken and egg” dynamic: sponsors asking whether vendors would adapt their products, and vendors asking whether sponsors would license and use them. Getting both sides moving at the same time has been a challenge, but we’re now seeing that gap begin to close.
DiCicco: In a world where all the tooling already existed, uptake might be very quick. But we’re in a place where tools are being developed in parallel with the standards. Smaller companies are innovating based on what they see coming out of this. Sponsors still need tooling — either from existing tech partners or new ones — to put digital protocols into practice. So, there’s a time lag. But we’ve clearly helped spark a lot of innovation.
Illis: Another lesson has been the need for additional support structures. Sponsors need clear visibility into solution provider progress, so we created the Solution Directory, which allows vendors to post information about their solutions. Vendors asked for ideas around potential use cases, so we developed the use case index and library, and we invite people to provide feedback on those. And both sponsors and vendors needed a way to check conformance with the standards, so we worked with CDISC, which developed conformance rules. Those rules were released in written form with version 4, and we’re working with CDISC to create an executable set so people can download the code and confirm whether their implementation conforms.
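As a rough sketch of what an executable conformance rule could look like, the example below validates a study definition for required fields. The rule IDs and field names are invented for illustration; the authoritative conformance rules are those published by CDISC.

```python
# Hypothetical sketch of an executable conformance check. The rule IDs
# and field names are made up for illustration purposes.
def check_required_fields(study: dict) -> list[str]:
    """Return human-readable findings; an empty list means no issues found."""
    findings = []
    if not study.get("study", {}).get("name"):
        findings.append("RULE-001: study name is required")
    for version in study.get("study", {}).get("versions", []):
        if not version.get("titles"):
            findings.append("RULE-002: each study version needs a title")
    return findings

# A definition missing both fields triggers both findings.
print(check_required_fields({"study": {"name": "", "versions": [{}]}}))
```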
DiCicco: Tech company input has informed earlier releases of the USDM, the Study Definitions Repository (SDR), and APIs, including enhancements that support automation use cases. The sector has also expressed interest in more training tailored to their needs. While TransCelerate does not offer vendor-specific or system-specific training, the Solution Collaboration Forum allows participating vendors to exchange perspectives and learn from one another as the ecosystem evolves.
Illis: And more broadly, we’re trying to create the conditions for innovation that supports adoption of the standards. One example is sponsoring challenge events that showcase the use of standardized digital protocols. CDISC ran a challenge using AI to convert PDF protocols into a digital format. We’re running a new challenge with SCOPE that looks at how digital protocols could improve the protocol review process for sponsors, sites, ethics committees, and regulators. We’re not delivering the solutions ourselves but enabling a framework for the community to test ideas, demonstrate approaches, and build momentum.
How do you see AI influencing the need for digital protocols or shaping digitalization going forward?
Illis: There are two main areas. One is converting historical protocols — studies that are ongoing or completed — into a digital format. That’s an active area of experimentation, including whether AI can help convert traditional document-based protocols to digital ones. If that proves robust, it could even lower the change barrier for medical writers by converting protocols they author today into a structured digital format.
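A minimal sketch of that conversion idea, under stated assumptions: extract text from a protocol PDF, ask a language model for structured fields, and validate the output before accepting it. The `call_llm` function below is a placeholder for whichever model API an organization uses, and the extracted field names are illustrative.

```python
import json
from pypdf import PdfReader  # pip install pypdf

def extract_protocol_text(path: str) -> str:
    """Pull raw text from a document-based protocol PDF."""
    reader = PdfReader(path)
    return "\n".join(page.extract_text() or "" for page in reader.pages)

def call_llm(prompt: str) -> str:
    """Placeholder: plug in your organization's model API here."""
    raise NotImplementedError

def convert_to_structured(path: str) -> dict:
    text = extract_protocol_text(path)
    prompt = (
        "Extract the study title, phase, and inclusion criteria from this "
        "protocol and return them as JSON:\n" + text[:20000]
    )
    candidate = json.loads(call_llm(prompt))
    # Never trust model output blindly: validate before accepting.
    for key in ("title", "phase", "inclusionCriteria"):
        if key not in candidate:
            raise ValueError(f"model output missing required field: {key}")
    return candidate
```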
The second area is using AI once you have a repository of digital protocols. Structured, interpretable data is essential for effective AI. If you don’t have good data going in, you get inaccurate results or hallucinations. With standardized digital protocol data, AI models can drive insights for future study designs and create a much stronger analytical base. We sometimes hear, “If I have AI, why do I need digital protocols?” Our view is the opposite; they’re complementary.
DiCicco: I believe organizations will get value faster from their AI investments if they leverage digital protocol tools to structure and standardize their data. The same principle holds in other industries where AI has successfully driven efficiency. They can spend more time deriving insights rather than moving information around. It also connects with areas organizations are already exploring, such as using AI for regulatory document generation and supporting greater consistency across regulatory document workflows as programs progress.
What is the overarching value proposition or business case for adopting digital protocols?
Illis: Many people ask for a concise explanation of the business case. Conceptually, I think about it across several dimensions. One is study design: better benchmarking, computation, and analytics can lead to more effective designs. Another is the protocol review process. There’s an opportunity to create views tailored to different subject matter experts — clinical, safety, regulatory, and statistical — supported by consistency checks and other enhancements.
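One example of the consistency checks Illis mentions becomes straightforward once the protocol is structured data. The hypothetical check below flags scheduled activities that were never declared in the protocol's activity list; the field names are illustrative, not taken from the USDM.

```python
# Hedged example of an automated review check on a structured protocol:
# flag any scheduled activity missing from the declared activity list.
def find_undeclared_activities(protocol: dict) -> list[str]:
    declared = set(protocol.get("activities", []))
    scheduled = {item["activity"] for item in protocol.get("scheduledActivities", [])}
    return sorted(scheduled - declared)

example = {
    "activities": ["Vital signs"],
    "scheduledActivities": [
        {"activity": "Vital signs", "encounters": ["Baseline"]},
        {"activity": "ECG", "encounters": ["Week 4"]},  # never declared above
    ],
}
print(find_undeclared_activities(example))  # ['ECG']
```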
There’s also streamlined study start-up. Today, many systems, including data collection systems, trial management systems, and systems at clinical sites, rely on protocol information that has to be manually typed in from documents. Automating that from a digital protocol can remove a lot of manual effort. And then there’s end-to-end data flow: helping ensure that the data specified for collection makes its way more efficiently toward analysis and submission.
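To illustrate that start-up automation, the sketch below derives a visit-by-form matrix (the kind of setup information a data collection system needs) directly from a structured protocol instead of from retyped documents. As before, the field names are simplified placeholders, not the actual model.

```python
# Illustrative only: deriving data-collection setup from a structured
# protocol rather than retyping it. Field names are hypothetical.
protocol = {
    "encounters": ["Screening", "Baseline", "Week 4"],
    "scheduledActivities": [
        {"activity": "Vital signs", "encounters": ["Screening", "Baseline", "Week 4"]},
        {"activity": "ECG", "encounters": ["Screening", "Week 4"]},
    ],
}

def build_visit_matrix(p: dict) -> dict[str, list[str]]:
    """Map each encounter to the activities (future data-entry forms) it requires."""
    matrix: dict[str, list[str]] = {enc: [] for enc in p["encounters"]}
    for item in p["scheduledActivities"]:
        for enc in item["encounters"]:
            matrix[enc].append(item["activity"])
    return matrix

for visit, forms in build_visit_matrix(protocol).items():
    print(visit, "->", forms)
```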
We don’t yet have a documented set of quantified benefits. We’re aiming for cycle times shifting from months to weeks, and from weeks to days, as well as quality improvements and efficiency gains, all driven by the availability of standardized, executable, machine-readable study protocol data. Over time, we aim to collect benefits data from early adopters and break it down more systematically.
DiCicco: I’ll borrow a phrase I heard recently from a discussion about scaling community-based research. One of the speakers talked about moving toward automated workflows where “the protocol is the operating system.” I don’t think he meant the traditional PDF or Word document when he said “protocol.” He meant the information inside the protocol sitting in a digital tool — whether that’s our Study Definitions Repository or another platform with similar capabilities.
What we’re really talking about is automated workflows across sponsors, investigators, and third parties, with the protocol data acting as the operating system for those workflows. For anyone even moderately technically savvy, that idea captures the value very clearly. But that can only happen with widespread adoption of standards that harmonize terminology and data models.
For organizations that want to start with digital protocols but don’t know where to begin, what do you recommend?
Illis: A good starting point is the DDF Practical Approach to Implementation document we recently published. It offers guidance on how to build a business case and get started.
From there, I’d recommend defining a cross-functional team. Understand which functions in your organization will be impacted — clinical, data management, medical writing, biostatistics, regulatory, IT, etc. — and build awareness and understanding across that group.
Then develop a strategy. There isn’t one pathway. Some organizations may do de novo digital authoring; others may start by converting PDF protocols, building a repository of historical protocols, or digitizing specific components such as the schedule of activities or inclusion/exclusion criteria.
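For a sense of what digitizing a single component first might look like, here is a hypothetical representation of inclusion/exclusion criteria as structured records rather than free text. The identifier scheme and fields are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    category: str    # "Inclusion" or "Exclusion"
    identifier: str  # e.g., "IN01"; the numbering scheme is illustrative
    text: str        # the human-readable criterion

criteria = [
    Criterion("Inclusion", "IN01", "Age >= 18 years at screening"),
    Criterion("Exclusion", "EX01", "Prior exposure to study drug"),
]

# Once structured, a component becomes queryable and reusable across studies.
inclusion_ids = [c.identifier for c in criteria if c.category == "Inclusion"]
print(inclusion_ids)  # ['IN01']
```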
Prioritize use cases. We have identified more than 25 use cases today, and in reality, there are probably twice as many. No one is going to implement all of those at once. Prioritize what’s most important to you, then expand.
And finally, invest in training. CDISC has technical training available, and we’re about to release more business-oriented training. People need to understand the standards in depth. I often describe a change curve: from awareness to understanding, to commitment, to action. Training is essential to moving along that curve.
Looking ahead, what do you expect the next two to three years to look like for digital protocols?
Illis: I tend to frame this in terms of an adoption journey. There’s a book called Crossing the Chasm that describes how innovations are adopted, starting with innovators and early adopters, then moving toward the early majority and beyond. There’s an inflection point where innovation becomes the expected way of working. The shift is when people stop asking, “Should I adopt digital protocols?” and instead say, “Everyone is doing this. I’d be crazy not to.” The question becomes when and how, not if.
DiCicco: A big part of that is stability. People have been waiting for the point where the standards are mature enough that they can invest without worrying about constant retooling. With USDM version 4 out and no major new release planned for 2026 — beyond bug fixes and alignment with ICH M11 — we’ve hit that inflection point. If you implement now, you don’t have to worry that a major new version is coming in a few months. It signals that the content is stable and good enough to use now. Making sure that we are supporting implementation and listening to feedback on how to improve will be critical. Likewise, publishing both successes and failures will be a catalyst to help the broader community.
About the Authors:

Bill Illis is executive director, technology & scientific computing, Advanced Quantitative Sciences, at Novartis. He is responsible for developing and implementing the analytics technology strategy focused on end-to-end clinical data flows, systems, and processes. For the past several years, he has served as the workstream lead for the TransCelerate Digital Data Flow initiative, advancing digital protocol solutions and supporting the development of standards in collaboration with CDISC. His previous experience includes roles at Ciba-Geigy Pharmaceuticals and Dun & Bradstreet. He holds an M.P.H. in biostatistics from the University of Michigan and a B.A. in psychology from Providence College.