Guest Column | October 23, 2018

Blame Shouldn't Be Filed In The TMF: A Call For Trial Master File Process Improvement

By Thomas Cocciardi, LMK Clinical Research Consulting


“A bad system will beat a good person every time.”1 This quote from legendary management thinker W. Edwards Deming introduces the fundamental concept underlying process thinking. Process thinking, as the name implies, is a human factors-derived philosophy concerned with viewing the world through a process-oriented lens. Processes are the essential components of our systems that enable them to fulfill their purpose: any set of steps designed to achieve an objective can be considered a process. In short, a process is an objective-driven task, and the breadth of this definition reflects the abundance of processes in all areas of our lives, both inside and outside the workplace.

Although Deming and his contemporaries may have been more interested in the manufacturing of cars than in the creation and maintenance of an electronic Trial Master File (eTMF), both Deming’s quote and process thinking remain relevant to any industry whose success depends on managing the quality of complex systems. All types of processes, from baking cookies to clinical trial start-up to designing a quality control program to achieve TMF inspection readiness, can be analyzed and optimized from a process thinking perspective. Embracing process thinking doesn’t mean that humans are an unimportant part of systems. Instead, it helps us recognize that it’s more productive to design a system around humans than to force humans to conform to a system that doesn’t suit their needs. In the framework of clinical trial operations, it is hard to imagine any system that fails to meet the needs of its users more reliably than the eTMF.

Blame Game

As one of its core tenets, process thinking rejects the idea that the best way to solve every problem is to break it down into tiny constituent parts. Although there are many pitfalls associated with excessively pursuing reductionism, one of its most common and destructive forms is blame. I’ve written before about the assignment of blame; it’s a topic I’ve been thinking about since interviewing process improvement and human factors expert Keith Dorricott. In one of his blog posts, Dorricott sums up the sacrifice we make when we choose blame: “Blame is corrosive. Most people don’t want to open up in an environment where people are looking for a scapegoat so your chance of getting to root cause is much less.”2 Simply put, blame deprives us of the opportunity to learn from our mistakes.

In the TMF, the typical blame scenario takes this form: “If people only did their jobs correctly, the TMF would be in great shape and we wouldn’t be dealing with any of these issues.” Process thinking combats this damaging heuristic with the 85/15 rule. The 85/15 rule, developed by early quality improvement pioneer Dr. Joseph Juran, proposes that for any given problem in a system, 85 percent of the issue lies within the configuration of the process, while only 15 percent is under the control of employees.3 While the 85/15 rule isn’t rigorously backed by data, it is a principle designed to encourage us to ask “how” instead of “who” when faced with a problem. Process thinking forces us to recognize that most workers don’t have the authority and know-how necessary to fix a flawed process, especially on their own. Given the TMF’s human factors challenges and TMF stakeholders’ unbalanced focus on human culpability over process improvement, let’s investigate two common TMF problems that are unfairly attributed to poor human performance and analyze them through a process thinking lens.

Problem: Duplicate Documents

The Problem: Our TMF is filled with duplicate documents, which makes it hard to tell what’s been filed. Massive quantities of duplicates are being created when people file multiple iterations of the same document, either in the exact same artifact folder or in multiple artifact folders within the TMF. There are so many duplicates that nobody knows how to start the process of getting rid of them.

The Scapegoat: Our clinical trial assistants are lazy and don’t check whether a document has already been filed. Our CRAs keep sending us the same site documents because they don’t care about the TMF. The subject matter experts in statistics and data management don’t think using the TMF is part of their job, so they don’t file anything in the TMF until we force them to at the end of the study. The study managers don’t know much about the TMF and are reassigned to a new study before it becomes their problem.

Potential Process-Related Root Cause (one of the simplest ways to arrive at a root cause is to ask increasingly specific “why” questions):

  • Why do people file duplicate documents? They think a document should be filed but don’t realize it’s already in the TMF.
  • Why don’t they know what’s already filed in the TMF? They can’t remember every single document and didn’t choose to check what’s already there.
  • Why didn’t they check what’s already filed in the TMF? They are too busy, and it takes too long to check. Even if they do check, sometimes it’s not obvious what’s been filed.
  • Why does it take a long time to check what’s filed? Why isn’t it easy to see what’s already filed? The eTMF system makes it difficult to visually compare documents. Documents can contain the same information but still be different files. The document names are often incorrect or unclear.

Proposed Process Improvements: Unfortunately, eTMF systems offer no advantage over paper in the domain of duplicates. If making duplicates incurs a cost (such as carrying around more paper, spending more time filing, or waiting for the printer), duplicate production stays low. In current eTMF systems, the barrier to filing a document is quite low; in fact, it’s significantly easier to file a document than to read and understand the contents of an artifact folder. To eliminate duplicates, eTMF systems must make it easier to check a TMF’s contents and confront the human challenges of working with a nonphysical medium. This could mean restructuring the filing process to trade some convenience for accuracy. Potential improvements should include, at minimum, automated duplicate-checking functionality. Because duplicates breed more duplicates, the sooner a duplicate is identified and removed, the greater the overall reduction in total duplicates.

Any eTMF system that makes it easier to adopt naming conventions will also reduce duplicates. Finally, as a more forward-thinking improvement, eTMF developers should carefully consider how documents will be displayed as part of the user interface. In too many eTMF systems, browsing documents feels like running with blinders on. eTMF users want the same type of lucidity that comes with printing out documents and spreading them out on the conference room table.
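To illustrate how a system could make naming conventions easier to adopt, here is a sketch of a validator that could run at upload time and reject or flag nonconforming names. The convention itself (study_site_artifact_version) is hypothetical, invented for this example:

```python
import re

# Hypothetical convention: <STUDY>_<site-NNN>_<artifact>_<vN>.<ext>
# e.g. "ABC123_site-042_protocol-signature-page_v2.pdf"
NAME_PATTERN = re.compile(
    r"^(?P<study>[A-Z0-9]+)_"
    r"(?P<site>site-\d{3})_"
    r"(?P<artifact>[a-z0-9-]+)_"
    r"v(?P<version>\d+)\.(?P<ext>pdf|docx?)$"
)

def check_name(filename):
    """Return the parsed name parts if `filename` follows the convention,
    otherwise None (signalling the upload should be flagged)."""
    match = NAME_PATTERN.match(filename)
    return match.groupdict() if match else None
```

A check like this catches nonconforming names before they reach the TMF, which is far cheaper than reconciling inconsistent names during quality control.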

Problem: Late Documents

The Problem: The team rarely moves a document into the TMF before the filing deadline outlined in the TMF plan. Sometimes it takes months for a document to be filed. It seems documents are filed late when people are busy or not used to filing documents in the TMF. The study manager has been consistently mentioning the issue in our meetings but we’re still falling further behind.

The Scapegoat: Everyone on our team ignores the filing deadline in our TMF plan. The plan states documents must be filed within 10 days of finalization, and here we are calling sites for documents months after close-out. We have too much to do during the study to file documents shortly after their creation.

Potential Process-Related Root Cause:

  • Why are documents late? Documents are not filed within a predetermined time after their creation.
  • Why aren’t documents filed in a specified amount of time after creation? The document is “living,” and no final version has been produced. The document owner doesn’t have access to the TMF. There is nobody assigned to file the document. Filing the document is not a priority task. Nobody knows the document is late. Focus was shifted away from the document as soon as it was finalized but before it was filed.
  • Why are these barriers considered acceptable? We don’t really use the TMF to store our documents.

Proposed Process Improvements: Lack of contemporaneousness in a TMF is a multifaceted problem. In general, the industry’s response has been to reduce the overall time it takes to file a document (which likely contributes to the duplicate problem we now face). Yet the lack of contemporaneous filing persists even though filing a document in an eTMF is nearly instantaneous, so other factors must be contributing to the delays.

Lack of clearly delegated responsibilities, poor document handling practices, slow QC processes, and difficult access requirements often play a contributing role. However, out of all the contributing factors to lack of document contemporaneousness, “nobody knows the document is late” is probably the most frequently cited. eTMF systems have responded to this need with the introduction of document placeholders. Document placeholders are virtual markers within an eTMF that signify a document is expected but has not been filed yet. Document placeholders are useful, especially for site-level documents, since they can be efficiently cascaded across multiple sites. Unfortunately, placeholder features are often underutilized due to the burden of setting them up correctly. Given the challenges of most eTMFs, TMF experts are preoccupied with pressing quality control issues, making placeholders seem nice-to-have but nonessential.
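To illustrate how placeholders make lateness visible, the sketch below flags documents that have passed their filing window. The `Placeholder` structure is an assumption for this example; the 10-day window mirrors the TMF plan described above:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import List, Optional, Tuple

@dataclass
class Placeholder:
    artifact: str
    finalized_on: date          # date the document was finalized
    filed_on: Optional[date]    # date filed in the eTMF; None if still missing

def overdue(placeholders: List[Placeholder], as_of: date,
            window_days: int = 10) -> List[Tuple[str, int]]:
    """Return (artifact, days late) for documents past the filing window."""
    window = timedelta(days=window_days)
    late = []
    for p in placeholders:
        due = p.finalized_on + window
        if p.filed_on is None and as_of > due:
            late.append((p.artifact, (as_of - due).days))        # still unfiled
        elif p.filed_on is not None and p.filed_on > due:
            late.append((p.artifact, (p.filed_on - due).days))   # filed late
    return late
```

Run daily, a report like this surfaces “nobody knows the document is late” before an inspection does; the hard part in practice is keeping `finalized_on` accurate when documents live outside the eTMF.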

Placeholders and other features, however, fail to address a huge contributor to lack of contemporaneousness: the shadow repository. Clinical trial stakeholders rarely fail to produce or finalize documents, but they are often late in uploading them to the eTMF. This raises the question: where are documents stored between creation and filing? We all know the answer: the true “working TMF” lives in disparate desktop folders, SharePoint systems, shared drives, and email inboxes. But why do we choose these repositories over expensive, feature-rich, industry-tailored eTMF systems? Because these unofficial systems are familiar, easy to use, flexible, and already integrated into our business processes; they are the systems in which we choose to create the documents in the first place. Obviously, a TMF must balance the needs of its users with regulatory requirements. Still, a new balance must be struck to reconcile the ideal TMF with the real world. Until the eTMF gains ground on these shadow repositories, contemporaneousness will continue to be a leading threat to TMF inspection readiness.

Modern Tools, Antiquated Processes

We must face the facts: many of these fancy, expensive eTMF systems are simply a 40-year-old paper TMF emulated in a digital environment. Your eTMF may be a modern tool, but your eTMF processes have changed little over decades. Much of this stagnation can be attributed to the need to meet regulatory requirements, which were designed around paper documents. New regulation saddled with old ideas is a major reason why, despite rapid advances in technology, the time-to-market of promising drugs continues to be over a decade.4 For the eTMF to enter the modern era, regulation must lead, not follow, and eTMF leaders must be given the freedom to accept regulatory risk while pursuing process improvement.

We can’t fully blame the regulations; there is much latitude for improvement within the current paradigm. eTMF systems are antagonistic, QC processes are punitive, and eTMF users are human and imperfect. Much discussion around the TMF focuses on what should be rather than what is. Our current systems can be improved by focusing on basic processes: creating user interfaces that increase transparency, ensuring our TMF tools work alongside the other electronic tools we use every day, and making it easy to correct mistakes. Yes, there are barriers to short-term change. Yes, we humans are ultimately the masters of the TMF, but we can’t take all the blame; we have been trying for years to improve. If we are unhappy with the eTMF processes we have created, perhaps it’s time for us to change them.

References:

  1. http://quotes.deming.org/authors/W._Edwards_Deming/quote/10091
  2. https://www.dorricottmpi.com/category/errors/human-factor-analysis/
  3. http://www.cbinet.com/sites/default/files/files/Snee_Ron_bonus%20material%202.pdf
  4. https://www.sciencedirect.com/science/article/pii/S2452302X1600036X

About The Author:

Thomas Cocciardi is a technical writer at LMK Clinical Research Consulting who is committed to expressing the human story behind each trial master file (TMF). In addition to technical writing, he also works as a regulatory writer specializing in FDA CTP submissions. Cocciardi has gained wide-ranging experience, both in and out of the pharmaceutical industry, working as a TMF-focused clinical trial associate, TMF health specialist, clinical research coordinator, preclinical data coordinator, intern medical writer, and intern brand writer. He holds an M.S. in regulatory affairs for drugs, biologics, and medical devices from Northeastern University.