In March 2019 I had the opportunity to interview Jennifer Newman, Global Project Leader, Regulatory Affairs/Clinical Operations for Celldex Therapeutics. Newman was once part of the largest implementation of risk-based monitoring (RBM) and was able to share insights from her experience. Specifically, she was able to discuss the benefits and challenges of RBM and what companies should be prepared for when adopting the technology. This Q&A highlights some of Newman’s comments. You can view the entire interview here.
This is part two of a two-part article. Part one can be viewed here.
Ed Miseta: Implementing a new technology solution is always a challenge, and one of the first challenges you’re going to face is getting internal approval. How do you go about doing that, and are there specific metrics or data you should focus on?
Jennifer Newman: Well, I think the biggest argument is that quality matters, regardless of whether it results in large cost savings. In fact, you may be spending more to implement a quality approach upfront. But, at the end of the day, if your data isn’t what it should be, you’re going to pay for that. And you may have to pay for it at a time you don’t want to.
I have heard a lot of cost arguments. In the large study that I worked on, there were some significant cost savings that were promised. Those did not come to fruition. We didn’t shave 50 percent off the cost of monitoring. I think realistically, you can reduce the cost of on-site monitoring using central monitoring and a risk-based approach. But I think it’s more on the order of ten percent, and that may be more or less than what you’ve paid to implement it. The argument really is we need to comply with this regulation if we want to submit this data for a claim. It must hold up, and if we don’t have confidence in the quality, then we really can’t stand behind it.
If the organization is driven by time, then I would make that one of your arguments. If you look at your data on a routine basis, and can stay on top of it, then you will be able to lock your database much faster. That is because you will have a system in place that allows you to stay on top of things. Additionally, you no longer have to worry about the things that are not impacting the overall quality of the study.
RBM allows you to deemphasize the pieces that are meaningless to the overall scheme of the study, in favor of understanding and having confidence in your primary endpoints. You will also have greater confidence in the overall quality of the data. That’s one argument for speed. Some organizations are very quality-focused, and I think those are the kinds of organizations where implementing something like this is probably easiest.
The cost argument is probably a bit weaker. But I think one benefit of having a more centralized approach is that you can share that data a little bit faster and have a bit more confidence in what it is, in real time. I think companies, especially small companies, are very interested in getting data as early as possible and as quickly as possible.
Miseta: Other than getting the funding for it, are there other challenges that companies will generally face in implementing this solution?
Newman: Absolutely, and I think it’s interesting. I think that you have just as many challenges on your team as you do convincing people to implement this and pay for it. What I mean by that is most of us have the background with 100 percent source data verification.
When we suddenly say we’re not going to do things the way we’ve always done them, that we’re going to take a totally different approach, what you get is resistance. You may tell everyone that you are going to source verify 100 percent of the critical data points and maybe 20 percent of the less important ones. Still, when you look at your metrics, you find people are still doing 100 percent source verification. It can be very hard for them to let go. I sympathize with that, because if you’re monitoring and you’re going on site, your job is to make sure that this data is what the site says it is.
It can be very hard to say that you are going to look at these fields and not look at other fields. I think you will find that some CRAs will still go out and do 100 percent source data verification, even when it’s not prescribed. It’s just a process. It’s changing hearts and minds. It’s making sure that the value proposition of implementing a change like this is clear to all team members above you and below you. Everybody needs to be on board, and that means everybody needs to champion the idea. Make sure everyone in the organization understands this is not voluntary. It’s mandatory. It is something we all must do.
But then, really make the case for why you are doing it. Start by dipping your toe in the water and allowing everyone in the organization to see the benefits of the change. From there you have a much easier time.
Miseta: What insights do you have about protocol design and how to include RBM in the concept stage?
Newman: Yes, absolutely. I started out by talking about terminology, and I think quality by design is very important. There is some resistance in organizations to adopting it, because it slows things down. It does so for a reason. You want to make sure that your teams are thinking about the protocol and about how to make sure your systems are appropriate.
There is another thing that should be mentioned: collecting information you don’t need may seem harmless, but any time you do that, you’re directing resources away from things that matter. Protocol design and effective monitoring plans are the transition from quality by design to risk-based monitoring. Specify what your monitoring plan is. If your monitoring plan is driven by a system, it’s driven by key risk indicators that you’ve proactively defined at the beginning of the study. Then you have set up your risk-based monitoring right from the start.
From there, it should be pretty much on autopilot. You’re going to be checking in and making sure that things are going according to plan. Also don’t forget that no system is going to be able to effectively capture every risk. There are going to be things that come out of the blue and surprise you.
I’m sure we all have war stories, where things were unexpected. I think it ties together linearly. You set out to have a quality protocol, quality data collection tools, thoughtful plans to capture and monitor and report, and then you follow up with a very disciplined review of your data on an ongoing basis, and you tweak as you need to.
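To make the idea of system-driven key risk indicators concrete, here is a purely illustrative sketch; all metric names, threshold values, and site data below are hypothetical, not from any real study, and a production RBM platform would be far more sophisticated. It shows the basic pattern Newman describes: thresholds defined up front at study start, then routine checks that flag sites drifting past them.

```python
# Hypothetical key risk indicator (KRI) check. Every name and limit here is
# invented for illustration; real KRIs come from the study's monitoring plan.

# Thresholds agreed during protocol design (illustrative values)
KRI_LIMITS = {
    "query_rate_per_patient": 5.0,   # open data queries per enrolled patient
    "screen_failure_rate": 0.40,     # fraction of screened patients who fail
    "days_to_data_entry": 7.0,       # median lag from visit to eCRF entry
}

def flag_site(metrics):
    """Return (kri, value, limit) tuples for every KRI a site has breached."""
    return [
        (kri, value, KRI_LIMITS[kri])
        for kri, value in metrics.items()
        if kri in KRI_LIMITS and value > KRI_LIMITS[kri]
    ]

# Fabricated example data for two sites
sites = {
    "site_101": {"query_rate_per_patient": 2.1,
                 "screen_failure_rate": 0.25,
                 "days_to_data_entry": 3.0},
    "site_102": {"query_rate_per_patient": 6.8,
                 "screen_failure_rate": 0.55,
                 "days_to_data_entry": 4.0},
}

for site_id, metrics in sites.items():
    for kri, value, limit in flag_site(metrics):
        print(f"{site_id}: {kri} = {value} exceeds limit {limit}")
```

In this toy run only site_102 would be flagged, which is the point of the approach: central review concentrates on-site monitoring effort where the indicators say the risk is, rather than verifying everything everywhere.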
Ed Miseta: If you have a CRO who is handling your RBM activities, are there additional issues companies must deal with?
Jennifer Newman: That’s a great question. I know there are CROs that have RBM solutions, and I would just caution anybody who is using a CRO as well as their own RBM solution to be aware of vendor oversight. If you are using your CRO’s RBM solution, what is your role in oversight of that CRO?
You can do it two ways. You can either be very involved with the CRO and take part in the discussions about risk and key risk indicators that go on in the study. Or, you could bring in another vendor to oversee your CRO. Then I think the argument becomes a little tricky, because why pay two vendors when you only need to pay one?
From a quality perspective, it makes sense. If you are going to have a CRO and you are going to outsource that piece, you either need to remain heavily involved or bring in another vendor. There are independent vendors that will do that.
Ed Miseta: What was your biggest lesson learned in implementing RBM? If you could go back and do it all over again, is there anything you would have done differently, knowing what you know now?
Jennifer Newman: You’re always going to be surprised by something. People will go in the direction that you tell them to. If you say it’s important to get first patient in on August 1, people will move mountains to make sure that you get first patient in on August 1. If you tell them it’s important to make sure that there is no patient enrolled in the study that violated inclusion or exclusion criteria, that will be their focus. People will move mountains to accomplish those things they know the organization is paying attention to.
In any management structure, people don’t want to be dinged for the things that they have been told are important. As a leader, you really can set the tone. You must set your team in a direction where they’re focusing on what you have all agreed is important. When you do so, that is what you will get.
I do have an anecdote related to this. When I was involved in the large-scale system implementation I noted earlier, one of the goals was to try to identify sites that might potentially be at risk for a regulatory inspection. The thinking was that the regulatory bodies were using algorithms to try to pull out data flags, and we wanted to focus on those sites to make sure that they were inspection ready.
As a result, we spent a lot of time on those high-risk sites. It turns out none of them were inspected. That doesn’t mean it wasn’t a worthwhile exercise. It was. But one of the things that happened during the study was we did have a monitor on site doing very typical source document review.
That monitor had access to patients’ medical information. She was at one site one day and at another site the next, working on a different study. Lo and behold, she remembered seeing the same patient name in the other study. Basically, she uncovered a patient who had enrolled in multiple studies at the same time.
For me, that was a huge lesson learned, because there’s no system that you could put in place to capture something like that. As sophisticated as these things can be and as helpful as they can be, again, you still cannot replace the value of people being on site, your CRAs establishing relationships with your investigators and paying attention to the data they are reviewing. You can’t just walk away from those things. You absolutely need to take a holistic approach. That was one