Podcast

Why Use RBM And How Far Have We Come With Implementation?

In March 2019 I had the opportunity to interview Jennifer Newman, Global Project Leader, Regulatory Affairs/Clinical Operations for Celldex Therapeutics. Newman was once part of the largest implementation of RBM and was able to share insights from her experience. Specifically, she discussed the benefits and challenges of RBM and what companies should be prepared for when adopting the approach. In this video, Newman discusses the history of RBM, how regulatory guidance has changed, and where the practice stands today.

Click here to see the complete interview.

Transcript

Ed Miseta: Let’s start off with a short history of risk-based monitoring. There have been a lot of changes in technology and on the regulatory side. Can you briefly go over that and let us know where we stand today?

Jennifer Newman: Sure. I know that a lot of people in this peer group have just about as much experience as I have, maybe more. But I think the majority of us have all lived through the evolution of electronic data capture.

When we think about technology, we hope that when we implement new systems like that, we’re doing it in a way that delivers a benefit. Prior to electronic data capture being put in place, people would actually do double data entry and triple data entry from the medical records at sites into the databases. The process was very time consuming.

EDC gave sites a way to enter their data directly and gave the sponsor access to trial information right away. This had everybody very excited about how quickly we could access the data.

But sometimes technology is a double-edged sword, and I think that also created a big emphasis on 100 percent source data verification. We had speedy access to our data, so let’s make sure that every box is checked and everything looks exactly as it should.

As it turns out, what the data shows is that the data we have in our systems is largely accurate as it’s entered. If there is some correcting going on, it’s probably on the order of three or four percent of the data. With all this effort going into source data verification, it became apparent that it’s awfully costly, time consuming, and rate limiting. Perhaps there’s a better way.

A couple of things then happened in parallel. One was a major revision to the GCP guidance, and there were three areas where that guidance shifted. It took emphasis away from being process- and document-based and emphasized that the goal of good clinical practice is to ensure data quality and the protection of human subjects.

That really was a very clear, fundamental change in the most recent ICH revision. The second change was that it introduced the concept of risk-based monitoring, with an emphasis on relying on the technology. We have access to the data, so let’s look at the data in a centralized way rather than worrying about every single data point, and let’s stop treating those data points equally, because they’re not equal.

Third, and probably the most important change in the revision, was CRO oversight. With that comes an emphasis on making sure sponsors maintain quality oversight; that is something they cannot outsource. I think for small companies this is a very interesting point, because we do tend to outsource quite a bit.

That is what has gotten us to where we are today. I have a lot of conversations with people, and I have found they are struggling with RBM, how to implement it, and where it fits into the big picture.