From The Editor | September 19, 2017

How To Improve CRO Oversight With Data & Analytics

By Ed Miseta, Chief Editor, Clinical Leader

Karen Brooks holds the title of senior director of clinical operations at Adare Pharmaceuticals, but the title may be misleading. With Adare being a growing specialty pharmaceutical company, Brooks is currently the entire clinical operations department. That means she wears many different hats, including operations, monitoring, management, recruitment, and retention. At the inaugural Clinical Leader Forum in Philadelphia, Brooks discussed her role in overseeing CROs.

Brooks has worked for CROs and sites, and also spent time in Big Pharma before entering the specialty pharma world. Very early on, it became clear to Brooks that she would be a clinical “N of 1” at Adare. During the interview process, she asked the company CEO about her clinical operations team. His response was, “That would be you.” Everything from data management and biostatistics to vendor oversight would be her sole responsibility.

The challenge did not sway Brooks, but she knew that as the sole clinical operations employee, she would have to run a streamlined operation. “At a smaller company you have to be streamlined because you don’t have time to go through 30 or 40 SOPs,” she says. “You don’t have time to go into multiple systems. In an attempt to understand my new job, I reached out to five former colleagues who were working at small companies. I found the CRO oversight approach they all employed was remarkably similar.  Everyone stressed that in a small company, it is important to make sure all stakeholders are playing from the same sheet of music.”

Define Your KPIs

First, Brooks recommends identifying and ranking the key areas of importance, noting that sponsors and CROs will not always prioritize everything the same way. Therefore, once you decide what is most important to you, discuss these expectations with the CRO and align on a mutually agreeable path forward.

This process entails defining the key performance indicators (KPIs) that are critical to success. That should be followed by a discussion of how both parties will share accountability for meeting these specific KPIs. “The healthcare industry is evolving rapidly. We need to ensure we’re keeping up with this evolution and exploring ways to partner more effectively with CROs,” states Brooks. “By entering into contracts that are more closely linked in value to the achievement of KPIs – sharing the risk, delivering on the goals of each organization – both parties are equally accountable for driving results.”

The next step is oversight. Brooks relies heavily on the CRO’s data analytics, which means trusting those systems and knowing they will meet her needs.

“As an ‘N of 1,’ I am overseeing several trials at once, and I have many expectations placed upon me,” says Brooks. “I will tell my CROs the analytics I get from them have to be brief, accurate, and timely. I always emphasize the ‘brief’ part of that. A solid partnership means knowing that my CROs can prioritize the information I need to act on.”

Next, Brooks stresses the timeliness aspect. Good analytics help you make effective decisions downstream. In a small company, those decisions have to be made quickly, since delays can be costly to the company.

Set Expectations And Hold CROs Accountable

Since accurate reporting leads to timely decision making, Brooks stresses the importance of setting your data analytics expectations. She uses enrollment as an example. We all know studies often fail to meet enrollment goals. But simply telling her the enrollment goals have not been met will not help Brooks make necessary decisions.

If enrollment is falling short of her goal, Brooks wants to know why. What sites are meeting their goals? Which ones aren’t? How many sites are actually up and running? “If I was told there would be five sites activated this month, I want five sites activated this month,” she says. “This is one of the areas where an ongoing partnership is crucial. Because many of these milestones are incrementally linked to the value of the relationship, I charge my CRO with helping me determine how we will stay on track and course-correct as needed.”
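The kind of enrollment check Brooks describes can be sketched in a few lines of code. The site names, counts, and targets below are hypothetical, used only to illustrate comparing planned versus actual activations and flagging sites behind their enrollment goals:

```python
# Illustrative sketch: compare planned vs. actual site activations
# and per-site enrollment against target. All figures are hypothetical.

planned_activations = 5          # sites promised this month
actual_activations = 3           # sites actually up and running

# Hypothetical per-site enrollment: (enrolled, target)
enrollment = {
    "Site A": (12, 10),
    "Site B": (4, 10),
    "Site C": (9, 10),
}

def activation_gap(planned: int, actual: int) -> int:
    """Sites still owed for the month; positive means a shortfall."""
    return planned - actual

def sites_behind(enrollment: dict) -> list:
    """Sites enrolling below target, sorted by largest shortfall first."""
    behind = [(site, target - enrolled)
              for site, (enrolled, target) in enrollment.items()
              if enrolled < target]
    return sorted(behind, key=lambda x: -x[1])

print(f"Activation shortfall: {activation_gap(planned_activations, actual_activations)}")
for site, gap in sites_behind(enrollment):
    print(f"{site} is {gap} patients behind target")
```

A report built this way answers the questions Brooks poses directly: which sites are meeting their goals, which are not, and by how much.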

Brooks also stresses her expectations in regard to the accuracy of the data. To make accurate decisions, you need accurate data. Or, as Brooks puts it, “All of the planning doesn’t mean anything if the quality of your data is poor.”

Although not technically part of her oversight process, Brooks does look at the knowledge level of staff members at her CRO. This will generally come up during audits or due diligence. Brooks notes she will actually quiz CRAs about the trial protocol. “I think that is a very important aspect of CRA performance. The CRO might have an experienced CRA with a great CV, but what is most important to me is that they understand the protocol and can deliver on it.”

What, Exactly, Should You Review?

When reviewing the metrics she receives, one of the first things Brooks focuses on is quality. To determine if a CRO is delivering on quality, Brooks likes to look at protocol deviation reports. She also reviews action items, which show her issues that have remained open for a long period of time. “Action items will often mean the site or CRO is not resolving things,” she says. “Things should not remain open for a long period of time. I also look at data entry timelines, or how long it takes for a site to get patient data entered into the system. That task should always be performed in a timely manner.”

Query reports are another concern, and Brooks will often look at the number of queries per site. A large number of queries could mean the site does not fully understand the study protocol. It could also indicate that the instruction the CRAs received was in some way inadequate.
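These two checks, high query counts and long-open action items, lend themselves to a simple automated flag. The thresholds, sites, and dates below are hypothetical assumptions, not figures from the article:

```python
# Illustrative sketch: flag sites with unusually high query counts
# and action items open too long. Counts and limits are hypothetical.

from datetime import date

MAX_QUERIES = 25          # assumed per-site query threshold
MAX_OPEN_DAYS = 30        # assumed limit for open action items

queries_per_site = {"Site A": 12, "Site B": 41, "Site C": 7}

# Hypothetical open action items: (site, date opened)
action_items = [
    ("Site B", date(2017, 6, 1)),
    ("Site C", date(2017, 9, 10)),
]

def high_query_sites(queries: dict, limit: int) -> list:
    """Sites whose query count exceeds the limit."""
    return [site for site, n in queries.items() if n > limit]

def stale_items(items: list, today: date, limit_days: int) -> list:
    """Action items open longer than the allowed number of days, with their age."""
    return [(site, (today - opened).days)
            for site, opened in items
            if (today - opened).days > limit_days]

print("High-query sites:", high_query_sites(queries_per_site, MAX_QUERIES))
print("Stale action items:", stale_items(action_items, date(2017, 9, 19), MAX_OPEN_DAYS))
```

The point is not the thresholds themselves, which any sponsor would tune, but that both signals can be surfaced in a brief report rather than buried in raw listings.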

Brooks also pays close attention to patient recruitment and will review recruitment timelines, screen failure rates, and subject withdrawals. Having worked as a study coordinator, Brooks believes a high screen failure rate warrants further investigation into site training on patient identification or the protocol's inclusion/exclusion criteria. A high number of patients withdrawing from a study might point to a problem with study design, site-level execution, or patient fatigue within the trial.
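The recruitment metrics Brooks reviews reduce to simple per-site ratios. As a sketch, with hypothetical counts and flag thresholds (none of these figures come from the article), screen failure and withdrawal rates might be computed and flagged like this:

```python
# Illustrative sketch: per-site screen failure and withdrawal rates.
# Thresholds and counts are hypothetical, not from the article.

def screen_failure_rate(screened: int, screen_failures: int) -> float:
    """Fraction of screened patients who failed screening."""
    return screen_failures / screened if screened else 0.0

def withdrawal_rate(enrolled: int, withdrawn: int) -> float:
    """Fraction of enrolled patients who withdrew."""
    return withdrawn / enrolled if enrolled else 0.0

# Hypothetical site data: (screened, screen failures, enrolled, withdrawn)
sites = {
    "Site A": (40, 8, 32, 2),
    "Site B": (30, 18, 12, 5),
}

SFR_THRESHOLD = 0.40   # assumed flag level for screen failures
WDR_THRESHOLD = 0.25   # assumed flag level for withdrawals

for name, (screened, failed, enrolled, withdrawn) in sites.items():
    sfr = screen_failure_rate(screened, failed)
    wdr = withdrawal_rate(enrolled, withdrawn)
    flags = []
    if sfr > SFR_THRESHOLD:
        flags.append("check site training on inclusion/exclusion criteria")
    if wdr > WDR_THRESHOLD:
        flags.append("look into study design, site execution, or patient fatigue")
    print(f"{name}: screen-failure {sfr:.0%}, withdrawal {wdr:.0%}"
          + (f" -> {'; '.join(flags)}" if flags else ""))
```

Tying each flag to a suggested follow-up mirrors Brooks's point: a metric is only useful if it tells her where to investigate next.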

Finally, Brooks believes in real-time feedback. There are times when protocol amendments are necessary, or milestone dates have to be changed. When that becomes necessary, she puts the changes in writing. If the CRO does not agree, it can be immediately discussed.

“Like anything in life, if you give someone immediate feedback, they are able to redirect their path,” she adds. “There is no point in waiting and letting them know about it later. For me, it’s just like a performance review. Nobody wants to get to the end of their review period and then find out they were doing something wrong. It is better to let them know how they are performing every step of the way. It should be a two-way partnership, and there should not be any surprises.”