From The Editor | August 20, 2015

Increased Access And Use Will Maximize The Value Of Data

By Ed Miseta, Chief Editor, Clinical Leader

The collection and use of data in clinical trials have changed dramatically over the years. Just 15 or 20 years ago, researchers were generating volumes of valuable data in both the lab and the clinic. Yet the isolation between those two worlds prevented optimal value creation (getting therapies to patients faster). The result was frustration among researchers and a sense of missed opportunities.

“We had researchers focused on their specialty areas and doing great work,” says Peter Covitz, Senior Director of Information Strategy & Analytics for Pfizer. “But for anyone who was thinking more broadly about the sharing of this data across the enterprise, or even between public and private entities, there was a sense we were not fully leveraging the data we had.”

For R&D to provide value to the industry, three things have to happen. First, data has to be generated. Pharma has always been good at this, and the more data the industry has, the more likely cures are to be developed. Covitz notes that a great deal of work has been done in both industry and academia to generate more data, and ample attention has been paid to the task. While this has been a never-ending challenge, he notes the bigger issue has been prioritization: deciding which data should be generated.

The second thing that has to happen is access. “This is the real challenge for many researchers,” says Covitz. “If the data you need exists, but you can't get to it because it was generated by another department, or another company altogether, then obviously you are not going to be able to use that data to generate value.”

The last step is data use itself. Someone might have access to the right data, but without the right skills and the right cross-functional team structure, they still will not be able to generate value from it. Scientific, clinical, and analytic capabilities are all essential.

“Data generation, access, and use are the three big elements that have to come together to really accelerate product development and drive value for patients,” states Covitz. “The challenges we faced 20 years ago resulted from the increasingly high volumes of data being generated, which made data access difficult. The need for cross-functional teams to facilitate data use was only vaguely being addressed, and rather unevenly at that.”

Progress Is Being Made

What has changed over the last 20 years is the amount of progress made in tackling the problems associated with access and use. While the issues have not been solved completely, a number of advancements have enabled noticeable changes. Covitz notes technology has certainly improved the access challenge, making it easier and more straightforward to both share and manage data. We are also starting to see some benefits from data standardization efforts. Although still a work in progress, where standards have taken hold, barriers to access have come down, including better access to data in a form that can actually be used.

In terms of the skillsets needed to make proper use of the data, a number of academic training programs are now in place to populate the labor market with employees who have a more diverse set of skills. While all the necessary skills may not exist in one person, individuals now have access to training programs that did not exist twenty years ago.

“Individuals need to have enough training to be conversant in the technical areas of others, but also be a specialist in their own area of expertise,” notes Covitz. “That means we need more people with mathematical and quantitative skills, but also biomedical training and a solid understanding of the life science industry. We are getting better, but still have a ways to go.”

Training in colleges and universities has been stepped up at the undergraduate level, which is good news for the industry. The academic community has also done a good job of creating master's-level programs that give students more focused training in informatics, data analysis, and life science awareness.

“Someone today could have a bachelor’s degree in biology and then get trained in informatics,” says Covitz. “These informatics programs did not exist twenty years ago. Someone with those skills is incredibly valuable to this industry. More traditional quantitative disciplines, such as biostatistics, are dynamic and evolving. I think the methodologies that have come out of them are being increasingly adapted to a number of different scenarios. As a result, life science challenges are being looked at more and more as attractive problems for people who want to conquer this new frontier.”

Covitz mentions algorithm development as another dynamic area that is attracting more individuals with heavy quantitative skills. With openings for these workers outstripping the available supply, individuals holding these skills are in high demand.

Technology Improvements And Standardization Are Vital

Improvements in technology will continue to be vital to accessing and getting value from data. Covitz believes the approach that seems to be bearing fruit is one in which customers adopting new technologies insist on a usable programming interface to the data. The classic closed-system, vendor-lock-in model is not gone, but it is increasingly being discredited as a viable approach to delivering technology to this sector.

“Historically, many companies have been proud of their lock-in model, and would, in fact, tout it in their financial statements,” he states. “Their approach was not covert – it was out in the open for everyone to see and was an integral part of their business model. For many, it was accepted as the way the industry worked. However, I think we are starting to see cracks in the armor. That approach is increasingly untenable in an era when scientific information as well as patient health information needs to flow more freely. New products coming out are certainly more open than older legacy systems, and other companies are finally opening portals into their longstanding products.”
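To make the contrast concrete: a "usable programming interface to the data" means any team can pull study information in a documented, machine-readable format rather than through a proprietary client. The sketch below is purely illustrative; the field names and payload are assumptions, not any vendor's actual API.

```python
import json

# Hypothetical JSON payload of the kind an open, documented API might
# return for a study record. All field names here are illustrative
# assumptions, not a real vendor's schema.
response_body = """
{
  "study_id": "ST-001",
  "sites": 12,
  "subjects_enrolled": 240,
  "status": "completed"
}
"""

def summarize_study(raw_json):
    """Parse an API response and extract the fields a downstream
    analytics team might need, using only standard tooling rather
    than a vendor-specific client."""
    record = json.loads(raw_json)
    return {
        "study": record["study_id"],
        "enrolled": record["subjects_enrolled"],
        "done": record["status"] == "completed",
    }

print(summarize_study(response_body))
```

In a locked-in system, the same information would be reachable only through the vendor's own application, which is exactly the barrier to cross-enterprise data use the article describes.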

Covitz notes that the persistence and tenacity of the people who have been working collaboratively to develop industry standards are also beginning to pay off. One example is the reporting of clinical trial data to the FDA. Companies that report to the FDA will have a real incentive to adopt internal standards as they conduct their clinical trials. “You start to see standards awareness creep into the organizations conducting the R&D. There is a ripple effect, and they start to see that value in other ways and other places where data exchange occurs.”

In some cases the problem wasn't the lack of a standard; it was the existence of too many overlapping standards. In those instances, the industry will have to decide which ones to coalesce around.

Sharing Data Benefits All

Another hot-button issue in the clinical space is the re-use of study data. Covitz believes this will continue to be a topic of conversation. Part of the frustration of 20 years ago stemmed from organizations generating huge volumes of data. Clinical trial data would be used for its intended purpose, perhaps resulting in a regulatory submission, and then filed away somewhere.

“Regardless of where the data was stored, it was difficult to access by someone who had a research question but was not a part of the original study team,” says Covitz. “There was neither the technology foundation nor the organizational foundation to make trial data available and reusable. There has been a tremendous amount of progress on both fronts in the academic sphere. And, with the more recent data transparency initiatives, companies are now starting to share some of their clinical trial data with external investigators as well. In the past, that never happened, and I am hopeful this will make clinical trial data available to a wider group of scientists.”

One issue that may still pose problems going forward is privacy. Covitz concedes it is not only problematic but probably the single largest barrier to data sharing. Intellectual property protection is also an issue, but not as much of a concern.

“It's not going away,” adds Covitz. “The heterogeneity and the inconsistency of privacy laws and regulations make companies cautious. I think that is still fertile ground for regulatory improvement and consistency. Privacy protection is a legitimate concern, but there has to be a way for us to safely share data. Some of the most valuable data sets are from multi-site trials performed across different countries. That means subjects are from different jurisdictions where the laws are different. A lot of overhead gets added to a privacy protection program before you can release the data.”
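Part of the "overhead" Covitz describes is de-identification: direct identifiers must be stripped from subject records before data can leave the original study team. The sketch below is a minimal illustration under assumed field names and rules; it does not represent any specific jurisdiction's requirements.

```python
# Illustrative set of direct identifiers to remove before release.
# The field names and the rule itself are assumptions for this sketch,
# not any regulation's actual definition of identifying data.
DIRECT_IDENTIFIERS = {"name", "address", "phone"}

def deidentify(record):
    """Return a copy of a subject record with direct identifiers
    removed, leaving the subject keyed only by a study-assigned code."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

subject = {
    "subject_code": "S-0042",
    "name": "Jane Doe",
    "address": "123 Main St",
    "phone": "555-0100",
    "age": 54,
    "arm": "treatment",
}

print(deidentify(subject))
```

In a real multi-site, multi-country trial, this step would have to satisfy every jurisdiction involved, which is why the inconsistency of privacy laws adds so much cost before data can be shared.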