Magazine Article | October 2, 2014

Practical Case Studies In Data Analytics

Source: Life Science Leader

By George Brunner, CTO, Acumen Analytics

The following is part two of a three-part series depicting the personal journey of a fictional CEO as he develops a strategy, using the data his company generates and consumes, to shorten his drug development timeline and predict and prevent problems before they occur.

As you may recall from last month, Cabot Harrington, CEO of a midsize pharmaceutical company called Helioarc, decided to leverage his data more effectively. This month, Harrington went to visit some of his pharma colleagues to see specific examples of how data analytics were being used.

His first visit was to a pharma company similar in size to Helioarc, where he heard about their efforts to develop a modern data repository. This pharmaceutical firm began a project in 2012 with the goals of designing better clinical studies by learning from the past, getting earlier results into the hands of principal investigators, and accessing data more efficiently.

VIEW STUDIES ACROSS CROs
Their new data repository allowed them to view discovery and clinical studies across all their CROs, in addition to their own organization. When data streams in from external CROs, it is often incompatible. “Rarely,” said the CTO, “do CROs have the same applications as we do, and many are unaccustomed to delivering data in the form required by us.”
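
In practice, that means a normalization layer that maps each CRO’s deliverable onto the repository’s standard schema. The Python sketch below is illustrative only: the column names, units, and CRO formats are invented, and real clinical deliverables would more likely follow a standard such as CDISC SDTM.

```python
# A hypothetical normalization layer for incompatible CRO deliverables.
import pandas as pd

# Each CRO reports the "same" data under different names and units.
CRO_COLUMN_MAPS = {
    "cro_a": {"SubjID": "subject_id", "Cmpd": "compound", "Result_mg": "result_mg"},
    "cro_b": {"patient": "subject_id", "molecule": "compound", "result_g": "result_g"},
}

def normalize(df: pd.DataFrame, cro: str) -> pd.DataFrame:
    """Map one CRO's deliverable onto the repository's standard schema."""
    out = df.rename(columns=CRO_COLUMN_MAPS[cro])
    if "result_g" in out.columns:            # harmonize units: grams -> milligrams
        out["result_mg"] = out.pop("result_g") * 1000
    out["source_cro"] = cro                  # keep provenance for auditing
    return out[["subject_id", "compound", "result_mg", "source_cro"]]

# Two incompatible deliverables become one queryable table.
a = pd.DataFrame({"SubjID": [101], "Cmpd": ["HX-12"], "Result_mg": [5.0]})
b = pd.DataFrame({"patient": [202], "molecule": ["HX-12"], "result_g": [0.004]})
combined = pd.concat([normalize(a, "cro_a"), normalize(b, "cro_b")], ignore_index=True)
print(combined)
```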

VIEW STUDIES FROM THE PAST
Another focus of the project was to improve the pharma company’s efficiency on clinical studies by giving teams access to the rich historical, often forgotten, data from older studies. Companies miss opportunities here: the ability to re-examine past studies, searching by compound or compound type to find connections between them, can yield fresh insights.
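
A minimal sketch of that kind of compound-based search, using an in-memory SQLite table whose layout and contents are invented for illustration:

```python
# Hypothetical historical-study index, searchable by compound or compound class.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE study_results (
    study_id TEXT, compound TEXT, compound_class TEXT,
    endpoint TEXT, outcome TEXT, year INTEGER)""")
conn.executemany(
    "INSERT INTO study_results VALUES (?, ?, ?, ?, ?, ?)",
    [("ST-001", "HX-12", "kinase inhibitor", "toxicology", "clean", 2009),
     ("ST-007", "HX-19", "kinase inhibitor", "efficacy", "positive", 2011)])

# Before designing a new study, pull every prior study touching the same class.
for row in conn.execute(
        "SELECT study_id, compound, endpoint, outcome, year "
        "FROM study_results WHERE compound_class = ? ORDER BY year",
        ("kinase inhibitor",)):
    print(row)
```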

His colleagues described a scenario, part of a study done a few years earlier, in which extensive information existed on how one particular molecule had reacted and performed. That data proved invaluable to a researcher later considering the molecule for another application, because he did not need to begin from scratch.

They showed Harrington the solution that had been architected, which allowed cross-study analysis of subjects and compounds and compared older study results against current study goals. It was very visual and easy to navigate.

CHALLENGES OF DATA INTEGRATION
But getting to the current state wasn’t so easy. Harrington learned that the first few months of this data analytics project had been difficult. To set up the new data repository, data from four different systems — toxicology, hematology, pathology, and electronic lab notebooks — had to be accessed. The employees involved rarely shared systems and were now being asked to submit their data — current and future — to a central data repository.
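
Conceptually, the integration amounts to extracting per-subject records from each of the four systems and joining them on a shared key. A minimal sketch, assuming each legacy system can export such records (the field names here are hypothetical):

```python
# Hypothetical ETL step: merge exports from four legacy systems into one table.
import pandas as pd

# Stand-ins for exports from the four systems named above.
def extract_toxicology():
    return pd.DataFrame({"subject_id": [1, 2], "ld50_mg_kg": [320, 410]})

def extract_hematology():
    return pd.DataFrame({"subject_id": [1, 2], "wbc_k_ul": [6.1, 7.4]})

def extract_pathology():
    return pd.DataFrame({"subject_id": [1], "lesion": ["none"]})

def extract_eln():
    return pd.DataFrame({"subject_id": [2], "notebook": ["NB-42"]})

# Join the four feeds on a shared subject key; an outer join keeps
# subjects that appear in only some of the systems.
repo = extract_toxicology()
for feed in (extract_hematology(), extract_pathology(), extract_eln()):
    repo = repo.merge(feed, on="subject_id", how="outer")
print(repo)
```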

Harrington’s colleagues emphasized the importance of talking to every member of the team and including them in developing the data analytics project goals. One SVP said he knew the teams would get on board when he emphasized that a key goal of the project was that their work would “never be lost.” The concept of hard work being kept alive for future possibilities resonated strongly with his teams.

LEGACY SYSTEM PROVIDERS: NEUTRAL ENOUGH?
The pharmaceutical company executives also impressed upon Harrington the value of hiring an objective data analytics partner to help envision, and then implement, the new system. Legacy service providers, even those who claim expertise in life sciences, may not be able to evaluate the options neutrally, simply because they are tied to their own system solutions.

"If a study is exceeding expectations, that is critical information to have at the earliest stage possible, given today’s landscape of accelerated approval status, fast track, etc."

REAL-TIME ACCESS ON CLINICAL STUDIES
Another key part of the project was to gain real-time access to ongoing clinical studies. Why wait until a study is complete, or even for interim reports, if data can be accessed in real time and can indicate positive or negative outcomes early? If a study is exceeding expectations, that is critical information to have at the earliest stage possible, given today’s landscape of accelerated approval status, fast track, etc.

PUSH, NOT PULL
One other important goal for the project was to implement “push” technology. Data interactions between a pharma company and its CRO should be designed so that the pharma teams no longer have to “pull” the data from the CRO or wait for written reports. Data should be “pushed” from the CRO directly into the hands of the clinical research team, in real time, ready for viewing at any time.
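
One common way to implement push, offered here only as an assumption about how such a design might work, is a webhook: the CRO POSTs each new result to an endpoint the sponsor hosts, which writes it straight into the repository. This standard-library sketch invents the endpoint path and payload shape; a production system would add authentication and validation.

```python
# Minimal webhook receiver: the CRO pushes results as they are generated.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class CROWebhook(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/cro/results":       # hypothetical endpoint path
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        record = json.loads(self.rfile.read(length))
        # In practice: validate the payload, then write it into the data
        # repository so the clinical team can view it immediately.
        print("received from CRO:", record)
        self.send_response(202)               # accepted for processing
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), CROWebhook).serve_forever()
```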

It was clear that, after a year of effort, the company was able to derive actionable insights from this information:

  • Scientists designing new studies now have the ability to review information from older studies for possible re-use of data or insights and potentially predict future success more accurately.
  • Study results are accessible in real time; there is no longer a need to wait until the study is completed.
  • There is a single repository for standardized study data and integration for faster insight.
  • Data quality and availability have improved significantly.
  • Data is more easily and consistently integrated across multiple CROs.
  • The speed and quality of data received from CROs have improved.

Over the next few weeks, Harrington visited and spoke to other colleagues in the industry to find out more about how he should structure his Big Data initiative.

COMPLIANCE DATA: DIFFERENT DATA, SAME SOLUTION
One compliance-related story resonated strongly, because it solved one of Helioarc’s biggest problems — wasting time locating information for regulatory agencies such as the FDA, EPA, and USDA.

Harrington’s colleague shared that his company’s effort began when a chemical spill incident focused (negative) attention on the fact that the organization could not “put its hands on the information required” to quickly assess the spill situation. From this, the company began to review all compliance-driven inventories, data flows, controls, and processes. It ended up focusing on cross-site chemical inventory, because it had seen just how risky a lack of full visibility into this information could be.

The solution suggested by its IT partner, a data analytics company hired after the chemical spill, was to develop an integrated data repository designed for maximum analytical performance. As the CIO explained, that repository gave users a single access point and much greater accuracy when identifying a potential solution to a compliance-driven problem.

Various data sources fed into the chemicals inventory, including vendor data, chemical ordering databases, bidding documents, accounting and financial info, laboratory chemical process records, and R&D request forms. Harrington was beginning to realize that data could be viewed as an asset, and that data repositories were the new “online banking” for those assets.

The IT partner helped scrub the data, a time-consuming process, but achieving clean and concise data is critical. It meant interacting with each of the data “owners” over many months, including the procurement department, clinical discovery staff, and principal investigators, to clearly map the current process for all data tied to the chemicals’ life cycle.
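
The scrubbing itself is mundane but essential: trimming stray whitespace, collapsing synonyms to one canonical chemical name, and dropping duplicate entries. A small illustration, with invented synonyms and columns:

```python
# Hypothetical scrubbing pass over a raw cross-site chemical inventory.
import pandas as pd

# Map trade names and synonyms to one canonical chemical name (invented examples).
SYNONYMS = {"2-propanone": "acetone", "propan-2-one": "acetone"}

raw = pd.DataFrame({
    "site": ["A", "B", "B"],
    "chemical": ["ACETONE ", "2-propanone", "2-propanone"],
    "qty_liters": [10.0, 4.0, 4.0],
})

clean = raw.copy()
clean["chemical"] = (clean["chemical"].str.strip()    # drop stray whitespace
                     .str.lower()                     # one casing convention
                     .replace(SYNONYMS))              # collapse synonyms
clean = clean.drop_duplicates()                       # remove duplicate rows
print(clean.groupby("chemical")["qty_liters"].sum())  # cross-site inventory view
```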

Compliance in chemical tracking has improved. With a new single data repository, plus processes in place to input more accurate data, reporting and analysis have been simplified and compliance procedures improved. And there is now a “single version of truth” for the chemicals inventory data.