Magazine Article | April 2, 2013

A Successful Externalization Strategy Demands Modernized Informatics

Source: Life Science Leader

By Matt Hahn, Ph.D., senior VP and CTO, Accelrys

In an effort to cut costs, take advantage of specialized expertise, or conduct key stages of drug development closer to emerging markets, pharma companies are outsourcing more and more activities to an extensive array of contractors. The resulting web of partners adds complexity that opens the door to errors, rework, product delays, and compliance issues.

As externalization impacts more tasks across the drug development value chain, will the benefits gained continue to outweigh the risks? Maximizing the opportunities presented by the practice while also maintaining high levels of product innovation, quality, and safety is closely linked to smart management of a project’s most valuable commodity: its data. This demands a modern, flexible, and holistic approach to the capture, control, and sharing of information that drives innovation. Here are three objectives that should be on your company’s data management short list:

1. Integrated informatics. Bringing a new drug to market is a complex undertaking, more so when several contract partners and collaborators are involved. It’s important that the flow of information across the entire “scientific innovation life cycle” (from lead discovery and early-stage research at the front end, all the way through safety testing, QA/QC, and production scale-up) be well-coordinated, efficient, and also closely linked with the systems and stakeholders responsible for later-stage manufacturing and distribution. The problem is that, all too often, the data technologies and process management procedures used by various stakeholders (from business execs, to CRO and CMO partners, to individual scientists, engineers, lab technicians, and other experts) are disjointed and disconnected — separated by system, organizational, disciplinary, and geographic boundaries. This reality leads to information visibility “gaps” that can cause product development delays, invite errors, and impede collaboration. To close these gaps, an informatics approach that prioritizes the integration of data and processes across the end-to-end scientific innovation life cycle (and beyond) is critical. It is no longer acceptable to allow needed information to remain hidden away in “silos.”

2. Consistent data capture. In an environment where a single incorrectly reported balance measurement can result in a compliance violation or production shutdown, the adage “garbage in/garbage out” is apt. Companies must ensure that all project data is captured in a consistent, transparent, and traceable manner, regardless of whether it was generated by an in-house scientist, a CRO chemist, or a processing engineer working for a manufacturing partner. Weeding out paper-based and manual data entry practices is an essential first step. When possible, data should be captured automatically and electronically, directly from the lab instruments and other equipment used to conduct research. Maintaining standards for naming and tracking intellectual property is also important. This means registering and assigning unique identifiers to every molecule, chemical ingredient, formulation, cell line, and so on, so that information relevant to project progress, regulatory compliance, patent filing, and more can be found quickly and easily, wherever it is located.
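To make the registration idea concrete, here is a minimal, purely illustrative sketch of a substance registry that assigns one unique identifier per entity, no matter how many partners submit it. The class name, ID format, and canonical keys are assumptions for illustration; real registration systems sit on corporate databases and use structure-based duplicate checks.

```python
# Illustrative sketch only: a registry that mints one unique ID per entity
# (molecule, formulation, cell line, ...) so data stays traceable across
# in-house teams and external partners. Names and ID format are hypothetical.
import itertools


class SubstanceRegistry:
    """Assigns and remembers a unique identifier for each registered entity."""

    def __init__(self, prefix="SUB"):
        self._prefix = prefix
        self._counter = itertools.count(1)
        self._by_key = {}    # canonical key -> assigned ID
        self._records = {}   # assigned ID -> metadata

    def register(self, canonical_key, **metadata):
        """Return the existing ID for a known entity, or mint a new one."""
        if canonical_key in self._by_key:
            # Duplicate submission (e.g. from a second partner): reuse the ID.
            return self._by_key[canonical_key]
        new_id = f"{self._prefix}-{next(self._counter):06d}"
        self._by_key[canonical_key] = new_id
        self._records[new_id] = dict(metadata, key=canonical_key)
        return new_id

    def lookup(self, substance_id):
        """Fetch the metadata recorded for an identifier, or None."""
        return self._records.get(substance_id)


# The same compound registered by a CRO and in-house gets the same ID:
registry = SubstanceRegistry()
id_cro = registry.register("aspirin|CC(=O)OC1=CC=CC=C1C(=O)O", source="CRO-A")
id_internal = registry.register("aspirin|CC(=O)OC1=CC=CC=C1C(=O)O", source="in-house")
assert id_cro == id_internal
```

The design point is simply that the canonical key (here a name plus structure string) is resolved *before* an ID is minted, which is what prevents the same molecule from living under two names in two partners’ systems.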

3. Simple, affordable, and secure collaboration. There are considerations that take on added importance when informatics technologies are extended beyond the corporate firewall. These include cost, ease of use, and security, as well as the interests of specific in-house stakeholders. An open collaborative solution championed by end users and IT groups (because of the productivity improvements or cost savings that could be gained) may not pass muster with the legal department if it is believed to compromise the integrity of competitive corporate data. On the other hand, a tightly controlled approach (such as deploying redundant information systems at outsourcers’ physical sites) may be too costly, cumbersome, and inefficient. When it comes to collaborative technologies, flexibility is key. This is where cloud-based solutions offer a compelling opportunity: Organizations can create hosted data-exchange portals in the cloud that are easy to access via a Web browser from multiple locations. Such portals must be simple and affordable to scale as new partners come online, yet flexible enough to grant varying degrees of data access. The attractive part of this approach is that organizations can keep their sensitive data secure on-premises, while using the cloud to collaborate.
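The “varying degrees of data access” idea can be sketched as a simple tiered-clearance rule. The role names, tiers, and mapping below are hypothetical assumptions for illustration, not the configuration of any actual portal product.

```python
# Hypothetical sketch of tiered data access in a hosted collaboration portal.
# Roles, tiers, and the role-to-tier mapping are illustrative assumptions.

ACCESS_TIERS = {"public": 0, "project": 1, "restricted": 2}

PARTNER_ROLES = {
    "prospective_partner": "public",      # pre-contract: public data only
    "cro_chemist": "project",             # active partner: project-level data
    "cmo_engineer": "project",
    "external_auditor": "restricted",     # may also view restricted records
}


def can_access(role, document_tier):
    """A partner may read a document only if their clearance tier is at
    least as high as the tier assigned to the document."""
    clearance = ACCESS_TIERS[PARTNER_ROLES[role]]
    return clearance >= ACCESS_TIERS[document_tier]


assert can_access("external_auditor", "restricted")
assert can_access("cro_chemist", "project")
assert not can_access("prospective_partner", "project")
```

In this scheme, onboarding a new partner is a one-line change to the role map rather than a new deployment, which is the scalability property the article argues hosted portals should have.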