Magazine Article | July 1, 2020

AI & The Global Data Quality Imperative

Source: Life Science Leader

By Steve Gens & Remco Munnik

Life sciences companies are well aware of the huge potential of emerging technologies, including artificial intelligence/machine learning (AI/ML), for transforming the efficiency and efficacy of regulatory submissions and ongoing license management.

The potential for process innovation is considerable, from automated document preparation to the introduction of timely prompts and alerts as submission/renewal deadlines approach or when required parameters have not been met. However, in their enthusiasm to harness such options, companies can seriously underestimate the work required to improve the quality and governance of the underlying data such systems will be working from. Here are five data governance prerequisites for AI-enabled transformation of regulatory information management and content preparation.

1. DEDICATED ROLES & RESPONSIBILITY AROUND DATA QUALITY

Data quality sustainability should be an organization-wide concern, necessitating a culture of quality and, where appropriate, clear accountability built into people’s roles. Allocated responsibilities should ideally include:

  • Quality control analysis. Someone who regularly reviews the data for errors, for example, sampling registration data to see how accurate and complete it is.
  • Data scientist. Someone who works with the data, connecting it with other sources or activities (e.g., linking the company’s regulatory information management [RIM] system into clinical or ERP systems, with the aim of enabling something greater than the sum of the parts, such as “big picture” analytics).
  • Chief data officer. Someone with a strategic overview across key company data sources. This person is responsible for overseeing governance, standards, and investments in enterprise information assets — including ERP, RIM, and safety systems — to ensure the data they contain is reliable, accurate, and complete and remains so over time.

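The quality control analyst’s routine described above — regularly sampling registration data and checking how complete it is — can be sketched in a few lines. This is a minimal illustration only; the record fields, schema, and sample size are assumptions for the example, not a prescribed standard:

```python
import random

# Illustrative required fields for a registration record (assumed schema).
REQUIRED_FIELDS = ["product_name", "market", "registration_status", "approval_date"]

def sample_records(records, sample_size, seed=42):
    """Draw a reproducible random sample of records for manual review."""
    rng = random.Random(seed)
    return rng.sample(records, min(sample_size, len(records)))

def completeness_rate(records):
    """Fraction of records in which every required field is populated."""
    def is_complete(rec):
        return all(str(rec.get(f) or "").strip() for f in REQUIRED_FIELDS)
    if not records:
        return 0.0
    return sum(1 for rec in records if is_complete(rec)) / len(records)

# Two toy records: one complete, one with a blank status and missing date.
registrations = [
    {"product_name": "Drug A", "market": "DE",
     "registration_status": "Approved", "approval_date": "2019-03-01"},
    {"product_name": "Drug B", "market": "FR",
     "registration_status": "", "approval_date": None},
]
sample = sample_records(registrations, sample_size=2)
print(f"Completeness: {completeness_rate(sample):.0%}")
```

Run on a regular cadence (e.g., monthly), a check like this turns “how accurate and complete is our data?” from an impression into a tracked number.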
2. A QUALITY CONTROL ROUTINE

To steadily build confidence and trust in data, it is important to establish good data hygiene practices and build these into everyday processes. In so doing, companies can avoid the high costs and delays caused by data remediation exercises, which can run into millions of dollars or euros. Spending just a fraction of that amount on implementing good data habits and assigning dedicated resources is cost-effective and will pay dividends in the long term.

Operationalizing data quality standards is important. This includes making sure that teams across all parts of the organization use the same names, references, and agreed formats when entering or amending data, and that they do not leave fields blank or incomplete. Standards should also encompass requirements to link data with related documents, such as registration status information and any pertinent correspondence with the authorities.

Not all data quality errors are equal in their potential impact, so it should be possible to flag serious issues for urgent action and track the origin of errors. Doing so will help target additional training and support where they are needed. Making data-quality performance visible can be a useful motivator by drawing attention to where efforts to improve data quality are paying off. This is critical for our next point.
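The idea that serious issues should be flagged for urgent action while error origins are tracked can be made concrete with a simple triage structure. The severity scale, field names, and origin labels below are illustrative assumptions, not part of any standard:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Finding:
    record_id: str
    field: str
    issue: str
    severity: str   # "critical" or "minor" (illustrative two-level scale)
    origin: str     # team or affiliate where the error was introduced

def triage(findings):
    """Separate findings needing urgent action, and count error origins
    so that training and support can be targeted where needed."""
    urgent = [f for f in findings if f.severity == "critical"]
    by_origin = Counter(f.origin for f in findings)
    return urgent, by_origin

findings = [
    Finding("REG-001", "registration_status", "blank field", "critical", "affiliate-DE"),
    Finding("REG-002", "product_name", "non-standard name", "minor", "affiliate-FR"),
    Finding("REG-003", "approval_date", "blank field", "critical", "affiliate-DE"),
]
urgent, by_origin = triage(findings)
print(f"{len(urgent)} urgent finding(s); top origin: {by_origin.most_common(1)}")
```

Publishing the per-origin counts (anonymized or not, as culture dictates) is one way to make data-quality performance visible, as the next point encourages.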

3. RECOGNITION & REWARDS SYSTEMS

Everyone likes to be appreciated for their efforts, so it is important to recognize people/teams/countries/regions that have made the biggest transformations in their data quality. Transparent recognition will continue to inspire good performance, accelerate improvements, and solidify best practices, which can be readily replicated across the global organization to achieve a state of continuous learning and improvement.

"Companies cannot confidently innovate with AI and process automation based on data that is not properly governed."

Knowing what “good” looks like, and establishing KPIs that can be measured against it, are important, too. People who are assigned to be responsible for data quality should have their performance measured via job appraisals/reviews and be rewarded for improvements.

4. A MATURE & DISCIPLINED CONTINUOUS IMPROVEMENT PROGRAM

According to 2018 research from Gens & Associates, life sciences companies with a regulatory continuous improvement program (CIP) have 15 percent higher data confidence levels, are 17 percent more likely to have achieved real-time information reporting, and have 21 percent higher efficiency ratings for key RIM capabilities.

Continuous improvement is both an organizational process and a mindset. It requires progress to be clearly measured and outcomes tied to business benefits. As the management consultant Peter Drucker famously said, “If you can’t measure [something], you can’t improve it.” A successful CIP in regulatory data management combines anecdotal evidence of the value that can be achieved with clear KPIs (e.g., cycle time, quality, volume) that teams can aim toward and be measured against.
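A KPI of the kind mentioned, such as cycle time, can be measured against a target in a few lines. The dates and the target value here are invented for illustration; an internal target would be set by the CIP team, not taken from this sketch:

```python
from datetime import date
from statistics import median

# Illustrative submissions: (dispatch date, approval date) pairs.
submissions = [
    (date(2020, 1, 10), date(2020, 4, 20)),
    (date(2020, 2, 1), date(2020, 5, 15)),
    (date(2020, 3, 5), date(2020, 9, 30)),
]

TARGET_DAYS = 150  # assumed internal target, not a regulatory figure

cycle_times = [(approved - dispatched).days for dispatched, approved in submissions]
print(f"Median cycle time: {median(cycle_times)} days (target: {TARGET_DAYS})")
print("On target" if median(cycle_times) <= TARGET_DAYS else "Improvement needed")
```

The median is used rather than the mean so that a single long-running outlier submission does not mask the typical experience.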

At its core, continuous improvement is a learning process that requires experimentation with incremental improvements. We recommend collating multiple ideas from across the organization, performing root-cause analysis, and agreeing on KPIs that help people focus on the main priorities for change.

Good governance will also be important, as will measuring and reporting on improvements and net gains and how they were achieved: what resources were allocated, what changes were made, and what impact these have had.

5. DATA STANDARDS MANAGEMENT

Intensifying international regulatory and safety initiatives are creating whole new rafts of specifications about how data should be captured, categorized, formulated, and applied. All of this is being done to create greater harmony in information handling and comparable product insights within organizations and across global markets.

Too often today, data is not aligned, and standards vary or simply do not exist. The result is that the right hand doesn’t know what the left is doing. Ask representatives from regulatory, pharmacovigilance, supply chain, and quality how they define a product or how many products their company has, and no two answers will be the same.

The more that all companies keep to the same regimes and rules, the easier it will be to trust data and what it says about companies and their products. In this scenario, it becomes easier to view, compare, interrogate, and understand who is doing what, and how, at a community level.

Evolving international standards such as ISO IDMP (Identification of Medicinal Products) and SPOR (Substance, Product, Organization and Referential) mean that companies face having to add and change the data they are capturing over time. To stay ahead of the curve, minimize the impact of changes, and avoid the risk of noncompliance, life sciences companies need a sustainable way to keep track of what’s coming and a plan for adapting to and managing new requirements.

Delegating this responsibility solely to those responsible for quality is likely to be unsuccessful, as there is too much detail to keep track of. Regulatory specialists, on the other hand, may understand the broad spectrum of needs but not how to optimize data preparation for the broader benefit of the business, such as harnessing data standardization initiatives under IDMP to simultaneously create a robust data bedrock for AI-based analytics and intelligent process automation. This may be where organizations have to seek external help in striking the optimal balance between regulatory duty and strategic ambition.

FUTURE AI POTENTIAL DEPENDS ON DATA QUALITY SUSTAINABILITY INVESTMENT TODAY

The important takeaway from all of this is that companies cannot confidently innovate with AI and process automation based on data that is not properly governed. With the potential of emerging technologies advancing all the time, it is incumbent on organizations to formalize their data-quality governance and improve their ongoing data hygiene practices now, so they are ready to capitalize on AI-enabled process transformation when everything else is aligned.

STEVE GENS is the managing partner of Gens & Associates, a life sciences consulting firm.

REMCO MUNNIK is associate director at Iperion Life Sciences Consultancy.