Magazine Article | March 4, 2013

Making Quality A Priority In Clinical Trials

Source: Life Science Leader

By Abraham Gutman, founder, AG Mednet

The words “quality” and “compliance” are ubiquitous in the realm of image-intensive clinical trials. Yet, when it comes to implementing methods to improve and maintain data quality, we often concentrate on ways to correct mistakes instead of on systems and tools that avoid issues in the first place.

Over the last several years, the industry has developed standardized metrics to measure efficiency, and a subset of those metrics targets the speed with which image data is submitted and quality controlled, as well as the speed with which queries are generated and resolved. Core labs typically check 100% of the data submitted, applying significant resources and time to checking the completeness of submitted data and to identifying discrepancies between data elements (e.g., images and case report forms), variations from stipulated image acquisition protocols, or the presence of protected health information in the data set.

Fast Data Doesn’t Mean Quality Data
Investigator sites collate the data they have to send and try to follow the instructions given during the investigator meetings held at the outset of the clinical study. These instructions are complex to begin with. Moreover, the person who attended the investigator meeting is often not the same person sending the data a few months later, so the expectation that the sender will assemble the submission according to the complex requirements of the study is not always realistic.

Trial coordinators are typically healthcare providers whose primary role is patient care, and the activities required to submit data to a clinical trial, however important, never rise to the importance of treating the people under their care. Given that environment, it is not surprising that image-based submissions have discrepancies and generate queries. What is surprising is that the best we offer coordinators is systems that let them send what they have as fast as possible, so that core labs can find the inherent problems quickly and generate and resolve the inevitable queries. This has a side effect: investigators are bombarded with queries about their data, creating more work they cannot focus on because they must take care of their patients. In the best of cases, this vicious cycle leads to delays and cost overruns; in the worst, it leads to the potential loss of subjects from the study.

Setting Quality Standards at the Site
The issue of data quality in imaging trials must be confronted as close to the point of origin as possible, and that means at the sites. Many imaging trial submissions still rely on paper case report forms and physical media. In some cases, we have been hiding behind a perception that sites don’t want to adopt tools to improve data quality prior to submission, and that we should simply acquiesce under the guise of not wanting to inconvenience them. The idea that sites like the status quo is misguided at best. Perhaps what sites don’t want are tools they are forced to use but that bring no benefit to them, only to those downstream in the process (e.g., simple electronic transfer). It is clear that these dedicated professionals want to do their best, with as few data clarification requests as possible. Error avoidance is better than error correction for all parties, especially the sites.

For that reason, I believe we have an obligation to provide site coordinators with the tools they need to avoid errors in assembling data submissions and to check for errors in the image data they receive from radiology departments and imaging centers. For maximum effectiveness and impact, these tools should automate the majority of the QC steps on the data being prepared for submission up front, rather than after it has been received at the imaging core lab. Just as with the advent of EDC (electronic data capture), an up-front approach to data quality assurance for imaging submissions will dramatically increase protocol compliance, reduce discrepancies, and prevent future errors from taking place.
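To make the idea concrete, here is a minimal sketch of what automated pre-submission checks might look like at a site. It assumes images in DICOM format and uses the open-source pydicom library to read header fields; the list of required tags and the simple PHI screen are illustrative assumptions, not a description of any particular sponsor’s imaging charter or core lab’s rules.

    # Illustrative pre-submission QC sketch. Assumptions: DICOM files on disk,
    # pydicom installed; the required-tag list and PHI screen below are
    # hypothetical examples, not an actual study's requirements.
    from pathlib import Path
    import pydicom

    REQUIRED_TAGS = ["PatientID", "StudyDate", "Modality", "SeriesDescription"]
    PHI_TAGS = ["PatientName", "PatientBirthDate", "OtherPatientIDs"]

    def check_file(path):
        """Return human-readable findings for one DICOM file."""
        findings = []
        ds = pydicom.dcmread(path, stop_before_pixels=True)  # headers only

        # Completeness: every tag the submission needs must be present and non-empty.
        for tag in REQUIRED_TAGS:
            if not str(ds.get(tag, "")).strip():
                findings.append(f"{path.name}: missing or empty {tag}")

        # PHI screen: identifying fields should be removed or blanked
        # before the images leave the site.
        for tag in PHI_TAGS:
            if str(ds.get(tag, "")).strip():
                findings.append(f"{path.name}: possible PHI in {tag}")

        return findings

    def check_submission(folder):
        """Run the checks over every .dcm file in a submission folder."""
        findings = []
        for path in sorted(Path(folder).glob("*.dcm")):
            findings.extend(check_file(path))
        return findings

    if __name__ == "__main__":
        issues = check_submission("outgoing_submission")
        for issue in issues:
            print(issue)
        print(f"{len(issues)} potential issue(s) found before submission.")

A tool along these lines lets the coordinator see and fix problems while the data is still at the site, rather than discovering them weeks later as queries from the core lab.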

The most successful industries check the complex elements that make up their products before those elements are sent for final assembly. Quality checks are then done on smaller samples, not on 100% of every subassembly received. We must adopt this view if we are serious about reducing the cost and delays associated with bringing drugs to market.