By Amy Furlong
All new pharmaceutical products must go through comprehensive testing to assess their effects on patients before being approved for market release. Electrocardiogram (ECG) data collection and analysis serves a vital role in clinical trials, as it is used to assess the effect of investigational drugs on the electrical functions of the heart. In recent years, cardiac safety concerns have seen many drugs refused regulatory approval or withdrawn from the market, which highlights the need for reliable, accurate collection and interpretation of ECG data during the clinical trial process. However, the collection, analysis, and interpretation of this data is a complex process and a topic of much debate within the pharmaceutical industry.
Why ECGs Are Important
Recent concerns over the cardiac effects of new pharmaceutical products have led to greater regulatory scrutiny for all new compounds and drugs, as well as greater awareness among pharmaceutical companies regarding the potential effects their products may have on cardiac safety. While there is no regulatory mandate for ECG assessment across all clinical trials, the FDA has mandated that new compounds, with limited exceptions, undergo a Thorough ECG Trial (TET). This requirement follows the finalization of the ICH E14 guidelines, which were developed to assess QT/QTc prolongation in new drugs and thereby determine cardiac safety risks.
The ICH E14 guidelines recommend that a TET should be performed, and if any cardiac concerns are raised, Phase III trials will require more robust or intense ECG collection. Unlike many other clinical trials, the TET typically uses a centralized system, which has been shown to greatly improve the accuracy and reliability of ECG data in clinical trials. A centralized approach uses a core laboratory which supplies digital ECG machines to the monitoring sites and then analyzes the data.
The Limitations Of Decentralization
Traditional ECG data-gathering methods use a decentralized model. A decentralized ECG system is typically carried out across multiple investigator sites using local ECG machines. However, this approach has some significant drawbacks. The use of different machine types at different sites means that interval duration measurement (IDM) data is often inconsistent, as not all machines use the same algorithms to calculate it. The over-read of the ECG output is not analyzed consistently across investigational sites, and additional fees are required for professional cardiologist over-reads on a site-by-site basis. Data is typically collected only at the beginning and end of the trial, which provides very little information about the cardiac effects of the compound being studied.
When using a decentralized approach, a typical ECG machine generates a paper printout of the IDM data, which must then be transcribed before analysis. However, transcription errors can create further inconsistencies in ECG results. Other unexpected results can occur due to differences in the site investigator’s interpretation of the results.
A centralized approach uses digital ECGs and a core laboratory which handles much of the work otherwise done by clinical trial sponsors, CROs, and individual monitoring sites. The core lab typically provides all ECG equipment to investigator sites. This means the core laboratory can ensure that standard equipment is distributed that has been tested to full functionality and is programmed to capture the correct demographic data for the study.
The use of digital ECG data collection at the core laboratory speeds up the reading process and returns much cleaner data by eliminating transcription and misinterpretation errors which are commonplace with a decentralized approach. Some core laboratories have systems in place that automatically check for missing visits or any changes in demography. Each ECG is evaluated by a qualified cardiologist at the core laboratory to ensure data quality, integrity, and consistency.
The FDA recommends, in the ICH E14 document, that centralization should be used where cardiac concerns are raised, which highlights the improved results generated by this approach. However, approximately two-thirds of ECGs collected in clinical trials are still obtained using traditional decentralized paper methods. This continued use of decentralized systems is partly due to the lack of a regulatory mandate in this area and the misconception that centralized systems are more costly.
The True Cost Of Improved ECG Clarity
The perception that centralization costs sponsors more money is a major reason for the continued use of a decentralized approach. However, there is much debate about estimating the true cost of centralized versus decentralized ECGs.
First, the number of ECGs required is difficult to quantify in advance of a study program, as staffing costs, the number of investigator sites, and the number of ECG machines needed are unknown and vary with study design.
Second, when using a decentralized model, much of the ECG data collection, transcription, and interpretation is carried out by the sponsor and the individual monitoring sites, which means that many companies using this approach see the core laboratory of a centralized system as an unnecessary additional expense. However, the added value of digital collection, improved accuracy, and reliability can actually help sponsors reduce costs: by eliminating errors in the collection and transcription of ECG data, sponsors reduce the amount of retesting that must be carried out.
Additional costs include fees that are paid to the site in the decentralized model, which include a technical fee for the ECG acquisition and a professional fee for evaluation of the ECG. These fees are based on standard medical reimbursement rates (CPT codes). In many settings where a qualified cardiologist is not available, the site needs to employ the necessary qualified expertise, which can generally cost from $75 to $250 per ECG. The centralization of ECG analysis eliminates these unnecessary over-read fees and also allows a reduction in the standard site payments for technical fees. A useful comparison is the analysis of blood tests in clinical trials, which was originally decentralized and required the management of site-specific laboratories with multiple normal ranges, data quality issues, and high costs due to site-specific requirements. All blood work is now performed centrally by experienced core laboratories, which provide an efficient, consistent, and high-quality service.
A further cost associated with centralized ECG trials relates to the rental, storage, and shipping of ECG machines to each investigator site. A typical ECG machine can weigh between 7 and 10 pounds and can be of substantial size, which means it can be expensive to transport and store. In addition, maneuvering and preparing the machines for use can be time-consuming and difficult for inexperienced users. The average rental cost of such a machine generally varies between $100 and $150 per month. Reducing the acquisition fee — which includes the amount of rental paid for the ECG instrumentation — is one way of lowering costs. The use of centralized equipment is an integral feature of a core laboratory, and therefore the sponsor should not technically have to pay extra for machine rental.
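To make the cost trade-off above concrete, the following is a minimal back-of-the-envelope sketch in Python. It uses the per-ECG over-read fees ($75–$250) and monthly rental rates ($100–$150) cited in this section; the study size, technical acquisition fee, and core-lab analysis fee are purely illustrative assumptions, not figures from any real trial.

```python
# Hypothetical cost comparison: decentralized vs. centralized ECG collection.
# Over-read fee ($150) and rental rate ($125/month) are midpoints of the
# ranges cited in the article; technical_fee and core_lab_fee are assumed
# for illustration only.

def decentralized_cost(n_ecgs, overread_fee=150.0, technical_fee=50.0):
    """Per-ECG site fees: a technical acquisition fee plus a
    professional over-read fee paid site by site (CPT-style billing)."""
    return n_ecgs * (technical_fee + overread_fee)

def centralized_cost(n_ecgs, n_sites, months, rental=125.0, core_lab_fee=40.0):
    """Core-lab analysis fee per ECG plus machine rental per site per month;
    site-by-site over-read fees are eliminated."""
    return n_ecgs * core_lab_fee + n_sites * months * rental

# Assumed study: 20 sites, 12 months, 2,000 ECGs in total.
dec = decentralized_cost(2000)        # 2000 * (50 + 150)
cen = centralized_cost(2000, 20, 12)  # 2000 * 40 + 20 * 12 * 125
print(f"Decentralized: ${dec:,.0f}")  # Decentralized: $400,000
print(f"Centralized:   ${cen:,.0f}")  # Centralized:   $110,000
```

Under these assumed inputs the centralized model comes out well ahead, but the point of the sketch is the structure of the comparison — per-ECG professional fees dominate the decentralized side, while the centralized side trades them for fixed rental and core-lab costs — rather than any specific totals.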
What The Future Holds For Centralization
Although centralization offers clear advantages over the decentralized model, demand for improved accuracy, reliability, and cost-effectiveness continues to grow. This creates a clear need for innovative new instruments that can help sponsors achieve these goals while overcoming the perceived challenges of centralization.
The issue of large, heavy, and expensive instrumentation could be tackled with the introduction of highly compact ECG machines that have a much smaller footprint than existing systems. A number of new ECG machines on the market have already made strides toward this goal, offering highly compact instrumentation that is a fraction of the size of traditional machines while still providing full ECG functionality. These smaller machines are easy to maneuver and are less expensive to ship and store.
Advances in software allow these new instruments to integrate into existing computer systems, allowing key data, such as demographics and algorithms, to be automatically downloaded before a trial. The ability to download this data is a substantial benefit in terms of both staff time and cost, especially to sponsors of studies involving noncardiac drugs, where the investigator site is not familiar with ECG systems.
Traditional ECG machines produce a paper printout of all the key ECG data, which is then transcribed before the results are analyzed. However, transcription errors are common, leading to inaccurate results and undermining the overall validity of the trial. Eliminating the printout by enabling the machines to upload data directly to the laboratory computer system would remove the possibility of transcription errors and improve the accuracy and timeliness of the data, while saving staff time and cost in the process.
Increasingly, regulators are requesting that studies submit ECG data to a central digital system, also known as a data warehouse, to assist with regulatory inspections. All data stored on the system can then be accessed by regulators to quickly and efficiently analyze the quality of the data. Even though this is not a mandatory requirement as yet, most clinical trial sponsors are currently complying with this request. A centralized ECG system makes this request easy to comply with, as all data is already stored centrally and simply has to be transferred to the database as required.
The Challenges Of This Approach
Although there are currently no plans to enforce centralization as an industry standard, the ICH E14 guidance highlights centralization as a more robust means of ECG data collection. While the improvements to data quality, as well as reduced workload for sponsors and investigator sites, are evident when using a centralized ECG system, demonstrating the true added value of this approach remains a challenge. However, it is clear that when all the costs are added up, centralization has benefits that cannot be matched by decentralized systems.
By leveraging new technology and ECG solutions that are designed to integrate easily into existing systems, clinical trial sponsors can benefit from a more efficient and cost-effective system. New instrumentation that is currently being developed will help clinical trial sponsors ensure future patient safety and regulatory compliance by reducing site burden and increasing accuracy, reliability, and usability. Such equipment developments will help to further streamline the ECG collection and analysis process and overcome many of the perceived challenges of centralized systems.
About The Author
Amy Furlong is executive VP of cardiac safety regulations at eResearchTechnology (ERT). She holds a bachelor’s degree in biology and a master’s degree in quality assurance and regulatory affairs. She has more than 10 years of clinical research experience, specializing in regulatory compliance and computer system validation.