Guest Column | May 18, 2020

Working With Two Consultants On One QMS Overhaul: What Could Go Wrong?

By Allan Marinelli, Quality Validation 360 Inc.


Some companies in the pharmaceutical, biopharmaceutical, vaccine, and medical device industries mistakenly assume that hiring multiple consulting firms to establish, manage, or overhaul a QMS (quality management system) and create an abundance of quality gates will automatically lead to licensure of their new blockbuster product at a low service cost. This article will discuss just such a scenario. To support licensure of its single blockbuster product, the drug sponsor first hired a reputable consulting firm (the quality consultant) to overhaul its QMS, change management system, quality operations, and related programs. The sponsor then hired a second consulting firm (the validation engineering consultant) to support the validation activities: writing computer system validation scripts, with an emphasis on data integrity, access control and security, user access levels/roles/privileges, disaster control and recovery scripts, and audit trails, and also supporting equipment, process, and cleaning validation activities.

We will review the interactions among these three parties (the sponsor and the two consulting firms) at each phase of the project, evaluate what went wrong, and discuss ways to avoid similar problems when working with multiple consulting firms simultaneously.

Initial Review Phase

The drug sponsor had successfully demonstrated that its product was effective at curing patients with a particular disease in the clinical setting. However, the sponsor had failed to qualify its manufacturing operations in a GMP (good manufacturing practice) setting. As a result, the sponsor hired a quality consultant (QC) to ensure the QMS at its site would successfully meet the GMP requirements.

The sponsor gave the QC carte blanche to handle the entire overhaul, including rebuilding the QMS and reshaping the company’s prime directives, culture, etc. Through an initial review, the QC identified many gaps and problems in the sponsor’s manufacturing and laboratory areas, including, but not limited to, the following:

  • Mold was visible in the manufacturing areas.
  • Adherence to gowning practices outlined in the SOP was inconsistent.
  • The gowning SOP did not adequately explain, or provide ample pictorial illustrations of, how to gown correctly to meet the SOP’s written intent.
  • The environment monitoring SOPs and validation protocols needed to be rewritten to meet current GMP standards rather than clinical standards.
  • The automated equipment validation protocols could not be traced back to the user requirements specifications, functional specifications, or a requirements traceability matrix. ICH Q9, GAMP 5 principles, and software development life cycle (SDLC) approaches were not used as reference materials to ensure control of the automated equipment.
  • The process validation protocols were insufficiently rationalized and provided no statistical justification for conducting three runs.
  • Cleaning validation sampling was executed inconsistently during the cleaning validation runs, and the protocols gave insufficient direction, leading to high variance from operator to operator and from run to run.
  • There was no segregation of duties when logging into manufacturing systems such as autoclaves, parts washers, bioreactors, centrifuges, AKTA and skidded chromatography systems, etc.
  • Testing was incomplete for demonstrating positive (permitted) and negative (non-permitted) conditions for user access levels and user roles on the manufacturing production equipment and in the laboratory computerized instrument validations.
  • The IT department rarely acted to ensure that all computerized manufacturing equipment and laboratory instruments had an enforced backup/restore or disaster recovery program in place, or a suitable mitigation plan in lieu of one.
  • Because the SOPs were vague, the IT department was unsure how to interpret the backup/recovery requirements for systems, servers, etc. As a result, IT representatives performed backups whenever their own time and priorities allowed, and the methods and tools used to perform the backups were applied inconsistently.
  • The audit trail and 21 CFR Part 11 functionality features, although available in some of the manufacturing equipment and laboratory instruments, were not enabled.
  • There was no configuration specification document, nor a documented setpoint/configuration item list, for the automated equipment and laboratory instruments.
  • Setpoints and configurations were being changed on the fly by various users (with no audit trail), with no documentation attesting to each system’s initial settings and configurations and no traceability of subsequent changes (a minimal sketch of the kind of baseline check that was missing appears after this list).
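To illustrate the kind of control that was missing, the sketch below shows one simple way a site could record approved setpoints as a documented baseline and flag any drift found in a later export from the equipment. It is a minimal illustration only; the file name, parameter names, and values are hypothetical and are not taken from the sponsor’s systems.

    import json

    # Approved baseline of critical setpoints (hypothetical names and values).
    APPROVED_BASELINE = {
        "sterilization_temp_c": 121.0,
        "sterilization_time_min": 30.0,
        "chamber_pressure_bar": 2.05,
    }

    def load_current_setpoints(path):
        """Load setpoints exported from the equipment HMI (assumed to be a JSON export)."""
        with open(path, encoding="utf-8") as handle:
            return json.load(handle)

    def find_deviations(current, baseline):
        """List parameters that are missing from, or differ from, the approved baseline."""
        deviations = []
        for name, expected in baseline.items():
            actual = current.get(name)
            if actual is None:
                deviations.append(f"{name}: missing from current export")
            elif actual != expected:
                deviations.append(f"{name}: expected {expected}, found {actual}")
        return deviations

    if __name__ == "__main__":
        current = load_current_setpoints("autoclave_setpoints_export.json")
        for line in find_deviations(current, APPROVED_BASELINE) or ["No deviations from the approved baseline."]:
            print(line)

Even a rudimentary comparison like this, run under change control, would have provided the attestation of initial settings and the traceability of subsequent changes that the QC found lacking.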

The QC did a great job in the initial phases of capturing the list of gaps to be rectified, but the firm was mostly known for its expertise in the QA (quality assurance) and QO (quality operations) area. Because of that, the sponsor decided that a validation engineering consultant (VEC) would be better suited to correctly writing, executing, and generating final reports in alignment with the validation plans, test plans, test cases, test scripts, test scenarios, protocols, etc. Consequently, the sponsor hired a VEC to complete the mission.

Computer System Validation Phase

As mentioned, a number of issues were identified relative to the computer system validation portions of the project. The VEC was tasked with writing validation plans, test plans, test cases, test scripts, test scenarios, and protocols. However, the QC was given the authority to approve the work written or conducted by the VEC, even though the VEC had aligned with the sponsor during the writing process. This inevitably resulted in the documents going back and forth for many months during the pre-approval phase, when developing these types of protocols should have taken only about a week. The QC’s involvement left the sponsor with very little momentum for a long period and prolonged the inefficiency and ineffectiveness.

For example, pertaining to the autoclave, it should have taken a maximum of two weeks to compile, complete, and pre-approve the information about the security access levels, permissions/non-permissions, user roles/user rights, etc. In fact, however, this task was still not completed after four months, due to minor disagreements between the QC and the VEC. This was despite the fact that the sponsor had agreed with what the VEC initially proposed as a baseline or benchmark document for moving forward.

The sponsor had no existing document attesting to the security access levels for the autoclave, even though its computerized systems fell under the ICH Q9, GAMP 5, and SDLC approaches. The disagreements between the QC and the VEC wasted many months and burned a great deal of money for little value, all while both consultants continued to bill. Attempting to develop a perfect protocol for testing the security access level paradigms adds little value to either the big picture or the quality picture, as documents will always require subsequent revision under the SDLC approach.
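To make the point concrete, a positive/negative permission test does not need to be elaborate to be defensible. The sketch below, written in a pytest style, shows one minimal way to exercise an expected permission matrix by user role. The roles, actions, and the SimulatedHmi placeholder are hypothetical; in a real execution, the test would drive the actual autoclave interface and record the observed results as objective evidence.

    import pytest

    # Expected permission matrix: (role, action) -> allowed? (hypothetical entries)
    PERMISSION_MATRIX = {
        ("operator", "start_cycle"): True,
        ("operator", "change_setpoint"): False,
        ("supervisor", "change_setpoint"): True,
        ("supervisor", "edit_audit_trail"): False,
        ("administrator", "create_user"): True,
    }

    class SimulatedHmi:
        """Placeholder for the real HMI/SCADA access-control interface."""
        def attempt(self, role, action):
            # A real implementation would log in under `role`, attempt `action`,
            # and return whether the system permitted it.
            return PERMISSION_MATRIX.get((role, action), False)

    @pytest.mark.parametrize(
        ("role", "action", "allowed"),
        [(r, a, ok) for (r, a), ok in PERMISSION_MATRIX.items()],
    )
    def test_role_permission(role, action, allowed):
        """Verify that each role is permitted or denied exactly as specified."""
        assert SimulatedHmi().attempt(role, action) == allowed

The value lies in documenting the expected matrix and capturing the observed outcomes, not in endlessly polishing the wording of the protocol.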

Equipment Automation Qualification Phase

The initial review phase discovered that the sponsor’s automated equipment validation protocols had many gaps and did not comply with ICH Q9, GAMP 5, and SDLC methodologies. It was not possible to determine the original setpoints and configurations of the automated pieces of equipment, as change control was never originally enforced and numerous engineering changes were completed without a paper trail. Essentially, the sponsor’s employees had been performing changes on the fly during undocumented troubleshooting sessions.

After the initial review phase was completed, an experienced automation validation engineer (AVE) representing the VEC’s firm efficiently authored many protocols. As part of the IQ (installation qualification), the configuration and setpoint parameters were carefully identified, taking into account GAMP 5, ICH Q9, and SDLC methodologies, and the relevant SOPs were rewritten or drafted for further alignment. To establish a satisfactory baseline, the SOPs were quickly approved by the relevant stakeholders from the sponsor, the QC, and the VEC before any pre-approvals of validation protocols.
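Because the original protocols could not be traced back to the user requirements specifications, one practical element of a GAMP 5-aligned rewrite is a traceability check. The sketch below, with hypothetical file names and column headings, shows one minimal way to confirm that every URS ID appears somewhere in the requirements traceability matrix; it is offered only as an illustration of the principle, not as the AVE’s actual method.

    import csv

    def load_ids(path, column):
        """Read a column of IDs from a CSV export (e.g., a URS list or an RTM export)."""
        with open(path, newline="", encoding="utf-8") as handle:
            return {row[column].strip() for row in csv.DictReader(handle) if row[column].strip()}

    def uncovered_requirements(urs_path, rtm_path):
        """Return URS IDs that never appear in the requirements traceability matrix."""
        requirements = load_ids(urs_path, "urs_id")
        traced = load_ids(rtm_path, "urs_id")
        return requirements - traced

    if __name__ == "__main__":
        missing = uncovered_requirements("urs_list.csv", "rtm_export.csv")
        print("Untraced requirements:", sorted(missing) or "none")

Running such a check before routing protocols for pre-approval gives stakeholders objective evidence of coverage rather than another round of subjective comments.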

After the baseline SOPs were approved, the protocols written by the experienced AVE were soon ready for formal pre-approval, since all parties had previously scrutinized and commented on them to their satisfaction. Most of the stakeholders pre-approved the list of protocols within two days after the relevant SOPs were approved, but one stakeholder from the QC declined to pre-approve them, objecting to the font type and some minor formatting of tables. Although neither violated any of the sponsor’s SOPs or validation templates, the AVE immediately adjusted the fonts and table formatting as requested. Most of the stakeholders then pre-approved the protocols again, but the same QC stakeholder demanded further changes to the entire content of the protocols, falsely claiming that the FDA would not accept the documents as written, without providing any justifiable evidence.

When both the sponsor and the VEC questioned the QC stakeholder, they reminded him that all parties had been involved in generating the new protocols from their inception and that the comments from all relevant stakeholders had already been incorporated. The QC representative replied that he had forgotten to include additional comments on the protocols. This cycle was repeated more than 15 times, as the QC representative kept generating additional comments. Meanwhile, other stakeholders, out of the blue, jumped on the bandwagon and joined the QC representative’s erroneous approach, adding more non-value-added comments in pursuit of a “perfect” protocol (a perfect protocol is a fallacy, and to assume such a protocol exists is to live in a world of make-believe).

The protocols the VEC’s representative wrote to support the sponsor were based on similar protocols developed for other clients, protocols that had previously been inspected and accepted by the FDA. What should have been a relatively simple exercise had become a cumbersome, months-long effort to get even a single protocol pre-approved. On top of that, the consultants ended up charging the sponsor at least twenty times what the work should have cost. Despite all this turmoil, the sponsor was finally able to manufacture its blockbuster product, as it subsequently received FDA approval to manufacture.

Conclusion

The nightmare scenario described in this article highlights the importance of three best practices when involving multiple consultants in a project: (1) increase collaboration among all parties; (2) respect the technical input of every individual, and consider the corresponding output for each input, regardless of which party provided it; and (3) accept the best technical input available at the least cost to the project, without compromising the quality paradigms.

About The Author:

Allan Marinelli has acquired over 25 years of worldwide cGMP experience in Belgium, France, South Korea, China, India, Canada, and the U.S. under the FDA, EMA, SFDA, KFDA, WHO, and other regulators. He is currently the president/CEO of Quality Validation 360 Inc., providing consultation services to the (bio)pharmaceutical, medical device, and vaccine industries. Marinelli has authored peer-reviewed papers (Institute of Validation Technology) and chapters in PDA books on validation (computer systems, information technology/infrastructure, equipment/process, etc.), risk analysis, environmental monitoring, and cleaning validation. He is an associate member of ASQ (American Society for Quality) and a member of ISPE. Marinelli has provided comments on ISPE’s Engineering Baseline guideline documents, including a recent review of the new draft ISPE Good Practice Guide: Data Integrity by Design, projected to be published in 2020. You can contact him at amarinelli360@gmail.com and connect with him on LinkedIn.