Magazine Article | March 8, 2017

Strategy For Early-Phase API Development

Source: Life Science Leader

By Sriram Naganathan, senior director of chemical development, Dermira

Most often, the delay in initiating human clinical trials is caused by the unavailability of suitable drug. Therefore, there is great pressure on the chemical development group not to be the limiting factor and to ensure sufficient drug is available in a timely manner. As drug development advances into later phases, drug supply is usually less of a constraint, because the physicochemical properties of the drug are better understood and a reliable supply chain begins to take shape.

There are multiple best practices for rapid filing of an investigational new drug (IND) application and initiating clinical trials. But there are only a few trusted principles that ensure the chemical development group is not a barrier to the drug supply.

“PERFECT IS THE ENEMY OF THE GOOD”
When planning for the initial delivery of clinical trial material (both drug substance and drug product), the focus should be on the delivery of enough material to cover the Phase 1 trial and also initiate Phase 2 trials if needed. Along the way, material needs for formulations development may be fulfilled through judicious planning of batch size and production schedule. If the clinical program is successful, there will be sufficient time and resources to develop an elegant and cost-effective commercial process. At this stage, a process that can be scaled to produce a consistent and predictable quality of the drug substance is entirely adequate.

ONE BATCH VS. TWO BATCHES
Initially, drug substance requirements are generally small and limited to conducting the nonclinical toxicology studies that enable the initiation of Phase 1 human trials. Drug substance needs for Phase 1 trials tend to be a few kilograms to a few tens of kilograms and usually constitute the entire world's supply of the drug! Thus the debate arises over whether the chemical development group should make one batch of the drug substance for use in both the nonclinical toxicology studies and the Phase 1 trials, or adopt a two-batch strategy, making a smaller batch for the toxicology studies followed by a batch manufactured under cGMP for the Phase 1 trials. Each approach has its advantages and associated risks.

"In our experience, the two-batch strategy has been employed successfully in over 20 programs."

The one-batch strategy is a linear process: once the material is manufactured, it can be used in development activities without risk of interruption. However, all chemical development and manufacturing must be completed before any dosing, human or animal, can begin, which could take about six to 12 months for a typical small molecule. Adding the customary six months it takes to obtain reports from IND-enabling toxicology studies, the time to first-in-human dosing could be 12 to 18 months from nomination of a clinical candidate.

Employing a two-batch strategy is likely to be considerably faster because several activities happen in parallel. First, a smaller batch of drug substance is prepared and used to initiate the toxicology studies, typically three to six months from nomination of the candidate. In the six months it takes to conduct the toxicology studies, the process is refined sufficiently to enable the cGMP manufacture of the batch to be used in Phase 1 trials (commonly referred to as GMP-1), just in time for the IND application to become effective. In this approach, the clinical trials start nine to 12 months from candidate nomination.

In our experience, the two-batch strategy has been employed successfully in over 20 programs, resulting in a median time of 11 months from candidate nomination to being ready to dose humans.
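The timeline comparison above comes down to simple arithmetic: in the one-batch strategy the steps are sequential and their durations add, while in the two-batch strategy process refinement and cGMP manufacture overlap with the toxicology studies. The short sketch below makes that explicit; it is purely illustrative, and the function names and month figures are only the typical ranges quoted in this article, not data from any specific program.

```python
# Illustrative arithmetic for the one-batch vs. two-batch timelines.
# Month values are the article's typical ranges, used here as examples.

def one_batch_timeline(chem_dev_months, tox_months):
    # One-batch: chemical development and manufacture must finish
    # before the toxicology studies can begin, so durations add.
    return chem_dev_months + tox_months

def two_batch_timeline(tox_batch_months, tox_months, gmp_months):
    # Two-batch: a small tox batch starts the clock; process
    # refinement and cGMP manufacture run in parallel with the
    # toxicology studies, so only the longer of the two counts.
    return tox_batch_months + max(tox_months, gmp_months)

# One-batch: 6-12 months of development plus ~6 months of tox reports.
print(one_batch_timeline(6, 6), one_batch_timeline(12, 6))    # 12 18

# Two-batch: tox batch in 3-6 months, then ~6 months of parallel work.
print(two_batch_timeline(3, 6, 6), two_batch_timeline(6, 6, 6))  # 9 12
```

The parallel path yields the article's nine-to-12-month range, consistent with the reported median of 11 months from candidate nomination to first human dose.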

THE CHALLENGES
The cornerstone of chemical development is ensuring that results obtained in clinical and nonclinical studies using every batch of drug substance can be connected to past and future batches. It is not as important to have the highest purity levels from the outset as it is to have progressively lower levels of the same set of impurities in every successive batch, or to have eliminated most of them altogether. Every impurity above a certain level must be qualified in toxicology studies. The appearance of a previously unseen impurity would be a significant problem and could delay clinical studies.

The majority of challenges during early process development and synthesis fall into the following areas:

IMPURITY CONTROL
It is difficult to manage impurities arising from starting materials early in development, especially when the route of synthesis is still being worked out. In fact, the starting materials may themselves be under development, to say nothing of the process for their manufacture. The most common way to ensure that starting materials do not contribute different impurity profiles in the two-batch strategy is to use the same batch of key starting materials for both the toxicology and the clinical batches.

Better yet, if a route of synthesis can be established early, the manufacture of advanced intermediates can be undertaken at a CDMO while process development is ongoing. In some cases, we have used the synthesis of an advanced intermediate by a CDMO to evaluate the CDMO's suitability for manufacturing the drug substance, especially if it has the capability to operate under cGMP.

Ultimately, preventing impurities from cropping up unexpectedly depends on the quality of the analytical methods (see below). As the chemical process is refined and side reactions are brought under control, once-insignificant, or even once-invisible, impurities can become prominent.

SALT SELECTION AND POLYMORPHISM
A form (polymorph) screen should be undertaken as soon as the drug candidate is identified. Because most drug candidates have basic or acidic functional groups, a salt is often appropriate both for securing a robust solid-state form and as a control point for impurity rejection. That said, one should not dismiss developing the free base (or free acid) as the final form. There have been many instances where a salt adds little to bioavailability or stability compared with the free base; even so, the formation and breaking of a salt provide an important stage for purification. Quite often there is more than one viable salt or polymorph, and the final form for development may not reveal itself until well into the development process. The toxicology studies may therefore have to be initiated with one salt (form) while a different salt (form) is preferred for human trials; this switch requires a bridging toxicology/pharmacokinetic study.

STABILITY & ANALYTICAL METHOD DEVELOPMENT
Development of analytical methods and controls to assess impurities, forms, and stability (degradation) of the drug substance must be undertaken early in development. This is especially true for developing stability-indicating methods and obtaining an early indication of the stability of the selected form of the drug substance. This is one aspect of development that cannot be accelerated by throwing more resources at it! A six-month stability study takes six months no matter how hard one tries to shorten that duration.

CDMO SELECTION
Working with a compressed timeline means that several activities happen simultaneously, so judicious selection of a CDMO becomes very important, including the choice between a one-stop provider and several specialty experts. While a one-stop provider may understand the context and timelines better, it may lack the insight and expertise for specialized problems (e.g., polymorph-related issues). The most important characteristics are adaptability to rapidly changing conditions, technical expertise, and the ability to communicate clearly and in a timely manner.