By Ulf Willen & Paul Kippax
Analysis has a pivotal role in the ongoing transformation of the pharmaceutical industry. The vision of a knowledge-driven, risk-based approach to process and product development and manufacture points the way to safer, more efficient production and, indeed, greater commercial success. However, achieving this goal — transferring to continuous production and, ultimately, real-time release — will require effective process understanding and monitoring as well as greater product understanding, both of which rely on the application of appropriate analytical technologies. This article focuses on particulate analysis, looking at how three important techniques — laser diffraction particle size analysis, image analysis, and chemical imaging — work in concert to support pharmaceutical aims throughout the product cycle.
Applying analytical techniques capable of delivering pertinent information at each stage of the product cycle is a major consideration within the pharmaceutical industry. In the early stages of a project, researchers require screening tools that rapidly narrow the field of drug candidates, but, as development progresses, more information-rich analysis is needed for detailed study. Quality by design (QbD) encourages the development team to identify and quantify correlations between key product variables and clinical performance and to understand how these are controlled by the manufacturing process. This leads to the more effective setting of product and manufacturing specifications, but intensifies the need for knowledge and for instruments that provide a cost-effective route to acquiring this knowledge.
Through pilot plant development and into manufacture, measurement speed becomes critical as the focus switches to tracking process dynamics and to continuous monitoring. At the pilot stage, real-time process analytical technologies (PAT) provide data that accelerate process scoping and optimization, as promoted by QbD. During production these tools allow manufacturers to consistently meet product quality targets and reduce reliance on prerelease QC (quality control) testing. A continuous data stream is ideal for troubleshooting, allowing operators to rapidly identify a problem and its cause, and for automated process control.
Finally, in QC the analytical demands are different again. Here, simple, automated, and highly differentiating measurement is required. Counterfeit detection, part of the overall quality control process, relies on sensitive instrumentation that can quickly identify rogue products.
Clearly the analytical tools employed at different stages will vary, but transitions from one technique to another can be complicated in terms of maintaining a consistent specification. Technologies that can be configured for both laboratory and process use, or that neatly dovetail, offer advantages.
Particle Characterization Needs
Almost all pharmaceutical actives are delivered in solid or particulate form or pass through one or the other during manufacture. This makes particle characterization a core activity in ensuring that appropriate specifications are set and maintained at every stage, from incoming raw materials through intermediates and into the final product. Finding the best analytical solutions demands consideration of which variables most accurately reflect product and process performance and which techniques support formulation, process development, manufacture, and QC.
Particle size is a critical quality attribute (a variable having direct impact on product performance and/or clinical efficacy) for many pharmaceuticals. This is recognized in ICH (International Conference on Harmonization) topic Q6A, which recommends particle size measurement for both solid dosage forms and liquids containing undissolved drug (suspensions) where size has an influence on:
dissolution, solubility, or bioavailability
product content uniformity.
For solid dosage forms, particle size is often tailored to ensure defined dissolution behavior and to control in vivo release rate. With inhaled drugs, the link with bioavailability is even more direct since particle size strongly influences deposition behavior in the lungs or nasal passages. With respect to processability, direct compression tableting provides an example of the importance of particle size, which can affect blend uniformity, flow behavior through the press, compressibility, and mechanical strength. The correlation between particle size and the tendency toward settling in a liquid drug suspension, on the other hand, illustrates the potential link between size and both stability and content uniformity.
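The size–settling link mentioned above follows from Stokes' law, which predicts how fast a small sphere settles in a dilute suspension. As a minimal illustration (the particle and fluid properties below are assumed values, not data from any specific formulation):

```python
# Illustrative only: Stokes' law links particle size to settling velocity
# for a small sphere in a dilute suspension (laminar flow, Re << 1).
def stokes_settling_velocity(d_m, rho_p, rho_f, mu):
    """Terminal settling velocity (m/s) for a sphere of diameter d_m (m),
    particle density rho_p, fluid density rho_f (kg/m^3), viscosity mu (Pa.s)."""
    g = 9.81  # gravitational acceleration, m/s^2
    return (d_m ** 2) * (rho_p - rho_f) * g / (18.0 * mu)

# Assumed example: a 10 um drug particle (1,300 kg/m^3) in water
# (1,000 kg/m^3, viscosity 1.0e-3 Pa.s). Because velocity scales with
# diameter squared, halving the particle size quarters the settling rate.
v10 = stokes_settling_velocity(10e-6, 1300.0, 1000.0, 1.0e-3)
v5 = stokes_settling_velocity(5e-6, 1300.0, 1000.0, 1.0e-3)
print(f"10 um: {v10:.2e} m/s, 5 um: {v5:.2e} m/s")
```

The quadratic dependence on diameter is why even modest size growth during storage can compromise the stability and content uniformity of a suspension.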
While these examples explain the prevalence of particle size measurement, as drug products become more sophisticated and newer analytical techniques emerge, other parameters become important. Particle shape is a prime example. Like size, shape can influence characteristics such as powder flowability and cohesivity, and product performance metrics including dissolution behavior. Shape data may differentiate, for example, between primary particles and agglomerated or foreign material, or between different crystal forms of an active.
Statistically valid shape analysis opens the possibility of correlating performance with a wider array of physical descriptors. What it does not provide is analysis of the chemical species present or, more specifically, the spatial distribution of different constituents within the particle structure. Most compositional analysis techniques give only an averaged result; few bridge the gap between physical and chemical characterization, even though species location is valuable, especially for solid dosage forms with complex, engineered delivery characteristics.
Information about individual species distribution within a particle, agglomerate, or tablet adds to particle size and shape data in a way that allows a more detailed understanding of the material being examined. At the development stage such insight supports the attainment of performance goals and development of robust manufacturing processes, again advocated by QbD. For root cause analysis, as part of QC, this clear understanding enables insightful batch failure investigation, a vital extension of this being counterfeit detection. A well-made counterfeit may be physically and compositionally identical to the original, yet remain readily detectable through differences in the distribution of the active ingredient that result from alternative manufacturing processes.
Industry-Standard Particle Sizing
Laser diffraction is a technique that measures particle size from 0.02 to 2,000 microns. It is rapid, nondestructive, and suitable for particles in dry or wet streams and for droplet measurement in emulsions and sprays. Easily automated, laser diffraction is available commercially as off-, at-, in-, and on-line particle sizing solutions.
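Laser diffraction instruments typically report a volume-weighted size distribution, commonly summarized by percentile values such as Dv10, Dv50, and Dv90 (the sizes below which 10%, 50%, and 90% of the sample volume lies). The sketch below shows how such percentiles can be interpolated from a discrete distribution; the size bins and volume fractions are invented illustrative data:

```python
import numpy as np

# Sketch: read Dv10/Dv50/Dv90 percentiles off the cumulative curve of a
# volume-weighted size distribution. Bin sizes and volumes are assumed data.
def dv_percentiles(sizes_um, volume_fractions, percentiles=(10, 50, 90)):
    """Interpolate Dvx values (microns) from a discrete volume distribution."""
    cum = np.cumsum(volume_fractions)
    cum = 100.0 * cum / cum[-1]  # normalize to a 0-100 % cumulative curve
    return {f"Dv{p}": float(np.interp(p, cum, sizes_um)) for p in percentiles}

# Hypothetical distribution: size bins (microns) and relative volume in each.
sizes = [1, 2, 5, 10, 20, 50, 100]
volumes = [2, 8, 20, 30, 25, 10, 5]
print(dv_percentiles(sizes, volumes))
```

Specifications are frequently written against these percentiles, which is one reason laser diffraction results transfer cleanly from laboratory to process instruments.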
Within the laboratory, fully automated laser diffraction particle size analyzers provide highly reproducible size distribution data. They aid learning throughout the product life cycle, but are especially valuable in early formulation and QC. In the research environment, specialized laser diffraction systems configured for spray measurement capture the evolution of droplet size in real time, extending the technology’s application, especially for characterizing orally inhaled and nasal drug products (OINDPs).
However, laser diffraction is not confined to the laboratory. High data acquisition speed and robust systems designed for continuous operation make this technique a true PAT. Specifications developed in the laboratory can be carried through to pilot scale and into commercial manufacture.
Streamlining Shape Measurement
Analyzing particle shape by traditional manual microscopy is time-consuming and subjective, drawbacks that are now overcome by automated image analysis techniques. These enable the measurement of size and shape parameters through the capture of 2-D particle images and offer statistically improved sampling as well as eliminating operator bias.
Discrete images of each particle create a databank that can be visually interrogated. The images are used to quantify shape in terms of statistically valid descriptors such as circularity, convexity and elongation, which together indicate form and regularity. Image analysis systems are most valuable for in-depth research into the process or the product. For process optimization, detailed characterization of foreign, agglomerated, and aggregated material is especially useful for identifying the source of unwanted particles. With respect to QC, image analysis supplements size data for clear differentiation.
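Descriptors like these can be computed directly from a particle's 2-D outline. The sketch below uses common textbook definitions (assumed here, not taken from any specific instrument): circularity as 4πA/P², convexity as the ratio of convex-hull perimeter to actual perimeter, and elongation from the spread of the outline along its principal axes:

```python
import math
import numpy as np
from scipy.spatial import ConvexHull

# Illustrative shape descriptors from 2-D outline coordinates; definitions
# are common conventions, assumed for this sketch.
def shape_descriptors(contour):
    pts = np.asarray(contour, dtype=float)
    closed = np.vstack([pts, pts[:1]])
    seg = np.diff(closed, axis=0)
    perimeter = float(np.sum(np.hypot(seg[:, 0], seg[:, 1])))
    # Shoelace formula for the enclosed area.
    x, y = pts[:, 0], pts[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    circularity = 4.0 * math.pi * area / perimeter ** 2   # 1.0 for a circle
    # In 2-D, ConvexHull.area is the hull's perimeter.
    convexity = ConvexHull(pts).area / perimeter          # 1.0 if no concavities
    # Elongation from principal-axis spread of the outline (0.0 for a circle).
    evals = np.sort(np.linalg.eigvalsh(np.cov(pts.T)))
    elongation = 1.0 - math.sqrt(evals[0] / evals[1])
    return circularity, convexity, elongation

# A 200-point circular outline scores ~1, ~1, ~0 on the three descriptors.
theta = np.linspace(0.0, 2.0 * math.pi, 200, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
print(shape_descriptors(circle))
```

An elongated or spiky particle would score lower on circularity and convexity, which is how agglomerates and foreign material stand out against regular primary particles.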
Spatially Resolved Chemical Analysis
Vibrational spectroscopy techniques such as near infrared (NIR) and Raman provide bulk or single-point analysis giving one averaged measure of composition for a whole tablet, granule, or selected portion. Chemical imaging, on the other hand, combines spectroscopic analysis with imaging technology. Instead of just a single spectrum, chemical imaging collects thousands simultaneously from an entire field of view, building an image of the distribution of chemical species — most usefully the API. It is fast and suitable for many formats including whole tablets, granules, and transdermal patches.
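Conceptually, a chemical image is a hypercube of spectra, one per pixel. One simple way to visualize species distribution (a generic sketch, not any vendor's algorithm) is to score each pixel spectrum against a reference spectrum of the API; the toy data below are invented:

```python
import numpy as np

# Sketch: map API distribution by correlating every pixel spectrum in a
# hypercube (rows x cols x wavelengths) with a reference API spectrum.
def api_distribution_map(hypercube, api_spectrum):
    """Return a per-pixel correlation map in [-1, 1]."""
    h, w, n = hypercube.shape
    flat = hypercube.reshape(-1, n)
    # Mean-center both, then take the normalized dot product (correlation).
    flat_c = flat - flat.mean(axis=1, keepdims=True)
    ref_c = api_spectrum - api_spectrum.mean()
    num = flat_c @ ref_c
    den = np.linalg.norm(flat_c, axis=1) * np.linalg.norm(ref_c)
    return (num / den).reshape(h, w)

# Toy example: a 2x2 image with 4 wavelength channels; one pixel carries a
# scaled copy of the (assumed) API spectrum and should score highest.
api = np.array([1.0, 3.0, 2.0, 0.5])
cube = np.zeros((2, 2, 4))
cube[0, 0] = api * 5.0                 # API-rich pixel (scaled copy)
cube[0, 1] = [2.0, 2.1, 1.9, 2.0]      # near-flat, excipient-like spectrum
cube[1, 0] = api[::-1]                 # mismatched spectrum
cube[1, 1] = [3.0, 1.0, 0.5, 2.0]
corr = api_distribution_map(cube, api)
```

Real instruments collect thousands of such pixel spectra simultaneously, but the principle of turning a spectral cube into a spatial map of one constituent is the same.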
Since distribution of the API can affect a formulation’s integrity, bioavailability, and pharmaceutical effectiveness, chemical imaging helps understand the relationship between structure and functionality. It can be used in process applications and for sensitive QC/counterfeit detection, and the ability to combine understanding at the structural level with the measurement of many samples raises the possibility of determining how process variables influence product structure and performance.
Chemical imaging is a laboratory-based tool, suited more to the development of understanding than to continuous process monitoring. However, it does have a role in knowledge-based selection of process control variables. Often, once size, shape, and chemistry correlations are understood, particle size alone can be used to monitor manufacture.
Each characterization technique has strengths that dictate its applicability at different points in the product cycle. However, the techniques are also synergistic.
The greater understanding that shape (image analysis) and species distribution (chemical imaging) data deliver justifies reliance on simpler, more easily measured parameters such as size for commercial QC and process monitoring. This is very much the essence of the QbD/PAT approach.
Such synergy is also evident during, for instance, analytical method development for particle sizing. Here the important questions are: Which state of dispersion would yield more relevant data in defining product quality (well-dispersed or agglomerated), and how should the state of dispersion be controlled to ensure robust measurements? The principles of QbD are now being applied to method development and method life cycle management, in order to generate data that deliver in-specification products. Image analysis alongside traditional methods, such as laser diffraction, provides a fast, efficient way to answer these questions, supporting the secure transfer of specifications and methods from one particle sizing instrument to another, from the laboratory into process.
Knowledge-led formulation, development, and manufacture, as well as continuous processing and real-time release, are demanding long-term goals. Analytical tools that work alone and together to provide relevant data within a suitable time frame are critical to commercial success.
About the Authors
Ulf Willén is divisional product manager for analytical imaging systems at Malvern Instruments, a portfolio that comprises both chemical and morphological imaging systems. Previously general manager of Malvern Nordic, he has been with the company for more than 20 years.
Dr. Paul Kippax is product manager for laser diffraction products at Malvern Instruments and has responsibility for the company’s Mastersizer and Spraytec product ranges. He joined the company in 1997, starting as an application scientist. Prior to this, he obtained a degree in chemistry and a Ph.D. in physical chemistry, both at the University of Nottingham in the United Kingdom.