Magazine Article | November 7, 2010

Improving Workflows In An Analytical Laboratory

Source: Life Science Leader

By Alan Horowitz

Recently, Trish Meek watched a process being run in a laboratory and marveled that a manual step was required between the laboratory information management system (LIMS) and the analytical lab. “The work is scheduled in the LIMS system, and then they manually enter the data from the samples that need to run rather than just importing that sequence directly from the scheduling system,” says Meek, director of product strategy for life sciences at Thermo Fisher Scientific.

Analytical laboratories face a number of workflow bottlenecks, and instrument integration is an important one. “I find it amazing how often instrument integration can still be a manual process,” says Meek. “Despite the fact that you’re acquiring the data in an electronic system and you are managing your workflow in an electronic system, people are still manually transferring the data to the other because they haven’t taken the time to do an integration project to connect those systems on the back end.”
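
To make the point concrete, here is a minimal sketch, in Python, of what such back-end integration might look like: a sample list exported from a LIMS is translated into an instrument-ready sequence file, removing the manual re-entry step. The file names, column headings, and field mapping are all assumptions made for illustration, not any vendor’s actual format.

```python
import csv

# Hypothetical sketch: turn a LIMS export of scheduled samples into an
# instrument sequence file, so no one has to re-key the sample list.
# File names and column headings below are invented for illustration.
def lims_to_sequence(lims_export_path, sequence_path):
    with open(lims_export_path, newline="") as src, \
         open(sequence_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=["Vial", "SampleID", "Method"])
        writer.writeheader()
        for row in reader:
            # Map the LIMS fields onto the fields the instrument expects.
            writer.writerow({
                "Vial": row["tray_position"],
                "SampleID": row["sample_id"],
                "Method": row["assay_method"],
            })

lims_to_sequence("lims_export.csv", "instrument_sequence.csv")
```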

Manual transfer not only takes time but also carries a considerable risk of error. Meek says that when someone manually enters results, an error rate of about 8% can be expected, which is why time-consuming and costly review steps, such as blind double entry, are necessary.
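
To see how blind double entry catches such errors, consider this small Python sketch: two independently keyed copies of the same results are compared, and only the disagreements are flagged for review. The sample IDs and values are invented.

```python
# Toy illustration of a blind double-entry check: two analysts key the
# same results independently, and only mismatches go back for review.
entry_a = {"S-001": 4.82, "S-002": 0.97, "S-003": 12.4}
entry_b = {"S-001": 4.82, "S-002": 0.79, "S-003": 12.4}  # digits transposed

mismatches = {
    sample: (entry_a[sample], entry_b[sample])
    for sample in entry_a
    if entry_a[sample] != entry_b[sample]
}
print(mismatches)  # {'S-002': (0.97, 0.79)} -> flagged for manual review
```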

Michael Blackburn, senior research investigator at sanofi-aventis, comments about lab bottlenecks: “Data checking is a time-consuming, manual process. However, it is vital.”

In fact, analytical labs face a number of bottlenecks, depending, of course, on what they are analyzing and whether they are in a regulated environment. Blackburn notes that in bioanalysis, sample receipt and storage is the biggest bottleneck, followed by sample preparation and analysis, then data processing, and finally data checking and reporting.

When a laboratory is regulated, good laboratory practice requirements relating to the quantification of drugs and metabolites in a biomatrix are themselves bottlenecks, notes Pat Bennett, Thermo Fisher’s director of global strategic marketing for pharma.

A succinct description of a laboratory workflow bottleneck comes from David Chiang, CEO of Sage-N Research: “The most significant bottleneck is data analysis, specifically, its accuracy and throughput.” Of particular interest to Chiang is computing power, or the lack of it. Researchers, he says, simply can’t get all of their data processed fully, either because they lack the needed computing resources or because their algorithms are designed to be “economical” rather than comprehensive. As a result, he notes, “Many scientists agree that they have valuable data locked up inside large datasets with no way to get the results out.”

Matthew Segall, CEO of Optibrium, cites another aspect of the problem. A major challenge facing labs, he says, is that the quantity and complexity of data are increasing, making it difficult to identify key rules that can be applied to new compounds to choose those most likely to succeed. The example he gives is toxicity data, which is coming earlier and earlier in the drug discovery process. The challenge is to identify early the compounds that will have toxicity issues downstream, without eliminating those that will succeed. “There’s wasted time and effort [going through so much data],” he says, “but the opportunity costs of throwing something out that would be a good drug are even larger.” The rigorous analysis of data is an area his company is working on.
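
One way to picture the tradeoff Segall describes is as a simple decision threshold applied to predicted toxicity. The toy Python sketch below, using invented scores and an arbitrary cutoff, shows how a strict early filter discards borderline compounds, some of which might have succeeded downstream.

```python
# Invented toxicity-risk scores (0 = benign, 1 = toxic) and cutoff;
# real discovery pipelines use far richer models and rules.
compounds = {"CPD-101": 0.92, "CPD-102": 0.71, "CPD-103": 0.35, "CPD-104": 0.68}
THRESHOLD = 0.70

advanced = [c for c, risk in compounds.items() if risk < THRESHOLD]
rejected = [c for c, risk in compounds.items() if risk >= THRESHOLD]

# A borderline compound such as CPD-102 is discarded here. If it would
# have become a good drug, that opportunity cost outweighs the effort
# saved -- which is Segall's argument for rigorous analysis of the data.
print("advance:", advanced)  # ['CPD-103', 'CPD-104']
print("reject:", rejected)   # ['CPD-101', 'CPD-102']
```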

Overcoming Bottlenecks: Management
Technical and management issues both come into play when trying to overcome bottlenecks. Chiang says that management has been “slow to recognize the need for significant software and hardware resources to fully support their multimillion-dollar investments in instruments like mass spectrometers.”

Blackburn thinks management often does not take the time to study various options that would be the best for a particular situation. His solution: “Sometimes doing what everyone else does isn’t the best option. Pick the best solution for your lab, your systems, and your people.”

Bennett emphasizes the importance of management ensuring that a lab’s scientists remain current on trends, technologies, and regulations, and that they are involved in the implementation of new technologies.

Management might also be a bit behind the times in that it doesn’t quite accept that biology is becoming an “information science.” This creates bottlenecks because traditional IT is centered around expensive hardware running less expensive software programs, or as Chiang notes, “scientists are asked to find software that happens to run on their current computing hardware as opposed to finding the right computing hardware that can best run the most effective software.”

To minimize this bottleneck, management should encourage scientists to find the best software and then support their need to obtain the hardware required to adequately run this software.

Overcoming Bottlenecks: Technical
Technical bottlenecks also need to be addressed if an analytical laboratory is to operate optimally. Thermo Fisher’s Bennett recommends a mass spectrometry approach that eliminates the need to tune for each compound. Rather than tuning the instrument for every compound, the lab acquires all the available data and then searches it for what is related to the compound in question, allowing high throughput.
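
In rough outline, that post-acquisition search might look like the Python sketch below: acquire full-scan data once, then pull out the signals whose mass-to-charge ratio falls within a tolerance of the compound of interest. The data points and the 10 ppm tolerance are illustrative assumptions, not a description of any particular instrument’s software.

```python
# Simplified sketch: instead of tuning per compound, search full-scan
# data after acquisition for signals related to a target compound.
scan_data = [  # (retention time in minutes, m/z, intensity)
    (1.02, 285.0764, 1.2e5),
    (1.03, 285.0791, 9.8e4),
    (2.41, 310.1550, 4.4e5),
]

def extract_ion(scan_data, target_mz, tol_ppm=10.0):
    """Return the points whose m/z lies within tol_ppm of target_mz."""
    tol = target_mz * tol_ppm / 1e6
    return [pt for pt in scan_data if abs(pt[1] - target_mz) <= tol]

# Both 285.07x points match the target; the 310.155 signal does not.
print(extract_ion(scan_data, 285.0789))
```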

Blackburn is involved with a sanofi-aventis lab he calls “a classical big pharma ADME [absorption, distribution, metabolism, and excretion] department.” The lab has moved toward multiplexing. Says Blackburn, “This gives us the ability to get the most work out of our valuable triple quads so, for example, we can double up and run stability plates and real samples on the same system simultaneously.”

The lab is using a variety of Thermo Fisher hardware and software, including Watson LIMS, Aria, LCQuan quantitative software, Xcalibur, TSQ Quantum Ultra, and TSQ Vantage triple stage quadrupole mass spectrometers. Also used are products from Waters, including NuGenesis data storage, MassLynx, QuanLynx, and Quattro triple quads. In addition to multiplexing, this setup provides better specificity due to high resolution and greater sensitivity that enables the use of different sampling techniques, such as dried blood spots, says Blackburn.

Meek, in the example cited above, recommends taking the time and effort to integrate instruments. The investment pays off over the longer term by eliminating manual steps that consume time, create risk, and raise costs.

Best Practices
“The best practice is, first, to start with a robust working workflow, before proceeding to optimize it,” notes Chiang. He generally recommends modeling workflows on those of a respected lab known for being effective: learn from what has worked before. He warns, “The most common approach is to start defining ‘the ideal workflow’ from scratch, without being experienced enough to understand the ‘nice’ versus mission-critical tradeoffs that are necessarily part of the dynamic life sciences data analysis fields.”

Another view comes from Meek, who recommends establishing best practices by first understanding the lab’s physical workflow, deciding how to improve that, and then modeling the technical solution to fit those needs. This is better, she suggests, than buying a technical solution and then having everyone fit their practices to the solution’s parameters. “If you just say this laboratory does things this way and everyone else should do it that way, then you always have adoption issues, and you really don’t get the enhanced performance out of the laboratory that you are hoping for,” she says.

Sanofi-aventis’ Blackburn says, for best practices, pick good analysts and train them well; use electronic data storage to cut down on paperwork and archiving; and take the time to develop high-quality, specific methods. With equipment, he says, preventive maintenance is a vital best practice. For heavily used systems, “don’t wait for problems to happen. They take twice as long to fix [as preventive maintenance does].”

There is no simple solution for fixing workflow issues in an analytical lab. But smart use of technology, a willingness to invest in analytical and management software and the necessary hardware, and management’s interest in training and motivating personnel will go a long way toward unclogging those areas of the lab where things are backing up.