This Special Advertising Section is brought to you by AAAS OPMS

Laboratory Technology Trends: Advances in Lab Instrumentation



Sections

Strategies for Survival

Fluid Mechanics and Cellular Pathways

From Test Tubes to Plates

Speeding Up Screening

Inside Live Cells

New Technologies to Read Plates

From Automation to Robotics

Weblinks


by Peter Gwynne and Gary Heebner

As the pace of life science R&D accelerates, laboratories face heavy demands to produce timely, accurate results at reduced costs. The solution: automation of lab instruments.

About 30 years ago, handheld calculators began to find their way into the hands of college students and laboratory researchers. The more technically minded individuals promptly discarded their slide rules and turned to the new inventions that promised to process more and more complex data sets.

The era of the handheld calculator lasted about a decade. Then the desktop personal computer arrived and quickly overshadowed calculators. Since then the computer industry has created a succession of smaller, faster, and smarter versions of the PC, with each new generation changing the lives of its users a little more. And just as the handheld calculator quickly found a place in the lab, personal computing in its various forms has become a staple of bench research. Indeed, it has exerted a profound impact on the productivity of individual scientists and research teams.

It has certainly come to the aid of today's laboratory manager, who faces constant pressure to produce and deal with rapidly growing amounts of data. "We're seeing a dramatic increase in the compound file," says David Giegel, director of High Throughput Screening (HTS) Technology at Pfizer's Ann Arbor Laboratories. "Whereas in the 1980s we were talking about maybe tens of thousands of compounds in a library, now we're talking about millions."

The increase in compounds to be screened, and the resulting pressure on lab managers, stems largely from three factors: the faster pace of R&D for new products, the need for better quality control in industrial settings, and tightening cost controls and government regulations in the clinical setting. Together, these factors have created numerical targets for screening undreamed of in past decades.

Those factors, in turn, stem from a fundamental need of the modern pharmaceutical industry. "We have found that the pharmaceutical companies have a problem maintaining their P/E ratios," explains Ed Alderman, director of pharmaceutical research sciences at Zymark Corporation. "They have to turn over five to 10 big chemical entities per year, not the 0.5 they do now. Senior management wants to quadruple throughput while reducing costs by 30 percent."

Strategies for Survival

To survive in the increasingly competitive environment, laboratories must produce results in a timely fashion with fewer errors and at lower costs. They must also find ways to utilize highly skilled laboratory personnel better by eliminating the redundant, low value tasks that they perform. "We're very much driven by the need for throughput," says Stephen Oldfield, director of marketing for Molecular Devices Corporation. "People are wanting to reduce costs. High throughput and low cost are what we need."

Automation offers a solution to many of these requirements. Its benefits include increased productivity via the ability to process larger numbers of samples per unit of time, a reduction in human errors caused by repetition and sheer boredom, and lower labor costs, which are a major component in testing procedures. "The field is technology-driven," says David Litman, senior vice president of R&D for BD Biosciences. "There's no other way to get the information out at a high enough throughput with enough parameters." The results of automated systems, Litman adds, give bench scientists the chance to ask a lot more questions.

Automation definitely has the potential to pay for itself quickly. "Three years ago 10,000 wells screened per day were fine," says Seth Cohen, director of Millennium Pharmaceuticals' HTS Department. "Now it's tens of millions of wells per year. So if you can get the cost down to a penny a well from a dollar per well you're saving enough to pay for a new, more sensitive detector the first time you use it."
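The arithmetic behind that claim is simple enough to sketch. The short Python fragment below uses the figures Cohen cites; the exact well count is an assumption chosen only to illustrate the scale of the savings.

```python
# Back-of-the-envelope savings using the figures quoted above ("tens of
# millions of wells per year", $1.00 vs. $0.01 per well). The well count is
# an assumption for illustration, not vendor data.
wells_per_year = 20_000_000
cost_old = 1.00    # dollars per well, conventional assay
cost_new = 0.01    # dollars per well, miniaturized assay

annual_savings = wells_per_year * (cost_old - cost_new)
print(f"Annual reagent savings: ${annual_savings:,.0f}")
# -> Annual reagent savings: $19,800,000
```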

To cut costs and improve the productivity of bench science, vendors of laboratory equipment are streamlining their instruments in several ways. Scientists have only just gotten used to 384-well plates, which have roughly halved reagent costs and assay times compared with the more familiar 96-well plates. Now they face the prospect of using 1,536-well plates. "We could do 50,000 to 70,000 data points per day with 96-well plates," says Giegel. "With 1,536 wells we could do upward of half a million data points per day, with a concomitant decrease in costs."
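To see how plate density drives those throughput figures, the sketch below scales data points per day with wells per plate. The assumed plates-per-day figure is chosen only so that the 96-well case matches the numbers Giegel quotes.

```python
# Rough throughput scaling with plate density, assuming a constant number of
# plates processed per day. The 625 plates/day figure is an assumption chosen
# to match the quoted 50,000-70,000 data points/day for 96-well plates.
plates_per_day = 625

for wells_per_plate in (96, 384, 1536):
    data_points = plates_per_day * wells_per_plate
    print(f"{wells_per_plate:>5}-well plates: {data_points:,} data points/day")
# 96-well: 60,000; 384-well: 240,000; 1,536-well: 960,000
```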

Fluid Mechanics and Cellular Pathways

To make the large well numbers more attractive, efforts are also under way to reduce the amount of reagent necessary to produce each data point. "The research community has always wanted to work with many of their solutions in the submicroliter range," explains Sven Bülow, director of product management at German company Eppendorf AG. That will require dramatic advances in the physics of liquids. "We're dealing with different fluid mechanics," says Giegel. "Shaking won't mix liquids at the 2- to 10-microliter scale, but you have too large a volume for diffusion to work. So you have the worst of both worlds."

Another area that demands new approaches is understanding the multitude of molecular interactions inside cells. "Clearly live cell analysis is one of the major areas of focus in bioscience," says Litman. "We think that one of the next frontiers, important for all biomedicine, will involve how all the signaling pathways operate at the cellular level. We're developing a powerful toolset, including imaging, laser scanning, and flow cytometry to get at those answers." Those technologies and others promise to unscramble some of the mysteries of cellular interiors. "It used to be like the Heisenberg principle: You could precisely identify an individual cell, or describe the average biology of a population of cells, but you couldn't accurately tell what was going on in a specific cell," Litman continues. "Now the technology is leading to the point at which you can hope to know what's happening in each cell at a molecular and functional level."

The exponentially increasing numbers of data points put extra pressure on bench scientists. To permit them to use their time in the lab for interpreting rather than preparing their assays, instrument makers are taking the next obvious step beyond automation. "We make our assays compatible with robotics systems," says Oldfield. "We have to think about how we work with robotics."

From Test Tubes to Plates

Most laboratory procedures can be broken down into three steps: sample preparation, sample analysis, and data management. Once computers arrived, the accuracy of testing and the ability to analyze complex sets of data soared. With data management dramatically improved, the work of preparing samples and running them through experimental protocols became the bottleneck in advancing productivity. In the 1980s the advent of automated systems brought together different hardware devices to accomplish specific laboratory tasks. Decreasing the time required for sample preparation was the next major step in improving lab productivity and quality.

The majority of the early progress in laboratory automation took place in liquid handling. As the sample volumes needed for a reaction fell from several milliliters to microliters, experiments moved from test tubes to microtiter plates. The plates possess several attributes that have made them the gold standard for handling many small-volume samples: the precise, predictable positioning of their wells permits the use of multichannel pipettes to dispense liquids into them. But while the plates offered a major improvement in productivity, they did not solve every problem. Lab technicians continued to make mistakes and to forget which wells of a plate had received a chemical reagent, and pipetting into row after row of 96-well microtiter plates remained a tedious, boring job.

The next step in liquid handling was therefore the development and introduction of automated liquid handling stations. Originally considered luxury items, these instruments soon became common tools for researchers working with large numbers of samples. They allowed laboratory personnel to prepare microtiter plates with little operator intervention and greatly improved accuracy. However, the instruments still had to be fed empty plates by hand, and in most cases the filled plates had to be removed by hand as well.

Today, several companies, including Packard, Robbins Scientific, and Zymark, offer automated liquid handling systems with software to program the steps involved in adding reagents to each plate. Technicians enter filling instructions into the instruments, often using user-friendly Windows-based software programs. These programs also allow the process to be monitored and documented for auditing purposes.
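Conceptually, such a filling program is just an ordered list of dispense steps. The sketch below is a generic, hypothetical representation in Python, not any vendor's actual software; every name in it is invented for illustration.

```python
# Hypothetical plate-filling protocol expressed as plain data. The step and
# field names are invented for illustration; they are not any vendor's API.
all_wells = [f"{row}{col}" for row in "ABCDEFGH" for col in range(1, 13)]  # 96 wells

protocol = [
    {"reagent": "assay buffer",  "wells": all_wells, "volume_ul": 50.0},
    {"reagent": "substrate",     "wells": all_wells, "volume_ul": 10.0},
    # Leave column 12 compound-free as a control column.
    {"reagent": "test compound",
     "wells": [w for w in all_wells if not w.endswith("12")], "volume_ul": 5.0},
]

# A simple log stands in for the monitoring and audit trail described above.
for step in protocol:
    print(f"Dispense {step['volume_ul']} uL of {step['reagent']} "
          f"into {len(step['wells'])} wells")
```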

Eppendorf is taking the process a stage further. "We're working on an enzyme dispenser that will have the same handling characteristics that people are familiar with in their everyday microliter pipettes," says Bülow. "It will accommodate small cartridges that can contain restriction enzymes or a very expensive, invaluable, or just isolated protein. It can then dispense the liquid in volumes between 10 nanoliters and one microliter with high precision."

Speeding Up Screening

During the past 10 years, combinatorial chemistry and high throughput screening have profoundly changed researchers' ability to produce and evaluate large collections of chemical compounds for any number of purposes. These technologies have cut both the time and the amount of material needed to produce and screen compounds. Combinatorial chemists can produce a large number of compounds with varying but related structures, which can then be screened for biological activity.

The pharmaceutical industry benefits most obviously from combinatorial chemistry. In the past, a medicinal chemist at a pharmaceutical company might have produced several new synthetic compounds each week. Today the same chemist, using combinatorial techniques, can produce hundreds of related compounds in a single day. Pharmaceutical companies are now building libraries numbering in the hundreds of thousands of compounds and beyond, and high throughput screening operations are analyzing up to hundreds of thousands of samples in a single day. The result is a need to process more compounds than ever before.
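Those library sizes follow directly from the combinatorics: the number of products is the product of the building-block counts at each position. A minimal illustration in Python, with made-up substituent labels:

```python
from itertools import islice, product

# Hypothetical building blocks varied at three positions on one scaffold.
r1 = [f"R1-{i}" for i in range(40)]   # 40 substituents at position 1
r2 = [f"R2-{i}" for i in range(50)]   # 50 substituents at position 2
r3 = [f"R3-{i}" for i in range(60)]   # 60 substituents at position 3

print(f"Virtual library size: {len(r1) * len(r2) * len(r3):,} compounds")  # 120,000
for combo in islice(product(r1, r2, r3), 3):  # enumerate the first three members
    print(combo)
```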

The microtiter plates that handle these compounds have already changed from 96-well to 384-well versions. Recently ultrahigh throughput screening (UHTS) instruments have started to reach the laboratory. Based on a 1,536-well microtiter plate format, these instruments possess robust liquid handling, specimen reading and data analysis capabilities. "We have a push for 1,536 wells because we've got a lot more targets coming our way," says Giegel. "We haven't even started to mine the data from the Human Genome Project yet."

The UHTS instruments permit researchers to process over 100,000 samples per day. They use sample volumes of as little as 5 to 10 microliters rather than the 100 microliters required for conventional HTS. That means smaller amounts of reagents, a factor that can encourage scientists to screen more expensive or more precious substances.

Designing equipment that can handle such tiny volumes of liquid poses difficult problems for manufacturers. The next step offers even tougher challenges. "The research community has always wanted to work with solutions in the submicroliter range," says Bülow. "But it's not possible to get accurate delivery with conventional air-displacement pipettes. If you try to deliver 0.1 microliters you might end up delivering twice as much liquid. That has very strongly inhibited the adoption of smaller dimensions in PCR assays and other areas. Scientists have to use wastefully high volumes to assure that they get in the plus or minus 10 percent range of the amounts they need."
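Bülow's point can be put in numbers. If a pipette's absolute dispensing error stays roughly constant as the set volume shrinks, the relative error balloons at submicroliter volumes; the 0.1-microliter absolute error assumed in the sketch below is chosen purely for illustration, not a measured specification.

```python
# Relative error when the absolute dispensing error is roughly fixed.
# The 0.1 uL absolute error is an assumption for illustration only.
absolute_error_ul = 0.1

for set_volume_ul in (10.0, 1.0, 0.5, 0.1):
    relative_error_pct = absolute_error_ul / set_volume_ul * 100
    print(f"Set {set_volume_ul:>4} uL -> up to +/-{relative_error_pct:.0f}% delivered")
# 10 uL -> 1%, 1 uL -> 10%, 0.5 uL -> 20%, 0.1 uL -> 100% (i.e. up to double)
```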

Handling submicroliter volumes of fluids demands expertise beyond that of traditional life scientists. "We have chemical engineers and consultants from engineering departments who are experts in fluid dynamics," says Giegel. Eppendorf, meanwhile, has developed a prototype that shows promise in the lab. "We have done lots of experiments with restriction enzymes, polymerases, and very fragile enzymes," Bülow reports. "We've shown it's possible to use tiny volumes of these enzymes without losing activity and to handle liquids spanning a wide range of viscosities without any need for recalibration."

Inside Live Cells

In addition to assaying minuscule volumes of compounds, researchers want to examine events inside cells as they occur. The technology for doing so has improved since the early 1970s, when Becton Dickinson introduced the first commercial flow cytometer, the FACS-1. A few years later Ortho Diagnostics followed suit with its version of this powerful cell analysis instrument. The flow cytometer enabled researchers to characterize populations of cells based on inherent cellular properties or fluorescent labels used to tag certain markers in or on the surface of specific cells.

Coulter Electronics then introduced a flow cytometer with multiparameter sorting capability. These instruments quickly became much smarter, allowing researchers to measure more parameters in a single experimental run. By the mid-1990s flow cytometers had advanced from sorting several thousand cells each second to over 50,000 cells per second.

Researchers use several other cell-based assay formats to optimize leads and to predict possible toxic effects on target cells. These formats include viral titer assays and fluorescent signal assays. Cellomics, Universal Imaging, and others have developed instruments and software for high content screening, which allows automated measurement of such complex cellular activities as morphological changes, apoptosis, adhesion, and protein trafficking in the living cell. These measurements add another layer of screening to identify relevant information about the effects of compounds on cells. Coupled with software for image acquisition, analysis, data review, and data reporting, these instruments bring screening to another level.

These automated systems allow scientists to analyze molecular interactions within living cells. Such "whole cell" systems have several advantages over in vitro compound-target measurements. Scientists can evaluate molecular interactions in the context of the actual intracellular environment. They can evaluate drug penetration in whole cell studies. And whole cell assays eliminate the need for protein purification and expression steps.
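At its simplest, the high content screening described above reduces each microscope image to per-cell measurements. The toy Python sketch below (threshold, label connected objects, measure each one, using NumPy and SciPy) illustrates that idea only; real instrument software is far more sophisticated and is not reproduced here.

```python
import numpy as np
from scipy import ndimage

def per_cell_mean_intensity(image, threshold):
    """Toy per-object analysis: threshold, label connected regions, measure each."""
    mask = image > threshold
    labels, n_objects = ndimage.label(mask)
    means = ndimage.mean(image, labels=labels, index=list(range(1, n_objects + 1)))
    return n_objects, means

# Synthetic two-"cell" image, just to show the call pattern.
img = np.zeros((64, 64))
img[10:20, 10:20] = 5.0
img[40:50, 40:55] = 8.0
n, intensities = per_cell_mean_intensity(img, threshold=1.0)
print(n, intensities)   # 2 objects with mean intensities of about 5.0 and 8.0
```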

New Technologies to Read Plates

The success of live cell analysis techniques depends in large part on the ability to read multiwell plates accurately, particularly the 1,536-well versions. "We're using imaging technologies a lot more. We're moving to cameras, charge-coupled device (CCD) chips, and laser-based microscopy," says Cohen. Millennium uses North Star, a CCD system from Applied Biosystems that looks for luminescent reactions on the plate, and FMAT, a laser scanning method developed by Applied Biosystems and BD Biosciences. "Because it's laser scanning there's no wash involved," points out Litman. "You can do assays on particles and cells, and it's the only method of doing high-throughput apoptosis assays."

Molecular Devices has improved FLIPR, the kinetic imaging plate reader introduced in 1997 that allowed live cell assays to be used in primary screening. "We have introduced chemistry to enable a very rapid no-wash procedure," says Oldfield. "Our FLIPR calcium assay kit reduces labor costs as there's much less hands-on time."

The company now plans to expand the FLIPR technology. "We have liveware products with chimeric G-proteins so that you can screen for receptors that would normally be signaling through cyclic AMP," Oldfield says. "And our next kit, coming out this month, is designed to assay ion channels with a throughput as high as that of calcium screening."

Another series of instruments from Molecular Devices, CLIPR, relies on chemiluminescence rather than fluorescence. "We use a very sensitive CCD camera that gives very flat field imaging," says Oldfield. "The lensing system from Affymax permits you to see everything on the plate without distortion."

Physics principles continue to play a prominent role in the instruments to examine live cells. "In our cytometry division there's a lot of applied research in areas of solid-state lasers and fluidics," says Litman. "In engineering we have a lot of focus on high-speed digital processing. We're developing technology in-house and in collaborations to identify specific cytotoxic lymphocytes."

From Automation to Robotics

Automation systems are usually a combination of hardware devices interfaced together to perform specific laboratory procedures, such as filling microtiter plates with a substrate or other reagent. These can be very simple, dedicated workstations that perform a task or a group of tasks such as sample dilution, filtration, or the addition of reagent. The functions are fixed and often specific to a particular task or experiment.

In contrast, laboratory robotics systems can usually perform many functions and can be designed and programmed to meet specific laboratory needs. They enable computers to do physical work in addition to processing data. These systems are very flexible in that they can be redesigned and reprogrammed to meet the changing needs of a laboratory environment.

Zymark, an industry leader in laboratory robotics, uses a modular system of robotics, computers, and hardware that allows for the expansion of capabilities without obsolescence. Its systems can be configured to use existing balances, titrators, or other instruments found in a laboratory. "Our new SciClone™ liquid handler has an x-y-z gantry head and a 15-position deck," says Alderman. "It can have either fixed tip or disposable cannulas. Attached to the head are up to four dispensers of bulk reagents. We can also put a stacker unit behind the head. This provides multitasking capabilities and increases storage capacities so that people can put these things on and walk away."

Eppendorf has recently devised a robotics system that combines its own DNA purification chemistry with a Zymark platform. "It has a throughput of 50 plates in 12 hours and 100 in a 24-hour run," says Bülow. "As far as we know it's the only system with full walkaway capacity."

The growing ability of robotics systems to generate data puts pressure on data management. "Someone has to translate the data into usable information and knowledge," says Alderman. "We're working with various third-party companies to develop expert systems. We're clearly hardware-based, but we're becoming more and more a software company."

Automation and robotics in laboratory instruments will continue to evolve in response to the needs of customers. The tools will become even faster, smaller, and smarter, enabling researchers to gain new insights into the process of drug development and ultimately the treatment of disease. Pharmaceutical and biotechnology companies can't afford to stop their efforts to improve. "It's a fantastic field as it's driven by biology, chemistry, robotics, and informatics," says Millennium Pharmaceuticals' Cohen. "You can improve by going in any of those four directions, but if you stand still, two years from now you'll be far behind."

WEBLINKS

Advertisers

Labsystems www.labsystems.fi

Personal Chemistry www.personalchemistry.com

Featured Companies and Organizations

Applied Biosystems www.appliedbiosystems.com

Beckman Coulter www.beckmancoulter.com

BD Biosciences www.bdbiosciences.com

Cellomics, Inc. www.cellomics.com

Eppendorf Scientific, Inc. www.eppendorf.com

Millennium Pharmaceuticals, Inc. www.mlnm.com

Molecular Devices www.moldev.com

Ortho Diagnostics Systems, Inc. www.informagen.com

Packard Instrument Company www.packardbioscience.com

Pfizer www.pfizer.com

Robbins Scientific Corporation www.robsci.com

Universal Imaging www.image1.com

Zymark Corporation www.zymark.com

Note: Readers can find out more about the companies and organizations listed by accessing their sites on the World Wide Web (WWW). If the listed organization does not have a site on the WWW or if it is under construction, we have substituted its main telephone number. Every effort has been made to ensure the accuracy of this information. The companies and organizations in this article were selected at random. Their inclusion in this article does not indicate endorsement by either AAAS or Science nor is it meant to imply that their products or services are superior to those of other companies.


Peter Gwynne is a freelance writer based on Cape Cod, Massachusetts, U.S.A. Gary Heebner is president of Cell Associates, a scientific marketing firm in Chesterfield, Missouri, U.S.A.


This article was published as a special advertising supplement in the 29 September issue of Science.