The authors consider several common techniques for verifying the accuracy of liquid-handling equipment and offer guidance for finding the appropriate technique for a given instrument.
High-quality, precision liquid-handling instruments have tended to give scientists throughout the drug-discovery, testing, and production processes a sense of confidence in their data. However, the large amount of resources dedicated to drug development, the long US Food and Drug Administration approval process, and the numerous recalls and legal actions plaguing well-known drug companies suggest that more attention should be paid to quality assurance. In particular, liquid-handling processes—the core of pharmaceutical laboratory operations—demand the application of robust, rigorous, science-based methods and tools to ensure data quality.
Table I: Strengths and weaknesses of verification methods.
In life-science laboratories, which commonly include technologically advanced instruments, scientists often have various tools to complete everyday tasks such as liquid-handling quality assurance. Several options are available to laboratories for calibrating liquid-handling instrumentation and measuring the efficacy of liquid-handling processes. Each option has its own applications, benefits, and drawbacks. The optimal technology for a laboratory application depends on factors such as the volume of liquids to be quantified, the type of instrumentation used, and the applicable regulatory and quality standards. Also to be considered are the laboratory environment, tolerance for risk, required calibration frequency, and the demands of the laboratory's processes.
This article will compare gravimetry, fluorometry, single-dye photometry, and ratiometric photometry (common means for verifying liquid-handling instrumentation) and provide data and guidance regarding best applications of each.
Laboratories have traditionally relied on gravimetry to measure the performance of liquid-handling devices. This method uses a balance to weigh liquid volumes. The balance reports a weight, and that weight is converted to mass and then to volume using conversion factors that may be found in tables, calculated from formulas, or produced by software packages.
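The weight-to-mass-to-volume conversion described above is often folded into a single multiplier, commonly called a Z correction factor, that accounts for liquid density and air buoyancy. A minimal sketch, using an illustrative Z value for water near room temperature (real work should use the tabulated value for the actual temperature and barometric pressure):

```python
# Sketch: converting a balance reading to volume with a Z correction
# factor (uL/mg), which folds liquid density and air buoyancy into one
# multiplier. The Z value below is illustrative, not a reference value.

def weight_to_volume_ul(balance_reading_mg: float, z_factor_ul_per_mg: float) -> float:
    """Convert a balance reading in milligrams to a volume in microliters."""
    return balance_reading_mg * z_factor_ul_per_mg

Z_WATER = 1.0032  # uL/mg, approximate value for water near 21.5 C, 101 kPa
print(weight_to_volume_ul(996.8, Z_WATER))  # ~1000 uL
```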
Gravimetry has several advantages, including the wide availability of weighing devices in most laboratories. In addition, gravimetry is a well-accepted technology. It is recognized by national and international regulatory agencies, including the International Organization for Standardization (ISO), the College of American Pathologists, and ASTM International. Published standard methods of gravimetry include ASTM E1154 and ISO 8655-6 (1, 2). Gravimetric calibration can also be traced to national standards, thus facilitating regulatory compliance and standardization.
Gravimetry is frequently the method of choice for measuring device performance when handling larger volumes. For example, a 1000-μL aliquot weighs approximately 1 g and can be weighed reliably on a modern laboratory analytical balance. The current trend in laboratories toward handling small liquid volumes with automated devices illustrates one major drawback of this method: as volumes decrease, weighing becomes more challenging for several reasons.
First, measuring small liquid volumes requires specialized balances that produce measurement results to five or six decimal places on the gram scale. Such balances are delicate, require a stable platform to limit vibration, and are not as portable as the less-sensitive models used for measuring large liquid volumes. These requirements often make microgram balances unsuitable for use on the deck of automated liquid handlers. Illustrating the need for sensitivity, ISO 8655-6 requires that volumes of 10 μL or less be measured on a six-place (microgram) balance (2).
Because microgram balances take time to settle, gravimetric calibration can also be time-consuming. In addition, gravimetry is affected by various environmental conditions, including evaporation and static electricity. As volumes become smaller, these error sources become more significant.
For example, modern dispensing equipment can deliver volumes so small that they can evaporate in seconds. Obtaining adequate resolution for small volumes requires a highly sensitive balance with complicated evaporation traps, static eliminators, and vibration dampeners. Other methods for controlling for evaporation can be complicated. One method is to measure the evaporation rate and correct for the resulting volume variation. Alternatively, the humidity in the room can be increased or a draft shield built to prevent air from flowing over the testing area. These steps add time and complexity to the measurement process.
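The evaporation-rate correction mentioned above amounts to estimating the mass lost between dispense and reading and adding it back. A hypothetical sketch (the rate and timing values are made up for illustration):

```python
# Sketch: correcting a gravimetric reading for evaporation. The evaporation
# rate would be estimated by watching the balance drift before the dispense;
# the mass lost during the settling delay is then added back. All numbers
# here are illustrative.

def evaporation_corrected_mass(observed_mg: float,
                               evap_rate_mg_per_s: float,
                               elapsed_s: float) -> float:
    """Add back the mass estimated to have evaporated since the dispense."""
    return observed_mg + evap_rate_mg_per_s * elapsed_s

# A nominal 10 uL (~10 mg) water aliquot read 8 s after dispensing,
# with a measured evaporation rate of 0.02 mg/s:
print(evaporation_corrected_mass(9.84, 0.02, 8.0))  # ~10 mg
```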
Electrostatic effects also cause uncertainty with gravimetric methods because plastic pipette tips are typically used to transfer liquids. The static electricity that is imparted to the balance pan or the draft shield induces a force that affects measurement accuracy. When working with small volumes, the error can be significant. Vibration must also be controlled, often by performing the calibration in a controlled environment on a solid marble bench.
Because gravimetric measurements calculate volume by converting weight to mass and then to volume, accurate calibration is contingent on knowing the density of the fluid being pipetted. Many laboratory technicians assume the fluid being measured has a density of 1 g/mL, which is the approximate density of water. Although common solutions do have published density values, the densities are not always known to a high degree of accuracy.
To illustrate the possible uncertainty, consider dimethyl sulfoxide (DMSO). Its published density is 1.1 g/mL. The density is published with limited resolution, using only two significant figures. In addition, the density of DMSO changes according to its water content, which depends on the starting water content and the time it is exposed to ambient local temperature and relative humidity. Even the density of water varies with temperature and, at room temperature, is always less than 1 g/mL, its commonly accepted value.
These details must be accounted for if precise measurements are required. Consider a device with accuracy specifications of better than 0.6%, which is a typical specification for high-accuracy pipetting of 1000 μL. Failure to correct for density errors, even when pipetting water, can lead to error in the 0.3–0.5% range, which is nearly as large as the acceptable error for the entire piece of equipment. When acceptable tolerances are in the 5–10% range, however, density considerations are much less important.
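The magnitude of this density effect is easy to work through. A sketch of the arithmetic, using the density of water at 25 °C and an illustrative air-buoyancy term (both effects are neglected by the naive 1 g/mL assumption):

```python
# Sketch of the systematic error from assuming 1 g/mL: the true volume of
# 1 g of water at 25 C (density 0.99705 g/mL) versus the naive conversion.
# The buoyancy term is an illustrative ~0.11%; together the two effects
# land in the 0.3-0.5% range cited in the text.

true_density = 0.99705        # g/mL, water at 25 C
buoyancy_correction = 0.0011  # illustrative air-buoyancy effect (~0.11%)

naive_volume = 1.0 / 1.0                                 # assume 1 g/mL
true_volume = (1.0 / true_density) * (1 + buoyancy_correction)

error_pct = 100 * (true_volume - naive_volume) / true_volume
print(f"{error_pct:.2f}%")  # ~0.4%
```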
An alternative to referring to published density values is to measure density with a commercial densitometer or pycnometer. For reliable results, these instruments require calibration as other laboratory instruments do. Care must be taken to avoid measurement error.
One last drawback of gravimetry is its inability to simultaneously measure each individual channel in multichannel liquid-handling devices. With gravimetry, individual aliquots can be measured or multiple dispenses may be made and the total weight used to calculate the average volume. To measure the performance of single channels, each channel must be tested one tip at a time. This process is time-consuming and tedious. Testing each channel one time in a 96-channel device, for example, would require 96 dispenses.
In summary, gravimetric calibration is best suited to measuring the performance of single-channel devices that handle large liquid volumes, usually those above 200–1000 μL. The precise lower limit for effective use of gravimetry depends on the tightness of the tolerance to be met and the quality of the measuring equipment and procedure used.
During fluorometric calibration, a beam of ultraviolet light is shone on a sample at one wavelength, called the excitation wavelength. This exposure causes the molecules to absorb light and enter an excited electronic state. Release of this excess energy results in the emission of light at a different, longer wavelength, called the emission wavelength. A detector is used to measure how much light is emitted at the emission wavelength. Precision is measured by comparing relative fluorescence levels in different samples.
Fluorescent dyes are photoactive and generate strong signals even at low concentrations, which makes fluorometry well suited to measuring small volumes. Measurements of volumes as small as 5 nL are possible.
A major drawback of fluorometry is the difficulty in achieving robust traceability, which often prevents its use in regulated laboratories. This deficiency results from the fact that the strength of the fluorescent measuring signal varies depending on the local chemical environment. Factors such as solvent composition, pH, ionic strength, redox potential, and time can alter the signal strength. For this reason, during a given measurement, the volume in a well can be compared with a volume in the previous well, provided that all volumes have similar chemical compositions. It is difficult, however, to compare measurement readings day-to-day, assay-to-assay, or location-to-location unless traceability is established. Traceability is typically established by developing a standard response curve using a calibrated pipette or other traceable liquid-delivery device.
The accuracy and traceability of this standardization depend on many factors, and at small volumes (for which fluorometry is most often used), standardization can be difficult. For this reason, fluorometry is most often used to determine precision only, and not accuracy, leaving the user to estimate how close the actual dispense is to the desired volume. Work is currently in progress to develop better traceability for fluorometric calibration methods.
Fluorescence methods are also affected by quenching and photobleaching. Fluorescent dyes can chemically degrade over time and are sensitive to temperature and pH. Some dyes are buffered, meaning they contain chemicals to prevent the pH from changing. Unbuffered dyes, however, suffer from pH shifts as the dyes absorb carbon dioxide from the air and become acidic. This pH shift can affect the accuracy of the measurement reading. Because the properties of fluorescent dyes can shift in the course of hours, standard curves should only be relied on for short periods of time. In addition, no fluorometric calibration technologies are commercially available, although some methods have been published in the scientific literature (3, 4).
In summary, fluorescent calibration is best suited for demonstrating precision in nearly identical conditions when testing small liquid volumes and when accuracy and traceable measurements are not required.
Photometric calibration requires a photometer and stable dyes that absorb light in the visible or ultraviolet range. To use single-dye absorbance photometry to measure volumes, a dye solution is delivered into a cuvette, a measuring cell, or a clear-bottomed microtiter plate. A beam of light at a specified wavelength is passed through the solution, and the photometer measures the quantity of light that passes through. The amount of light that is absorbed is proportional to the amount of dye present, which permits a volume determination to be made.
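The volume determination follows from Beer's law (A = εlc) plus a mass balance on the dye. A hypothetical sketch, with all concentrations, path length, and extinction coefficient invented for illustration:

```python
# Sketch: inferring a dispensed volume from a single-dye absorbance reading
# via Beer's law (A = epsilon * l * c). A dye solution of known stock
# concentration is dispensed into a known volume of diluent; the measured
# absorbance gives the final dye concentration, and mass balance gives the
# dispensed volume. All numbers are illustrative.

def dispensed_volume_ul(absorbance: float,
                        epsilon_per_M_cm: float,
                        path_cm: float,
                        stock_conc_M: float,
                        diluent_ul: float) -> float:
    final_conc = absorbance / (epsilon_per_M_cm * path_cm)  # Beer's law
    # Mass balance: stock_conc * V = final_conc * (V + diluent); solve for V.
    return final_conc * diluent_ul / (stock_conc_M - final_conc)

vol = dispensed_volume_ul(absorbance=0.500, epsilon_per_M_cm=20000.0,
                          path_cm=0.9, stock_conc_M=0.005, diluent_ul=200.0)
print(f"{vol:.2f} uL")  # roughly 1.1 uL with these illustrative inputs
```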
The photometric method produces precise measurements and is less sensitive to environmental conditions than gravimetric and fluorometric calibration technologies. In addition, although photometric dyes change with temperature and pH, they tend to be more stable than fluorescent dyes. Hence, the response from the photometric reader will be comparatively consistent. In addition, photometry is typically immune to other chemicals that can have a large impact on a fluorescence signal. Photometry, therefore, is better suited than fluorometry for making accuracy determinations.
Another benefit of photometric calibration methods is the ability to provide information about each channel in a multichannel device. Absorbance dyes that are readily available and commonly used include tartrazine and potassium dichromate. A commercially available single-dye method for single-channel pipettes is commonly used in the clinical-laboratory industry.
ISO 8655-7 recognizes the technique of single-dye photometry for liquid-handling device calibration (5). According to this standard, however, photometric methods should be accompanied by an uncertainty analysis that describes the measurement uncertainty. This analysis may include error contributions such as accuracy of the photometer and reagents, dye instability, deviation from ideal Beer's Law behavior, and the like.
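One common form of such an uncertainty analysis is to combine independent error contributions in quadrature (root-sum-square). A sketch with placeholder component values; a real budget would use values measured or validated for the specific instrument and reagent lot:

```python
# Sketch: combining independent uncertainty contributions in quadrature,
# one common form of the uncertainty analysis ISO 8655-7 calls for.
# The component values below are placeholders, not measured data.
import math

components_pct = {
    "photometer": 0.5,
    "reagent lot": 0.8,
    "dye instability": 0.4,
    "Beer's-law deviation": 0.3,
}

combined = math.sqrt(sum(u * u for u in components_pct.values()))
print(f"combined standard uncertainty ~ {combined:.2f}%")
```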
To account for the dye as a source of error, laboratories need data on its stability, obtained either from the manufacturer or through an in-house stability or validation study. In addition, because light is passed through the sample and an optical wall, the optical quality of the microtiter plate or cuvette used in the method can affect the accuracy and precision of the measurement, and laboratories must account for this as well.
Like all dye-based methods, photometric methods must be properly standardized to obtain quantitative results for accuracy measurements. The traceability of the method depends on many factors, including how carefully the standardization is carried out. For traceable photometric readings, a standard curve must be developed by using a known liquid-delivery device (e.g., a calibrated pipette) or by weighing volumes. This process can be time-consuming and tedious. In addition, it assumes that the liquid-handling device used to develop the standard curve is reliable, and this assumption adds a level of uncertainty.
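Building the standard curve described above reduces to fitting a line through absorbance readings taken at known, traceably delivered volumes. A sketch with hypothetical data points:

```python
# Sketch: a least-squares standard curve mapping absorbance back to volume.
# Known volumes are delivered with a calibrated (traceable) pipette and the
# absorbance of each is read. Data points are hypothetical.

# (known volume uL, measured absorbance) pairs from a calibrated pipette
standards = [(2.0, 0.105), (5.0, 0.262), (10.0, 0.524), (20.0, 1.048)]

n = len(standards)
sx = sum(a for _, a in standards)          # sum of absorbances
sy = sum(v for v, _ in standards)          # sum of volumes
sxx = sum(a * a for _, a in standards)
sxy = sum(a * v for v, a in standards)

slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # uL per absorbance unit
intercept = (sy - slope * sx) / n

def absorbance_to_volume(a: float) -> float:
    """Interpolate an unknown dispense volume from its absorbance."""
    return slope * a + intercept

print(f"{absorbance_to_volume(0.40):.2f} uL")
```

The assumption flagged in the text shows up here directly: the fit is only as trustworthy as the calibrated pipette that produced the standards.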
In summary, single-dye photometric calibration is well-suited for measuring precision, particularly when handling volumes too small to be weighed on a balance. Accuracy measurements can also be made, but their robustness is limited because of the difficulty of ensuring that the method is properly standardized and that an uncertainty analysis yields acceptable performance.
The ratiometric photometric calibration method is a refinement of photometry designed to overcome the accuracy limitations of traditional single-dye photometric volume measurements. Ratiometric photometry uses two standardized dyes, and its measurement process produces absorbance readings in pairs that can be combined into absorbance-ratio readings.
The primary benefit of this approach is improved accuracy and robustness of measurement compared with nonratiometric methods. Absorbance ratios can be measured more accurately than individual absorbances, leading to a higher degree of accuracy and precision in ratiometric methods versus traditional single-dye photometric methods. The underlying reason for the improved measurement ability of ratiometric photometric methods is that the absorbance of photometric calibration standards drifts over time, while ratios exhibit greater stability.
Compared to gravimetry, this method offers greater speed, ease-of-use, and enhanced accuracy in small-volume measurements. Compared to fluorometry, ratiometric photometry provides accuracy as well as precision measurements and can do so to a traceable standard because the dyes function as an internal standard. Measuring the second dye in comparison to the first dye provides a nearly automatic compensation for the most common photometric error sources.
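The core idea can be sketched as follows: a sample containing one dye is dispensed into a diluent containing a second dye, and the ratio of the two absorbance readings, rather than either reading alone, encodes the volume fraction. This is an illustrative simplification, not the commercial implementation; the calibration constant and all readings are invented:

```python
# Sketch of the ratiometric idea: volume is inferred from the ratio of two
# absorbance readings (sample dye vs. diluent dye), so errors common to
# both readings (path length, drift, optics) largely cancel. Numbers and
# the calibration constant are illustrative only.

def volume_from_ratio(a_sample_dye: float, a_diluent_dye: float,
                      k_calibration: float, diluent_ul: float) -> float:
    """Volume from an absorbance ratio; k_calibration would be determined
    once from traceably calibrated dye solutions."""
    ratio = a_sample_dye / a_diluent_dye
    return k_calibration * ratio * diluent_ul

print(f"{volume_from_ratio(0.150, 0.750, 0.05, 200.0):.1f} uL")
```

Because path length and reader drift affect numerator and denominator alike, they divide out of the ratio, which is the cancellation the text describes as a nearly automatic compensation.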
Systems based on ratiometric photometry provide information about each individual channel in multichannel devices and good plate-to-plate reproducibility. For ratiometric photometry to produce benefits, however, it must use well-characterized plates and carefully calibrated solutions of good stability.
In addition, to function properly, ratiometric photometric methods require specially formulated dyes to produce accurate absorbance ratios. Lastly, this technology is not always preferred when measuring only large volumes because other technologies may produce adequate measurements more cost-effectively.
In summary, ratiometric photometry calibrations provide great benefits when measuring small liquid volumes for protocols requiring traceability and a high degree of accuracy per channel as well as precision.
Pharmaceutical laboratories have varying protocols, processes, and requirements, and these elements can affect the choice of calibration technologies for liquid-handling devices. Gravimetry, fluorometry, single-dye photometry, and ratiometric photometry are common means for verifying liquid-handling instrumentation. Each technique has its own advantages and disadvantages. Understanding the assay and laboratory quality requirements, traceability needs, and tolerance for error as well as the level of accuracy and precision required can help laboratories make the right decision.
Richard Curtis, PhD*, is chairman and chief technology officer, and George Rodrigues, PhD, is a senior scientific manager at ARTEL, 25 Bradley, Westbrook, ME 04092, tel. 207.854.0860, fax 207.854.0867, firstname.lastname@example.org.
*To whom all correspondence should be addressed.
Submitted: Jan. 14, 2008. Accepted: Feb. 4, 2008.
1. ASTM International, "ASTM E1154-89 (2003) Standard Specification for Piston or Plunger Operated Volumetric Apparatus," (West Conshohocken, PA, 2003).
2. International Organization for Standardization, "ISO 8655-6: Piston-Operated Volumetric Apparatus—Part 6: Gravimetric Methods for the Determination of Measurement Error" (Geneva, 2002).
3. P. Taylor et al., "A Standard Operating Procedure for Assessing Liquid Handler Performance in High-Throughput Screening," J. Biomol. Screen. 7 (6), 554–569, (2002).
4. J. Petersen and J. Nguyen, "Comparison of Absorbance and Fluorescence Methods for Determining Liquid Dispensing Precision," JALA 10 (2), 82–87, (2005).
5. ISO, "ISO 8655-7: Piston-Operated Volumetric Apparatus—Part 7: Non-Gravimetric Methods for the Assessment of Equipment Performance" (Geneva, 2005).