Good Automation Practices in the Laboratory

Pharmaceutical Technology's In the Lab eNewsletter, June 3, 2020, Volume 15, Issue 6
Pages: 44–45, 58

Laboratory automation can reduce the need for in-lab user presence but requires an efficient and dependable user–system interface.

Introduction

Why is the modern (bio)pharmaceutical research laboratory automated? 

There are many obvious benefits to automation, such as increased accuracy, data traceability, and standardization, but these do not get to the heart of why automation is implemented.

The most immediate benefit is that automation allows a research scientist to get on with performing their laboratory work. They can be confident that the automated systems will correctly perform the required analysis or experiment, enabling them to move accurately and rapidly through their research responsibilities. While it is (correctly) expected that research scientists are experts in the techniques they use and in their particular area of study, it is not feasible to expect them to be experts on the minute functioning of the instruments they use in the laboratory. Research is ever more focused on applying multiple techniques to deliver answers to big problems. In the specific case of pharmaceutical research, that means providing novel medicines to patients to relieve suffering and preserve life, whilst providing profitable growth platforms for the companies that supply the medicine.

This can be summarized effectively by the statement often heard in biopharmaceutical development environments, “Let the scientists do their science.”

In addition to the operational advantages of automation, it is crucial to consider the interface between the research scientist and the automated system. Complex and unhelpful software user interfaces, or hardware setups that are inefficient and unreliable, do not support the overall goal of enabling the researcher to be as productive as possible. Therefore, the impact of good interface and interaction design must be considered alongside some of the harder design and performance requirements of any automated system.

These requirements do not stop “at the robot.” The modern scientist will also be accustomed to automation in data analysis and interpretation and, quite possibly, data storage and indexing (1).

From scientist to robot

It is undeniable that automation in the laboratory has been directly enabled by the expansion in functionality, and reduction in cost, of modern computing. Many, if not all, laboratories contain several computer-controlled instruments that can be thought of as self-contained automated systems. Such systems may be able to take multiple readings from a single sample over time, or take multiple simultaneous readings, such as the temperature and pressure readings of a calorimeter.

Although most systems will require some level of physical interaction, in many cases this is as simple as loading the sample before operation and removing it afterward. The main point of interaction between the user and the device will primarily be via its control interface: in many systems, a computer screen, a touch-screen control on the device itself, or a tablet personal computer (PC). Alternatively, this interaction is mediated through a higher-level piece of software, such as a laboratory information management system. Interaction with the control interface is often much more extensive after an experiment, when data analysis and management tasks are conducted.

Many factors drive the requirement for a good user experience and interface design, not least of which is the growing prevalence of interacting with apps and various devices in almost every aspect of life. In these circumstances, good design and user experience (UX) are essential and, as a result, are increasingly influential factors when laboratories are making automation purchasing decisions.

From a user’s perspective, good UX design is ultimately centered on satisfying their needs: enabling them to complete their tasks and fulfill their objectives in an easy, safe, and effective manner. This can include:

  • Ensuring that users can interact with information within a particular system in a manner of their choice, rather than forcing them to adapt to new methods.

  • Providing users with clear instructions when opening different parts of the platform for the first time.

  • Refining terminology to minimize mistakes and ensure all content is clear at a first glance.

  • Modifying the types of feedback provided to the user when the physical equipment has completed a task enacted within the software.

  • Identifying the most appropriate iconography for certain actions (both field-specific and for more general interactions).

  • Embedding new interaction flows (such as the ability to duplicate steps in experimental plan creation) to ensure that as wide a user base as possible can interact with the same fundamental software.

It may also be beneficial to use a common software platform across multiple instruments in the lab to provide a consistent and easy-to-navigate user interface. This would also minimize the amount of required user training.

From screen to bench

How does the right design and UX translate from the screen to the experiment on the bench?

By way of example, consider a potentially dangerous experiment in which a new reaction is conducted at elevated temperature and pressure: a typical experiment that might be conducted during drug development, safety testing for scale-up, or process optimization.

There is the explicit requirement that the automated system is safe to operate under those conditions (i.e., that it is rated appropriately for elevated pressure, and, similarly, that heating the system poses no risks to the user or the system itself).

The next level of considerations is user-defined criteria, relevant to the experiment to be conducted, such as:

  • What is the maximum temperature or pressure that the reaction should not exceed?

  • Are there warnings that the user would like to see even when the reaction is below the maximum pressure and temperature levels?

  • If the maximum safe temperature or pressure is reached, what shut-down protocols should be applied?

  • If the experiment exceeds either of these thresholds, what emergency measures should be taken (such as crash cooling, addition of inhibitors, or venting)?
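
The user-defined criteria above amount to a simple tiered-threshold policy. The sketch below illustrates that logic in Python; it is purely illustrative, and all names (`ReactionLimits`, `check_conditions`, the example limit values) are hypothetical, not the API of any real instrument.

```python
# Illustrative sketch of a tiered safety-threshold check, as described in the
# text: warnings below the maximum levels, safe shut-down at the maximums.
from dataclasses import dataclass

@dataclass
class ReactionLimits:
    warn_temp_c: float        # issue a warning at or above this temperature
    max_temp_c: float         # trigger shut-down at or above this temperature
    warn_pressure_bar: float  # warning pressure threshold
    max_pressure_bar: float   # shut-down pressure threshold

def check_conditions(temp_c: float, pressure_bar: float,
                     limits: ReactionLimits) -> str:
    """Map a reading to an action: 'ok', 'warn', or 'shutdown'."""
    if temp_c >= limits.max_temp_c or pressure_bar >= limits.max_pressure_bar:
        return "shutdown"  # e.g., crash cooling, inhibitor addition, venting
    if temp_c >= limits.warn_temp_c or pressure_bar >= limits.warn_pressure_bar:
        return "warn"      # alert the user while still within safe limits
    return "ok"

# Hypothetical limits for the elevated-temperature, elevated-pressure example:
limits = ReactionLimits(warn_temp_c=80, max_temp_c=100,
                        warn_pressure_bar=8, max_pressure_bar=10)
```

In a real system the shut-down branch would invoke the instrument's own safe-stop routines rather than return a string; the point is that the thresholds and responses are defined by the user before the experiment begins.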

Appropriate laboratory automation is ideally suited to running such experiments, providing high-quality data while protecting the user by:

  • Precisely defining the experimental protocol in advance of experimental work.

  • Accurately conducting the defined protocol, including real-time monitoring and adjustment of experimental conditions.

  • Recording multi-dimensional data at an appropriate, user-defined frequency (multiple times per minute, or even per second).

  • Enabling safe operation through:

      • Allowing remote operation: once started, the user may be physically distant from the system, and is often able to access real-time status updates remotely through online connectivity.

      • Identifying potentially hazardous experimental conditions.

      • Automatically activating user warnings or implementing safe-stop actions.

The remote operation offered by automation also has the potential to significantly increase laboratory productivity. This has been demonstrated in both academic research facilities and privately owned companies. The need to do more, often with smaller laboratories and fewer staff, is a key driver for implementing automation. Indeed, a 2018 article discusses a ten-fold increase in sample number in one laboratory, with a 20% staff reduction seen in another (2).

Efficiency and productivity can also be greatly enhanced by using automation and software to connect systems together. For example, a flow-cell reactor for a catalytic process can automatically feed reaction products into a mass spectrometer for analysis. The flow reactor can be programmed to operate under a range of different conditions, such as pressure or flow rate of reagents, during any given run. As the reactor runs this program automatically, the results from specific experimental conditions may be fed into the mass spectrometer under the control of software coordinating the actions of both systems (3).

As such, running this experiment in an automated, unattended mode allows the user to run an effective screening exercise with direct analytical measurements taken by the mass spectrometer. If the user can review the mass spectrometry data in real time, they also have the opportunity to adjust and optimize the reaction parameters of the flow reactor, in real time, based on specific goals.
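The coordination described above can be sketched as a simple loop: step the reactor through each programmed condition, then trigger an analytical measurement. The sketch below is hypothetical; `run_reactor_step` and `acquire_spectrum` stand in for real instrument drivers, which would come from the vendors' own control software.

```python
# Illustrative coordination of a flow reactor and a mass spectrometer:
# one analytical measurement is acquired per set of reactor conditions.

def run_reactor_step(flow_rate_ml_min: float, pressure_bar: float) -> None:
    """Placeholder: set the reactor to the requested conditions and wait
    for steady state. A real driver would talk to the instrument here."""
    pass

def acquire_spectrum(label: str) -> dict:
    """Placeholder: trigger a mass-spectrometer acquisition and return the
    result tagged with the conditions that produced it."""
    return {"label": label}

def screen(conditions: list[tuple[float, float]]) -> list[dict]:
    """Unattended screening: iterate over (flow rate, pressure) pairs."""
    results = []
    for flow, pressure in conditions:
        run_reactor_step(flow, pressure)
        results.append(acquire_spectrum(f"flow={flow}, p={pressure}"))
    return results

# Example screening program over three hypothetical condition sets:
spectra = screen([(0.5, 2.0), (1.0, 2.0), (1.0, 5.0)])
```

A real-time optimization loop, as mentioned in the text, would examine each returned spectrum and choose the next conditions accordingly, rather than following a fixed list.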

 

From bench, back to bench?

A final consideration of automation in the (bio)pharmaceutical development pipeline is that of the data generated and analysis required.

It is almost impossible to say that hand-written experimental records contain no inaccuracies. The increasing use of the electronic laboratory notebook has allowed for higher standards of data recording, and, now, direct capture of raw data is often expected as the minimum acceptable threshold. This also gives the ability to quickly and correctly trace analysis back to the original data and can be used to provide a clear understanding of how the analysis was done.

Laboratory automation also allows significantly deeper insights into experimental processes. Consider a 24-hour chemical reaction conducted in an automated system. Such a system will simultaneously control, measure, and record many reaction parameters, including reactor and sample temperature, volumes and timings of added reagents, and stirring conditions. Any deviations from the set protocol, for example decreased stirring speed due to increased sample viscosity, would be highly apparent and may immediately be brought to the attention of the researcher or technician. This would not be possible with manual data recording.

Finally, there are also benefits to automating data analysis. Data analysis is often conducted at the end of an experiment, but it can also be run while data is still being acquired. The latter case is particularly useful to indicate when a specific set of sample conditions has been reached, such as a percentage conversion. The sheer volume of data available from automated systems means that analysis can provide detailed insights that can then be used to optimize the experimental approach for the next run (4). The flow of information around the experimental set-up is shown in Figure 1.
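The idea of analyzing data while it is still being acquired, stopping when a target percentage conversion is reached, can be sketched as follows. The readings here are simulated; in practice they would stream from the instrument, and the function names are illustrative only.

```python
# Illustrative real-time analysis: watch a stream of concentration readings
# and report when a target percentage conversion has been reached.

def conversion_pct(start_conc: float, current_conc: float) -> float:
    """Percentage of starting material consumed."""
    return 100.0 * (start_conc - current_conc) / start_conc

def monitor(readings, start_conc: float, target_pct: float):
    """Return the index of the first reading at which conversion meets the
    target, or None if the target is not reached during the run."""
    for i, conc in enumerate(readings):
        if conversion_pct(start_conc, conc) >= target_pct:
            return i
    return None

# Simulated concentration trace (mol/L) sampled over the course of a run:
trace = [1.0, 0.8, 0.55, 0.3, 0.12]
stop_at = monitor(trace, start_conc=1.0, target_pct=70.0)
```

In an automated system, reaching the target would trigger the next step of the protocol (for example, quenching or sampling) rather than simply returning an index, and the same logic scales to the multi-dimensional data streams these systems record.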

Lab automation in a COVID-19 world

The research community needs to adjust to working safely under the threat posed by COVID-19. Effective automation will be critical to these new working practices, allowing the same throughput of tests with limited human resources. Although some research can be done in silico, many scientific studies cannot be conducted from home and require laboratory experimentation.

Automation and good interface design support this new paradigm in three key respects:

  • Automation reduces the amount of time a researcher spends in the laboratory in physical proximity to colleagues through remote operation and monitoring of automated systems. This is likely to be a key requirement for safe working practices in many locations.

  • The speed at which experiments can be conducted, and the depth and accuracy of data generated will be crucial to maintain and accelerate the efficiency and speed of the modern pharmaceutical laboratory. The pressures of a crisis like the COVID-19 pandemic are only likely to increase the expectations on levels of output.

  • Well-designed, user-centric UX makes it more likely that the laboratory researcher will “do the right experiment and do the experiment right.” That is, successfully perform an experiment the first time, leading to fewer unnecessary repeats and ultimately increased productivity.

There can be little doubt that advances in automation will continue to unleash the potential of the scientific community. It is expected that laboratory hardware will continue to evolve, and that this evolution will empower ever greater insights. Great hardware also requires great control software, written with UX as a key consideration.

References

1. S. Hayward, “Digital Data in the Lab – What Now?” Labmanager.com, Oct. 8, 2017.
https://www.labmanager.com/laboratory-technology/digital-data-in-the-lab-what-now-2756
2. J.R. Genzen, et al., Clinical Chemistry 64 (2) 259–264 (2018).
3. P. Zhao, et al., J. Am. Chem. Soc. 140 (21) 6661–6667 (2018).
4. V. Rosso, et al., Reaction Chemistry & Engineering 4 (9) 1646–1657 (2019).

About the authors

Paul Orange*, porange@helgroup.com, is chief marketing officer, and Joe Willmot is applications leader, both at H.E.L Group. Will Fazackerley is a designer at Mettle-Studio and a visiting lecturer in design at Brunel University.

*To whom correspondence should be addressed.