Advances in simulation and the development of digital twins.
The global response to the COVID-19 pandemic has triggered an unprecedented demand for faster and more flexible biopharmaceutical manufacturing solutions. While safety and quality are of primary importance, efficient scale-up and tech transfer capabilities are critical. Prominent drug product development and rollout campaigns have struggled due to slow scale-up and technology transfer challenges, according to an article in The Wall Street Journal (1). The development of cost-effective and good manufacturing practice (GMP)-compliant technology platforms is needed to move lab-scale discoveries into industrial-scale production environments.
In principle, process scale-up and an optimized manufacturing process can be achieved using a trial-and-error approach guided by empirical correlations. In practice, this approach is time-consuming and can introduce production delays that could last anywhere from months to years. Beyond the time requirement, process optimization requires specialized laboratory skills and expensive drug product materials. Although experimental design correlations can be used to assist in process development, few are tailored for the unique geometries used in biopharmaceutical manufacturing environments. They also fail to capture local phenomena in large-scale unit operations.
In recent years, to accelerate process design and optimization, computational fluid dynamics (CFD) has emerged as a parallel tool for characterizing fluid mechanics and transport physics within biopharmaceutical manufacturing processes. Like an experiment, the purpose of a CFD simulation is to develop mechanistic links between operating conditions and process outcomes. Unlike physical experiments, however, CFD simulations present an in-silico approach to performing numerical experiments. If performed using a proper level of theory, the quality of the predictions from numerical simulations can rival measured data. The speed at which these predictions can be generated, however, represents an order-of-magnitude improvement over experimental approaches, at significantly lower cost.
In the late 1960s, the concept of a “digital twin” was introduced by NASA as part of the Apollo program. Researchers developed two identical space vehicles, one of which was used as a “twin” to simulate the real-time behavior of the counterpart that was sent to space (2). Coincidentally, it was during this period that the first CFD algorithms were developed to model flow over airfoils, ship hulls, and aircraft fuselages. These algorithms, in a sense, were sophisticated calculators that evaluated a time-averaged solution to the fluid velocity and pressure field across a simulation domain. Relevant output included the time-averaged shear stress/strain field across the fluid as well as the average force on any solid objects within the domain. The success of these models in aerospace and naval hydrodynamics led to the commercialization of many CFD packages in the 1980s and 1990s, which were initially targeted to users in the automotive, defense, and energy industries.
From 1990 to 2010, commercial CFD packages trickled into the pharmaceutical and biopharmaceutical industries, according to a 2017 article in Microbial Biotechnology (3). Many of the initial applications focused on predicting impeller power numbers, calculating the maximum shear stress in a vessel, and examining flow fields through piping networks and tanks (3). Models were later extended to include predictions of blend time and residence times, using the mean-flow field as input to the scalar advection equation (3). The applications of CFD also extended to rotor/stator systems, chromatography columns, and orbital shakers. For each individual process test, the cost savings from using CFD in the experimental design space were estimated to be between $500,000 and $1 million (4).
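The blend-time and residence-time predictions referenced above follow from the scalar advection (advection-diffusion) equation, in which a passive tracer of concentration \(\phi\) is transported by the precomputed mean-flow field. A common form (the symbols here are generic, not taken from the cited work) is:

```latex
\frac{\partial \phi}{\partial t} + \bar{u} \cdot \nabla \phi
  = \nabla \cdot \left( D_{\mathrm{eff}} \, \nabla \phi \right)
```

where \(\bar{u}\) is the time-averaged velocity field from the CFD solution and \(D_{\mathrm{eff}}\) is an effective diffusivity combining molecular and turbulent contributions. Blend time is then typically estimated as the time for \(\phi\) to reach the fully mixed value, within a tolerance such as 5%, everywhere in the vessel.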
Although these time-average models provided sufficient fidelity for single-phase/single-fluid systems, extending the models to handle the multi-phase, transient, and multi-fluid systems typical of the pharmaceutical manufacturing process proved difficult. For example, because these original CFD implementations were focused on steady-state flow fields, they provided little insight into the turbulent structure and flow field driving reactions, mixing, and cell damage processes (4). Additional challenges were presented by topologically complex systems, which often required CAD geometry clean-up and careful volume meshing. Moreover, although some academic circles could extend these traditional CFD tools to handle multiphase flows, the industrial application of CFD to generalized bioreactor design and process simulation was not widespread. Thus, although time-averaged CFD could provide insights into aspects of a process, it was not a practical tool for simulating processes.
In the 2010s, two important trends converged to present a step-change in process simulation capabilities. The first trend, motivated by advances in transport physics theory, was the development of new algorithms for solving the fully transient fluid flow and particle transport equations (5, 6). These modern algorithms, which are based on the Boltzmann transport equations, provided a multiple order-of-magnitude speedup in runtime relative to approaches applied in earlier commercial algorithms. The second trend, motivated by advances in computer hardware, was the development of graphics processing units (GPUs) dedicated to scientific computing. Originally developed for graphics rendering, GPUs are highly parallelized computing architectures that present a superior paradigm for machine learning, data analysis, and artificial intelligence applications. For the modern CFD algorithms developed in the past decade, a single desktop GPU can model physics faster than hundreds (or thousands) of CPUs operating in parallel (7).
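To illustrate why Boltzmann-based algorithms map so well onto GPUs, the sketch below implements a minimal two-dimensional lattice Boltzmann update (the standard D2Q9 lattice with BGK collision) in NumPy. This is a toy example of the algorithm class only, not the implementation in the cited work, and the grid size, relaxation time, and initial condition are arbitrary assumptions. The key structural point is that the collision step is purely local to each lattice node and the streaming step is a fixed-neighbor shift, which is what makes the method massively parallel.

```python
import numpy as np

# Minimal D2Q9 lattice Boltzmann (BGK) update on a periodic grid.
# Toy illustration of the algorithm class; all parameters are arbitrary.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])   # discrete lattice velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)             # lattice weights
nx, ny, tau = 64, 64, 1.0                            # grid size, relaxation time

def equilibrium(rho, ux, uy):
    """Second-order equilibrium distribution for each lattice direction."""
    cu = 3 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    usq = 1.5 * (ux**2 + uy**2)
    return w[:, None, None] * rho * (1 + cu + 0.5 * cu**2 - usq)

# Start at rest with a small density bump in the center of the domain.
rho0 = np.ones((nx, ny))
rho0[nx // 2, ny // 2] += 0.1
f = equilibrium(rho0, np.zeros((nx, ny)), np.zeros((nx, ny)))

for _ in range(10):
    # Macroscopic moments (density, velocity) recovered from distributions.
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    # Collision: purely local per node -- the GPU-friendly step.
    f -= (f - equilibrium(rho, ux, uy)) / tau
    # Streaming: shift each population one cell along its lattice velocity.
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
```

Because every node updates independently during collision and the streaming stencil never changes, the algorithm avoids the global linear solves of traditional pressure-based CFD, which is the source of the runtime advantage on parallel hardware.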
The combined effect of these algorithmic and hardware advances is an important advancement in process simulation. In contrast to time-averaged single-fluid calculations, these modern approaches can run fully transient, time-accurate, three-dimensional process simulations. The physics in these simulations are fully coupled, such that changes in rheology due to blending inform real-time changes in the fluid flow, or the combined effects of individual bubble break-up and pair-wise coalescence events inform fluid flow and mass transfer in an agitated bioreactor. These approaches enable the development of time-accurate digital twins with a self-consistent linkage between fluid flow, species transport, free-surface dynamics, and drug product growth. In this sense, entire processes can be developed, transferred, and/or scaled up entirely in silico (8).
Earlier this year, researchers demonstrated how a digital twin running on a multi-GPU workstation could reproduce the real-time blending properties of a high-viscosity/low-density buffer excipient solution within a water-like drug substance solution (5). Because the flow field evolves in time due to ongoing changes in the fluid viscosity, the notion of a time-averaged flow field is not appropriate. Using a three-dimensional time-accurate simulation, however, the coupling between blending and fluid flow was immediate, and the predicted blend time was in line with measured data at multiple impeller speeds. Following this validation, the twin could be used to optimize how the excipient solution should be added to the drug substance solution to minimize blend times, and to predict how the process would scale at different operating volumes. Using the digital twin, process scale-up strategies were identified in hours or days, as opposed to the weeks or months typical of empirical testing.
In a Chemical Engineering Science article, researchers used a digital twin, also running on a multi-GPU workstation, to predict the oxygen transfer rates of sparged bioreactors (6). Agreement between the predicted and measured data was realized across 5 L to 2000 L working volumes with no model reparameterization or recalibration between scales. Using this model, researchers could identify variations in mass transfer rates across the vessel and how these lead to variations in dissolved gas concentrations at different scales. The success of the model across these scales and operating conditions was linked to the fundamentality of the physics invoked during development. Resolving these physics was, admittedly, beyond the capability of steady-state solvers. When they are solved using algorithms tailored for GPUs, however, they provide mechanistic insights into the link between operating conditions and oxygen transfer rates.
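To make the role of the oxygen transfer rate concrete, the sketch below integrates the standard well-mixed dissolved oxygen balance, dC/dt = kLa·(C* − C), to estimate the time to reach 90% of saturation. The kLa and saturation values are hypothetical placeholders, and the well-mixed assumption is exactly what the CFD twin in the cited work relaxes: it predicts how mass transfer varies across the vessel rather than assuming a single lumped coefficient.

```python
# Well-mixed dissolved oxygen balance: dC/dt = kLa * (C_sat - C).
# kLa and C_sat are hypothetical placeholder values, not from the cited study.
kLa = 0.005          # 1/s, assumed volumetric mass transfer coefficient
C_sat = 0.21         # mol/m^3, assumed oxygen saturation concentration
dt, t_end = 1.0, 1200.0

C, t = 0.0, 0.0
while C < 0.9 * C_sat and t < t_end:
    C += dt * kLa * (C_sat - C)   # explicit Euler step
    t += dt

# For comparison, the analytic 90%-saturation time is ln(10)/kLa, about 460 s.
```

In a real vessel, kLa depends on local bubble size, gas holdup, and turbulence, so a spatially resolved simulation replaces the single constant above with a field that varies across the tank and with scale.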
As the past year of vaccine production has demonstrated, the need for more efficient process scale-up and optimization strategies is pressing within the pharmaceutical and biopharmaceutical manufacturing industries. The use of CFD-based digital twins to enable smart manufacturing is emerging in different phases of process development, process prediction, decision-making, and risk mitigation. A key component in the development of this class of digital twins is the ability to reduce simulation time to achieve near real-time computations at a relatively low computational burden. Reduced order modeling approaches and hybrid modeling (e.g., artificial neural network + CFD) seem promising but can be computationally complex and may not be suited for wider application in the industry.
Fortunately, in the past decade, important advances in modeling algorithms and hardware architectures have enabled a step-change in the fidelity, utility, and efficiency of the CFD models for addressing these needs. These models present researchers with the ability to build real-time, three-dimensional digital twins with output that can be validated directly against transient measured data. To help troubleshoot problematic unit operations, researchers can use these models to understand the underlying physics governing process outcomes. Perhaps more importantly, because the models are time-accurate, they can be used to run ahead of real-time and forecast possible problems in process design. In this sense, to minimize the number of physical experiments, an entire unit operation can be designed, tested, and optimized in silico. The approach presents tremendous reductions in development costs and schedules.
John A. Thomas is the president of M-Star Simulations.
Vol. 45, No. 11
When referring to this article, please cite it as J. Thomas, “Computational Fluid Dynamics in Upstream Biopharma Manufacturing Processes,” Pharmaceutical Technology, 45 (11) 2021.