Cynthia A. Challener is a contributing editor to Pharmaceutical Technology.
The evolution of cell-culture technology is driving the need for improvements in modeling solutions.
Editor’s Note: This article was published in the September 2019 issue of Pharmaceutical Technology Europe.
Modeling and simulation are recognized tools for predicting bioreactor performance, both for initial laboratory runs and when scaling to the pilot plant and commercial manufacturing. “Bioreactor performance has consistently improved in biomanufacturing due to a greater understanding of the engineering and biology of cellular systems. To continue to drive efficiency and reduce cost of manufacturing while maintaining or improving quality, reducing bioreactor variation will be required, and predictive models can enable these variation reductions,” asserts Tom Mistretta, director of data sciences at Amgen.
The continual development of higher-producing cell lines, the switch to chemically defined media, and other advances in cell-culture technology are, however, challenging developers of modeling and simulation software.
Typically, bench-top bioreactors are operated in the performance design space of larger bioreactors to model scale-up performance, according to Parrish M. Galliher, chief technical officer for upstream and founder of Xcellerex, a GE Healthcare Life Sciences business. “Successful modeling of bioreactors can reduce costs associated with engineering testing and engineering runs, accelerating pathways through preclinical and early clinical phases,” adds Mark T. Smith, staff engineer for single-use technologies in the BioProduction Division of Thermo Fisher Scientific.
In particular, the inexpensive computing power and excellent computational talent entering the biopharma industry, combined with improvements in computational fluid dynamics (CFD) and metabolic modeling software, are promising to open new doors for more complete cell culture models, according to Smith. “I predict that as modeling becomes more robust, it may be possible to entirely eliminate traditionally performed engineering runs, which in fact is already being done to some extent because in many cases companies and engineers simply ‘get to know’ their bioreactor networks empirically and heuristically, without the need for complex physical modeling in silico,” he asserts.
The key to successful modeling for the prediction of bioreactor performance, stresses Galliher, is to first calculate the performance design space of the bioreactor at all relevant scales. “Once these design spaces are understood, the scientist can operate any scale bioreactor within the design space of any other bioreactor scale and model any bioreactor of any scale using a bioreactor of any other scale,” Galliher says. He adds that this approach has been used effectively for decades.
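Galliher's design-space approach can be illustrated with a short sketch. The script below uses purely illustrative values (the vessel geometries, impeller settings, and power number are hypothetical, not from any process described here) to compare two classic scale-up parameters, specific power input (P/V) and impeller tip speed, at bench and pilot scale. It shows that matching one parameter across scales does not automatically match the other, which is exactly the kind of trade-off that mapping the design space at each scale makes explicit.

```python
# Illustrative sketch only: comparing scale-dependent parameters that help
# define a bioreactor's performance design space. All numbers are invented.
import math

def tip_speed(d_impeller_m, rpm):
    """Impeller tip speed (m/s), a common shear-related scale-up parameter."""
    return math.pi * d_impeller_m * (rpm / 60.0)

def power_per_volume(power_number, rho, rpm, d_impeller_m, volume_m3):
    """Specific power input P/V (W/m^3) from the standard turbulent-regime
    relation P = Np * rho * N^3 * D^5, with N in rev/s."""
    n = rpm / 60.0
    power_w = power_number * rho * n**3 * d_impeller_m**5
    return power_w / volume_m3

# Hypothetical bench (2 L) and pilot (200 L) vessels, with agitation chosen
# here so that P/V roughly matches across the two scales.
bench = {"D": 0.06, "rpm": 300, "V": 0.002}
pilot = {"D": 0.28, "rpm": 107, "V": 0.2}

for name, r in (("bench", bench), ("pilot", pilot)):
    pv = power_per_volume(5.0, 1000.0, r["rpm"], r["D"], r["V"])
    ts = tip_speed(r["D"], r["rpm"])
    print(f"{name}: P/V = {pv:.0f} W/m^3, tip speed = {ts:.2f} m/s")
```

With these numbers, P/V agrees to within about 1% across scales while tip speed differs by more than 60%, so a process that is shear-sensitive would need a different matching criterion, or an explicit check that both scales still sit inside the same design space.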
The process of modeling has, in fact, advanced due to the increased capability to aggregate various sources of data (sensor measurements, batch record entries, product quality measurements, maintenance records, etc.) needed to model bioreactor performance, according to Mistretta. “The industry has become more efficient due to the significant improvements in technology for centralized access to various data sources via data lakes,” he observes. In addition, the information systems technology used has been validated and demonstrated to be cost effective, secure, compliant, and scalable with the volume of data generated from manufacturing processes.
In the past, continuous processing in perfusion mode was generally limited to cell-culture processes involving sensitive products that degrade under conventional batch conditions. Today, there is increasing use of perfusion cell culture to realize other cost, efficiency, and productivity advantages.
Some of the challenges with perfusion processes include sensor instability and fouling over the course of the culture, which impacts the ability to effectively monitor and control these processes, according to Galliher. Perfusion mode can also create additional risks of contamination due to the continuous removal of depleted media and introduction of fresh media, and these processes involve a significant increase in the media volumes required, which has logistical and cost implications for a manufacturing facility, adds Mistretta.
Overall, perfusion mode generally places higher performance demands on the reactor, according to Smith. He points to the higher cell densities in perfusion mode, which result in increased oxygen demand in a higher viscosity solution with potentially more cell-free biopolymers and lipids than a fed-batch culture. “Viscosity and solution property changes can fundamentally change mass-transfer mechanisms by impacting bubble size, bubble coalescence, mixing times, shear rates, and other parameters,” he explains.
As a result, in-silico modeling of systems must be adjusted for such parameters in a meaningful way. There is, however, limited understanding of how to model such high cell densities, given the relative newness of these operating regimes, according to Smith. Computational models of perfusion processes additionally require consideration of filter effects such as fouling and sieving, Mistretta notes.
The shift to single-use technologies has also introduced a wider diversity of bioreactor designs, particularly with respect to spargers, baffles, and impeller configurations. “Whereas stainless-steel reactors could be made with relatively similar configurations within and across bioreactor networks, having multiple vendors represented in single-use bioreactor networks increases the level of data and modeling required to successfully predict behavior,” Smith explains.
In addition, Smith notes there is a need for improved mass transfer performance with reduced shear in single-use bioreactors. “As cultures become more intensified, the need for oxygen transfer increases and is generally accommodated by increasing the oxygen gas passing through the micro-sparger. In many cases, the increased flow through the micro-sparger leads to shear-associated cell damage, likely related to increasing shear at the gas-liquid interface,” he explains.
To address this problem, oxygen delivery must be achieved through efficient spargers with lower gas-liquid shear rates. “While some vendors already offer such sparger technologies, the industry has been slow to update to these sparger technologies, likely due to the revalidation burden of changes to established bioprocesses,” Smith comments.
Mistretta agrees that variability of single-use systems does impact cell culture, but Amgen has been able to rapidly detect and determine the root cause for this cell culture variation through the accessibility of process data, material lot data, and process performance models. “These models also support data-driven decision making for impacted lots, and we continue to develop monitoring strategies through predictive models, near-real-time monitoring, and material-control strategies to mitigate unintended variation in performance,” he states.
More advanced use of hybrid models, such as CFD combined with first-principles kinetic models, is also effective for assessing the impact of bioreactor design on performance, according to Mistretta. “Use of CFD models to determine kLa [mass transfer coefficients], mixing time, and shear effect have become pretty standard applications for technology transfer, scale-up, process optimization, and troubleshooting activities at Amgen,” he says.
In addition, Amgen has demonstrated the utility of finite element models for studying operational risk factors for single-use bioreactor systems. For instance, models were developed to predict how material science aspects of the single-use bioreactors impact pressurization and other operational risks, according to Mistretta. Amgen has also observed more collaboration with single-use bioreactor suppliers with respect to exchanging data and developing models that enable more reliable operations.
Modeling the performance of the whole bioreactor, including predicting full cell-culture performance, seems to be the holy grail. According to Smith, however, broad application of complete models has yet to be achieved due to the complexities of the variables required for accuracy, including the oxygen transfer rate, oxygen uptake rate, carbon dioxide evolution, carbon dioxide stripping, metabolism, and cell-line traits.
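Oxygen alone illustrates what even a single component of a "complete" model involves. The sketch below (all parameter values are illustrative and not drawn from any process described in this article) integrates a minimal dissolved-oxygen balance in which kLa-driven transfer competes with cellular uptake; in a real model, kLa, the saturation concentration, and the uptake rate would all themselves be functions of the culture state.

```python
# Minimal, illustrative sketch of one piece of a "complete" bioreactor model:
# the dissolved-oxygen balance, dC/dt = kLa*(C* - C) - OUR, where transfer
# competes with cellular uptake. Parameter values are invented for clarity.

def simulate_do(kla=0.01, c_star=0.2, our=0.001, c0=0.2, dt=1.0, steps=600):
    """Euler integration of dC/dt = kla*(c_star - C) - our.
    C and c_star in mmol O2/L; kla in 1/s; our in mmol/(L*s); dt in s."""
    c = c0
    for _ in range(steps):
        c += dt * (kla * (c_star - c) - our)
    return c

# With these numbers the culture settles toward the steady state
# C = C* - OUR/kLa = 0.2 - 0.001/0.01 = 0.1 mmol/L.
print(f"DO after 10 min: {simulate_do():.3f} mmol/L")
```

Even this toy balance shows why the variables are coupled: raise the cell density (and hence OUR) and the steady-state dissolved oxygen drops unless kLa, and therefore sparging or agitation, rises with it.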
In addition, Smith notes that improvements in understanding of cell biology have led to dramatic changes in what a “typical” cell line is capable of producing, with improvements coming from cell-line traits such as gene promoter strength, gene integration site, integration stability, modified metabolic pathways, etc. Furthermore, aspects of bioprocess engineering have shifted, for example from natural to chemically defined media and from batch to fed-batch (and toward continuous) modes.
“In short,” Smith concludes, “modeling the ‘complete’ bioreactor cell-culture performance for universal application may not yet be in sight.” He does comment that there appears to be a push among large biopharma toward the use of complete models for the prediction of bioreactor performance in their given networks. However, published literature regarding these efforts rarely provides sufficient detail to enable easy reproduction of the models used, and source code is almost never made public for direct application. “In this sense, trade secrecy and intellectual property concerns may be stymying the potential rapid industry-wide advances in performance prediction,” Smith observes.
“Even simpler aspects of bioreactor performance are generally still poorly modeled,” Smith adds. He points to the modeling of mass transfer coefficients, which dictate whether a reactor can provide sufficient oxygen to maintain viable cell mass, as an example. “In the past three years, there have been several hundred publications on CFD-based prediction of kLa in bioreactors. While this quantity suggests great interest in the area, they frequently describe the resulting models as ‘adequate,’ ‘sufficient,’ or similarly hesitant verbiage, suggesting in fact there is still space to improve in our performance prediction as an industry,” Smith explains.
Potentially because of this remaining gap in the predictive power of CFD, many of the product development/pilot-scale models used today, according to Smith, are driven primarily by empirical data leveraging fairly traditional correlations. “This approach seems to be able to provide a similar predictive power to that of CFD without the additional computational effort and resources,” he comments.
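A representative example of the kind of traditional correlation Smith refers to is the Van't Riet form, which relates kLa to specific power input and superficial gas velocity. The coefficients below are the commonly cited values for coalescing, water-like media; in practice they are refit to each reactor's own empirical data, and the operating point shown is purely illustrative.

```python
# Illustrative sketch of a traditional empirical kLa correlation of the
# Van't Riet form: kLa = C * (P/V)^a * (vs)^b. Coefficients shown are the
# commonly cited values for coalescing (water-like) media; real use refits
# them to the specific reactor's data.

def kla_vant_riet(p_per_v, vs, c=0.026, a=0.4, b=0.5):
    """Estimate kLa (1/s) from specific power input p_per_v (W/m^3)
    and superficial gas velocity vs (m/s)."""
    return c * p_per_v**a * vs**b

# Hypothetical operating point: 100 W/m^3 and vs = 0.005 m/s.
kla = kla_vant_riet(100.0, 0.005)
print(f"kLa estimate: {kla:.4f} 1/s ({kla * 3600:.0f} 1/h)")
```

The appeal of such correlations is exactly what Smith describes: once the exponents are fit to a given reactor's data, prediction is a one-line calculation rather than a CFD campaign.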
Useful models must be validated with real data; the more good data incorporated into a model, the more insight it can provide. Traditional off-line sampling once or twice daily of key parameters (e.g., viable cell density, glucose, key amino acids, and titer) will not provide the data set needed to robustly predict performance on a moment-by-moment basis, according to Smith.
That is where process analytical technology (PAT) comes in. “Relatively recent applications of PAT, such as Raman spectroscopy and biomass probes for inline sensing, are now providing more complete data on cell-culture performance and enabling better, more timely control of reactor automation,” he observes. Advances in sensor technologies have also contributed to enhanced real-time predictions of bioreactor performance and enable approaches such as model predictive control, according to Mistretta. It is, in fact, improvement of automation control strategies within the existing capabilities of the bioreactor, not sampling and sensing alone, that is essential to improving bioreactor performance, stresses Smith.
Further advances in PAT are still needed, however, for these technologies to contribute meaningfully to model development, according to Galliher. As an example, he notes that while infrared sensors have been touted as being able to measure relevant bioreactor parameters and generate predictive models, they require significant computing power and algorithms to cancel out interference from cells, gas bubbles, and media components, all of which change over the course of the run.
The availability of data (particularly genomic and proteomic data on cellular systems) and the capability to efficiently capture, store, and contextualize the process data needed for modeling have been the most notable advances in predictive bioreactor performance, according to Mistretta. The decreasing cost of the infrastructure needed to run computational models has, he adds, also made it faster to prototype and deploy predictive models for bioreactors.
One area of rapid development involves the potential application of artificial intelligence (AI) and machine learning (ML) to bioreactor performance modeling. Bioengineers are beginning to discuss how to apply machine learning and digital-twin algorithms to the modeling of bioreactor performance, coupled with CFD modeling of multiphase (liquid and gas) solutions, according to Galliher. He does note, though, that these approaches are still at the conceptual stage.
In fact, the capability to design, develop, and implement machine learning algorithms more efficiently has also allowed modeling to be done in a more systematic way, according to Mistretta. “Machine learning models based on the ability to leverage larger datasets have enabled automated data-driven models of bioreactor performance and offer the potential to help determine fitted parameters in first principles models of bioreactor performance,” he remarks.
One possible challenge to leveraging AI/ML could be the need for massive data sets in order to avoid overfitting issues. “Considering that biopharma companies are generally trying to minimize the number of culture runs for any given product and that even processes for blockbuster products may only be run 12–20 times per year, there is a relative dearth of information for most cell-line cultures with respect to typical ML data sets,” Smith explains. While each run produces thousands of data points (e.g., second-by-second pH, dissolved oxygen, viable cell density, etc.), these variables are inextricably linked to the chronology of the culture and could lead to further bias, he adds.
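The overfitting risk Smith describes can be sketched numerically. The synthetic example below is invented purely for illustration (the data, dimensions, and "run" counts do not model real culture runs): it fits 40 candidate predictors to only 15 runs and compares an essentially unregularized least-squares fit with a ridge-penalized one. The regularized coefficients are pulled toward zero, which is a standard defense when runs are scarce relative to the number of measured variables.

```python
# Illustrative sketch of the small-data overfitting problem: ~15 "runs" but
# 40 measured variables. Data are synthetic; the point is fitting behavior,
# not biology.
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_features = 15, 40              # fewer runs than features
X = rng.normal(size=(n_runs, n_features))
true_w = np.zeros(n_features)
true_w[:3] = [2.0, -1.0, 0.5]            # only three real effects
y = X @ true_w + rng.normal(scale=0.5, size=n_runs)  # noisy response

def ridge_fit(X, y, lam):
    """Closed-form ridge regression: w = (X'X + lam*I)^(-1) X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

for lam in (1e-8, 10.0):                 # ~unregularized vs. ridge penalty
    w = ridge_fit(X, y, lam)
    print(f"lambda={lam:g}: coefficient norm = {np.linalg.norm(w):.2f}")
```

With 15 observations and 40 unknowns, the near-unregularized solution interpolates the noise; the ridge penalty shrinks the coefficient norm and typically generalizes better, but no amount of regularization substitutes for more runs, which is Smith's underlying point.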
Smith stresses, though, that although AI and ML pose unique challenges to bioengineers, experts well-versed in how and when to apply AI may develop creative solutions for deconvoluting and dealing with these types of challenges in the near future, successfully enabling the use of these technologies within the bioprocess industry.
Today, most of the measures of bioreactor performance are at a macro level in commercial facilities, according to Mistretta. To get to highly accurate predictions of bioreactor outputs, additional measurements and a framework for extensible modeling that describes the physiology, chemistry, and mechanics of a bioprocess must be developed, he says. “Genomics and metabolomics-level understanding of bioreactor performance supplemented by real-time advanced sensors to measure post-translational modifications are necessary, and we predict to see these types of advances in the next three to five years,” Mistretta asserts.
Pharmaceutical Technology Europe
Vol. 31, No. 9
When referring to this article, please cite it as C. Challener, “Cell-Culture Advances Test Bioreactor Performance Models,” Pharmaceutical Technology Europe 31 (9) 2019.