Agnes Shanley is senior editor of Pharmaceutical Technology.
As biopharmaceutical process development advances, a more collaborative and knowledge-based approach, driven by better analytics and high-throughput system integration, is reducing the risk of failure.
For years, pharma’s key opinion leaders have emphasized the need for deep process knowledge to inform new product and process development. The concept may have taken some time to catch on in the industry, but today, more biopharmaceutical development and manufacturing programs are showing what is possible when ideas such as pharmaceutical quality by design (QbD) are combined with better analytics and automation, as well as with equipment that reflects the principles of process intensification.
These abstractions are taking shape most notably in process development, which has become inextricably linked with scale-up. More development projects now start off with clear scalability goals in mind, and developers are collaborating more closely with technology vendors, contract development and manufacturing organizations (CDMOs), and suppliers to achieve them. CDMOs and technology developers broadly agree on where the greatest challenges lie.
Automation is driving improvement, according to Joe Makowiecki, director of business development for Cytiva’s FlexFactory and KuBio product lines. He sees the benefits becoming more visible as pharma moves to more real-time monitoring and control, and to applying the principles of Pharma 4.0. “Automation will lead to development of processes that are manufacturing-ready in less time,” he says.
This trend, in turn, should feed continuous improvement. “As they are scaled up, processes will yield much more information, especially on the interdependency between process parameters, allowing development to move faster and to focus only on experiments that will add the most value,” Makowiecki adds.
Technology vendors are incorporating automation and adopting new approaches to help companies achieve these goals. However, they must be careful to balance the need for more data with the need to keep systems from becoming too complex, says Loe Cameron, director of analytics and controls at Pall Biotech.
“Like many others in the industry, Pall is on a Pharma 4.0 journey, assembling solutions and assessing the impact. Our research is focused on how we can provide tools in a way that does not negatively impact the usability of our systems, and we work closely with users of these systems to reach that goal,” Cameron says.
Pall has installed automation systems in its process development labs that mimic many of the functions found in full-scale manufacturing. “This capability allows customers to project the future of their process as it scales more effectively,” says Cameron. “We have connected this system to a centralized historian database, which has allowed us to more deeply explore how solutions such as soft sensors, modeling, and advanced process controls can improve viral vector processes at each scale, and over the full life of the process,” she says. However, Cameron notes, it is important to remember that taking these steps has a cascading effect. “As we install more sensors to support the automation of our processes, so much data are then created that the analysis must be automated as well,” she says.
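The soft-sensor idea Cameron mentions can be sketched in a few lines: a quantity that is only measured offline is estimated from signals that are available continuously online. The sketch below is a minimal illustration of that concept, not Pall's implementation; the sensor choices, units, and data are entirely hypothetical.

```python
# Sketch of a "soft sensor": estimating an offline-measured quantity
# (viable cell density, VCD) from online signals (here, capacitance and
# oxygen uptake rate). All data are synthetic and illustrative only.
import numpy as np

# Historical batches: online signals paired with offline VCD measurements.
capacitance = np.array([2.0, 3.5, 5.1, 6.8, 8.2])  # hypothetical pF/cm
our = np.array([1.1, 1.9, 2.8, 3.6, 4.5])          # hypothetical mmol/L/h
vcd = np.array([2.1, 3.6, 5.2, 6.9, 8.4])          # hypothetical 1e6 cells/mL

# Fit vcd ~ w1*capacitance + w2*our + bias by ordinary least squares.
X = np.column_stack([capacitance, our, np.ones_like(capacitance)])
weights, *_ = np.linalg.lstsq(X, vcd, rcond=None)

# Apply the fitted soft sensor to a new online reading (cap=4.0, our=2.2).
new_reading = np.array([4.0, 2.2, 1.0])
estimate = new_reading @ weights
print(f"estimated VCD = {estimate:.2f} x 1e6 cells/mL")
```

In practice, such a model would be trained on historian data from many batches and validated against held-out offline measurements before being trusted for monitoring or control.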
Those involved in process development say they are approaching projects quite differently from the way they might have tackled them a few years ago. Broadly, the need for speed (if not to market, then to the next milestone or developmental stage) and early-stage robustness are key drivers, says Patrick McMahon, global operations manager for integrated services at Cytiva.
Biopharmaceutical process development teams are linking the drug discovery phase to process development and scalability early on, leveraging platforms and predictive technology, says McMahon. “This allows them to reduce project timelines, drive in quality and robustness, and reduce the need for repeat work and comparability for process or analytical changes,” McMahon explains.
Process development specialists see efforts as becoming increasingly collaborative. “Across the industry, we see more and more companies working together to overcome challenges. CDMOs are performing a wider range of services, and equipment manufacturers and media suppliers are also contributing to process development efforts,” says Jean-Christophe Drugmand, senior bioprocess architect at Univercells Technologies. He recalls working on projects where media optimization studies were subcontracted to the media supplier, and some QbD studies were also handled by specialists in that area to accelerate the overall project execution.
One fairly recent, and fundamental, challenge to bioprocess development has been the increasing complexity of biotherapies, as more companies pursue gene and cell therapies and multi-specific antibodies. “Historically, process development followed the philosophy that more product equates to a better process, but as molecular complexity (and with it, sensitivity) has increased, we must think about the balance between productivity and quality,” says Timothy Morris, group leader of manufacturing process development at Catalent Biologics. In this new environment, Morris says, process development teams must think about cell culture as the product of both the molecule and the bioreactor. For example, he explains, increased antibody sialylation is associated with longer residence time in the body.
“Complex bioproducts require a different approach when it comes to developing robust and scalable processes, and we must focus on developing processes and analytical methods based on the particular characteristics of a molecule,” says Atul Mohindra, senior director for biomanufacturing, R&D, Lonza Pharma & Biotech. Companies are moving to reduce the risk of failure as early in process development as possible. Cell line development is becoming a critical area of focus, McMahon says. In the past, development efforts often sped ahead only to find that the selected cell line simply did not work.
“Optimizing cell lines reduces risk early on. It also forces the focus from the immediate task at hand, whether feasibility or a clinical milestone, to longer-term success for scalability and regulatory compliance with foresight built in. Using the best quality cell culture media helps further improve critical quality attributes,” McMahon says.
“Understanding how the process will impact the product is very powerful,” he says, recalling a biosimilars development project that Cytiva had worked on with a client, which used predictive modeling to understand critical quality attributes (CQAs) such as glycosylation patterns, and better correlate them with critical process parameters (CPPs), earlier in development. As a result of these efforts, the team improved the glycosylation pattern and charge variance profile, yielding a molecule with quality attributes more similar to the innovator molecule.
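The early CPP-to-CQA correlation work described above can be illustrated with a minimal sketch. The parameter (culture temperature), attribute (galactosylation level), and all data values below are synthetic examples for illustration, not figures from the Cytiva project.

```python
# Sketch: correlating a critical process parameter (CPP) with a critical
# quality attribute (CQA) across bioreactor runs. Data are synthetic.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical runs: culture temperature (deg C) vs. measured
# galactosylation (% of glycans), a common glycosylation-related CQA.
temps = [35.0, 35.5, 36.0, 36.5, 37.0, 37.5]
galactosylation = [42.1, 40.8, 39.5, 38.0, 36.9, 35.2]

r = pearson(temps, galactosylation)
print(f"correlation(temperature, galactosylation) = {r:.3f}")
```

A strong correlation like this would flag temperature as a parameter to control tightly; real programs would extend this to multivariate models across many CPPs and CQAs.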
Currently, early-stage automation supports the generation of high-producing clones capable of yielding optimum-quality molecules, says McMahon. In general, automation improves not only the overall development process, but also clinical and commercial manufacturing.
“Improved early small-scale process models have become increasingly important, and tools to improve the insight we get from these and provide data earlier in the development timeline are proving vital to de-risk early decisions,” says Colin Jaques, technical director of R&D, with Lonza Pharma & Biotech.
Catalent uses its proprietary GPEx and GPEx Boost cell line development technologies to generate high-yielding clones, Morris explains. “While clone selection is typically based on yield, we have matrixed in processes to evaluate clones for both yield and robustness. This approach combines the cell line technology with the ambr 15/ambr 250 bioreactors, and our advanced analytics to support a robust, scalable strategy,” he says.
Morris recalls one program in which multiple high-yielding clones came off the Berkeley Lights Beacon clonal selection system. “We were able to take many of those clones into our microbioreactors. However, if we had picked candidates solely based on yield, the third-ranked clone would have been missed, even though its yield increased at the CGMP scale. This clone has been robust in our failure-mode process, whereas the top two showed sensitivity to known failure modes that may not occur in the CGMP process,” he says. As this case clearly shows, a robust process relies on a robust clone, so selection is crucial, Morris adds.
High-throughput analytics makes key data (e.g., CQAs) available for decisions at this stage of cell line construction. “Mathematical models are essential in dealing with the volume of data arising from high-throughput scale-down systems,” Jaques explains. During the early stages of process development, more high-throughput screening (HTS) and design of experiment (DoE) approaches are being integrated, says Drugmand. He sees more process development teams leveraging mathematical modeling and advanced analytical technologies. “This approach helps increase our understanding of the process and reduce costs through automation while contributing to better process control,” he says.
In addition, Drugmand says, process development teams are routinely using statistical analysis and process analytical technologies (PAT) to review processes and results. “It’s now a common component of process development initiatives, which is a change from the past,” he says. Process modeling tools are also being explored for scalable DoE options, and scale-down models are being used to streamline process development and scale-up.
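As a rough illustration of the DoE approach described above, the sketch below enumerates a two-level full-factorial design and computes main effects from the measured responses. The factors, levels, and titer values are all hypothetical.

```python
# Sketch of a two-level full-factorial design of experiments (DoE) for
# upstream process development. Factors, levels, and responses are
# hypothetical examples, not from any program named in the article.
from itertools import product

factors = {"temperature": (35.0, 37.0), "pH": (6.8, 7.2), "feed_rate": (0.5, 1.0)}

# Enumerate all 2^3 = 8 factor-level combinations.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

def main_effect(design, responses, factor):
    """Difference between mean response at the high and low factor level."""
    low, high = factors[factor]
    half = len(design) / 2
    hi_mean = sum(r for run, r in zip(design, responses) if run[factor] == high) / half
    lo_mean = sum(r for run, r in zip(design, responses) if run[factor] == low) / half
    return hi_mean - lo_mean

# Hypothetical titers (g/L) measured for the eight runs, in design order.
titers = [2.1, 2.4, 2.0, 2.3, 2.9, 3.3, 2.7, 3.1]

for f in factors:
    print(f"{f}: main effect = {main_effect(design, titers, f):+.2f} g/L")
```

Screening designs like this rank which parameters matter most, so that follow-up optimization experiments can focus only on the influential factors.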
Catalent’s bioprocess development team has created many product specifications based on historical information on the molecule or type of molecules involved, says Morris. “We take in-process samples to ensure that we understand how process changes affect the product,” he says, noting use of systems such as ambr 15 and ambr 250, which allow the team to perform a DoE to reduce the chances of failure early in the development process.
“The availability of high-throughput, high-fidelity, small-scale bioreactor models has facilitated the use of response surface optimization approaches to process and medium development,” says Jaques. “Improvements in high-throughput product characteristics assays have enabled optimization techniques to be broadened out to more process outputs,” he says, “while response surface methodology allows propagation-of-error techniques to be used to improve process robustness in those platforms.”
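In its simplest one-factor form, the response surface optimization Jaques mentions amounts to fitting a quadratic model to experimental runs and reading off the predicted optimum. The factor, response, and data below are synthetic, and this numpy-based fit is only a minimal stand-in for a full response-surface-methodology workflow.

```python
# Sketch: one-factor response surface via a quadratic least-squares fit.
# Data are synthetic; in practice the runs would come from a
# high-throughput, small-scale bioreactor system.
import numpy as np

# Hypothetical medium component level (glucose, g/L) vs. product titer (g/L).
glucose = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
titer = np.array([1.8, 2.9, 3.4, 3.2, 2.5])

# Fit titer ~ a*glucose^2 + b*glucose + c.
a, b, c = np.polyfit(glucose, titer, deg=2)

# A concave fit (a < 0) peaks where the derivative is zero: x* = -b / (2a).
optimum = -b / (2 * a)
print(f"predicted optimum glucose setpoint = {optimum:.2f} g/L")
```

Real response surface work fits the same kind of polynomial over several factors at once, including interaction terms, and confirms the predicted optimum with verification runs.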
Lonza’s team routinely uses small-scale ambr systems, Tecan robots, and high-throughput analytics to select clones, screen purification resins, and identify optimal conditions, adds Mohindra. In addition, the company is installing more in-line process measurement technology, which he expects to further improve process control.
Process development specialists note a trend toward greater use of single-use technologies. “Biologics manufacturers are looking to these technologies to reduce capital expenditures and utilities consumption, increase flexibility, and reduce changeover times,” says Mohammed Elfar, product innovation manager at Univercells Technologies. Elfar notes that use of continuous processing is also increasing as a way to help boost productivity and reduce cost of goods sold. Yet the move to single-use unit operations has introduced technical challenges for process developers and scale-up teams. “Many modern sites can’t handle hot feed preparations, which can lead to challenges in developing concentrated feeds,” Jaques explains. “Furthermore, single-use bioreactors are often underpowered and undersized for gas flow, requiring that compromises be adopted for gassing strategies,” he says.
As high titers have become the rule upstream, recent years have brought a shift in focus from upstream to downstream process development. “Whole process considerations (instead of a focus on cell culture only) are helping to streamline efforts,” says Drugmand. “Unit operations are being intensified and integrated to optimize upstream and downstream work in parallel for overall process optimization,” he adds. “Process development scientists and engineers are being challenged to cope with larger harvests. This is especially true in the case of monoclonal antibodies, where dramatic titer increases have been observed over the years,” says Elfar.
There is also a greater emphasis on the interface between upstream and the downstream processes, says Jaques. “Perversely, this trend has led us away from small-scale high-throughput models for platform process development because there are no representative primary recovery or purification tools at that scale,” Jaques says. This shift has involved measuring more biological properties of the harvest materials and the development of models to understand the impact of the parameters on the quality of the feed stream going into and coming out of the downstream process, he explains.
Clearly, bioprocess development, like scale-up, has become more complex and challenging for the scientists and engineers who drive these efforts. However, new technologies, acquired process experience, and better access to relevant historical data are allowing biopharma teams to make the overall operation more systematic and less risky.
Vol. 44, No. 11
When citing this article, please refer to it as A. Shanley, "Bringing Bioprocess Development into Focus," PharmTech 44 (11) 2020.