Consistent API Quality Calls for Collaboration

Bioprocess understanding, the right equipment, and automation help, but multifunctional teamwork is the key to API production success.
May 02, 2018
Volume 42, Issue 5, pg 24–29

Biologic drug substances are large, complex molecules prepared from living systems via complex cell-culture or fermentation processes. Given this complexity, it can be difficult to achieve consistent and predictable performance, with numerous factors from raw material variability to operator training to equipment selection influencing process outcomes. Biopharmaceutical manufacturers continuously seek new manufacturing technologies and approaches to improve process robustness and enhance product consistency and quality. Recent pursuits include continuous processing, the implementation of process analytical technology (PAT) and automation systems, the adoption of single-use equipment, development of enhanced modeling capabilities, and leveraging advances in data management and analysis. In the end, however, maximizing the benefits of these new approaches and technologies requires extensive collaboration within multifunctional teams.

The people, process, and product matter

Uniform production of biologic drug substances from batch to batch requires consideration of three key attributes, according to Elise Mous, director of sales and marketing and business development for Capua BioServices: the people involved, including their training, know-how, experience, and flexibility; the process itself, because each has its own unique challenges, characteristics, and requirements; and the desired product, for which the final specifications determine the end-game.

 “The first step is to find the right approach needed to orchestrate all of the necessary requirements in the right manner to ensure a successful, scalable, and robust recipe,” she says. Those requirements include expression strain requirements/characteristics; raw material requirements; plant/equipment design needs; automation needs; training requirements; utilities requirements; key process parameters; regulatory constraints/requirements; and environmental, health, and safety constraints/requirements. For contract development and manufacturing organizations (CDMOs), much of this information can be learned either from past experience at the plant or from the client.

Attention to details can also make a significant difference in the robustness of a process, according to Mous. “It is important to analyze the potential risks and uncertainties upfront and define a custom onboarding trajectory for transfer, development, scale-up, and manufacturing campaigns, which includes definition and agreement of the joint objectives and journey required to achieve them,” she observes. Development of scalable, robust technologies as early as possible and establishment of a sampling plan and analytics are also important. Finally, attention should be paid to continuous improvement so that learning from the process and the data is ongoing. “Of course,” she adds, “safety and quality requirements can never be compromised.”

So does the amount of data

Because CDMOs work on both long-term commercial and short-term clinical and commercial projects, the amount of available process data can vary significantly. Fujifilm Diosynth Biotechnologies tailors its approach to uniformity and consistency control based on the amount of data that is gathered, according to Abel Hastings, director of process sciences at the company’s North Carolina facility.

 “For processes with long histories and substantial data sets, we can leverage traditional statistical control. For processes with smaller datasets, we aim to improve early responsiveness to process control and consistency monitoring by applying a logic-based, piece-wise approach,” he explains. This methodology requires three main steps: process parameter to attribute linkage, development of an ideal distribution of expected data, and comparison of data against expectations.

Mapping out the expected parameter-to-attribute linkages can form the basis for determining which data are most valuable, not only to confirm that the process is running reliably, but ultimately to enable tighter process control, even in circumstances where data are limited, according to Hastings. “This approach can also help teams focus on ways to capture the most informative data,” he adds.

Once parameter-attribute linkages are established, Fujifilm’s teams discuss and document the expected data results distribution. For example, they consider whether a normally distributed histogram or a sine-wave distribution should be expected, and whether the data might be skewed. “Developing expectations based on the factors and control elements can help teams establish a basic understanding of what ‘normal’ should look like,” Hastings explains. Once actual data are generated, they are compared to the expectations. “Even in the absence of statistical significance, this approach can allow processes to move toward increased process control and uniformity early in their lifecycles,” he observes.
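The three steps Hastings describes, linking parameters to attributes, stating an expected distribution up front, and checking new data against it, can be roughly illustrated in code. The sketch below is hypothetical, not Fujifilm’s actual method: the `Expectation` class, the pH bounds, and the batch values are invented for illustration, and the symmetry check stands in for the kind of logic-based, pre-statistical comparison the article describes.

```python
# Illustrative sketch of a logic-based, piece-wise check for small data sets,
# where too few batches exist to compute formal statistical control limits.
# All names, bounds, and values below are hypothetical.

from dataclasses import dataclass

@dataclass
class Expectation:
    """Documented expectation for one parameter-attribute pair."""
    name: str
    low: float                      # lower bound of the expected range
    high: float                     # upper bound of the expected range
    expect_symmetric: bool = True   # e.g., roughly normal rather than skewed

def check_batch(exp: Expectation, values: list[float]) -> list[str]:
    """Flag observations outside the expected range and, once enough points
    exist, note when the data all fall on one side of the expected center."""
    flags = []
    for i, v in enumerate(values):
        if not (exp.low <= v <= exp.high):
            flags.append(f"batch {i}: {exp.name}={v} outside [{exp.low}, {exp.high}]")
    if exp.expect_symmetric and len(values) >= 5:
        mid = (exp.low + exp.high) / 2
        above = sum(v > mid for v in values)
        if above == 0 or above == len(values):
            flags.append(f"{exp.name}: all points on one side of {mid}; "
                         "data may be skewed relative to expectation")
    return flags

# Usage: culture pH expected to sit symmetrically within 6.8-7.2
ph = Expectation("pH", 6.8, 7.2)
print(check_batch(ph, [6.9, 7.0, 7.3, 7.1, 7.0]))  # one flag: 7.3 is out of range
```

The point of the sketch is that the comparison logic is agreed and documented before the data arrive, so even a handful of batches can be judged against an explicit notion of “normal.”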

And how it is acquired

Effective application of that actual data is essential to enabling meaningful interpretation, Hastings adds. “Most major equipment manufacturers have developed strong and deep data systems that gather information to help ensure reliable operation. In some instances, however, more data does not lead to more knowledge,” he asserts.

To ensure the right interpretation can be made, Fujifilm recommends assessing each data acquisition against the corresponding parameter-attribute pair. “Teams should aim to identify any points of ambiguity. Common points of concern can result from misalignment between parameters and attributes, limits of resolution in data, and data acquisition frequency,” Hastings explains.

An example of a parameter/attribute misalignment would be the difference between a temperature requirement for a shake flask operation and data recording of the temperature of an incubator. “While the temperatures may be the same, they are not the same element,” he says. Issues with data resolution can arise with many digital read-outs, which may clip or round data, potentially resulting in over-interpretation or under-appreciation of the degree to which the data may be skewed.

Finally, Hastings notes that it is important for users to recognize that automated data systems that record at intervals may not give an accurate reflection of processes that change at higher frequency. “Teams should consider the rate of change of a process step before selecting a sampling frequency,” he says. To avoid these and similar risks, Fujifilm strongly recommends a cross-functional review of processes, from scientist to automation engineer, to ensure all data risks are considered.
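The sampling-frequency risk can be shown with a small simulation. The temperature profile, spike timing, and recording intervals below are hypothetical values chosen for illustration: a brief two-minute excursion disappears entirely from a record sampled every 15 minutes.

```python
# Illustrative sketch (hypothetical values): a recording interval that is
# too coarse relative to the rate of change of a process can hide a brief
# excursion entirely.

def process_temp(t: int) -> float:
    """Hypothetical true temperature (deg C): steady at 37.0, with a
    2-degree spike between t=10 and t=12 minutes."""
    return 39.0 if 10 <= t < 12 else 37.0

def record(interval_min: int, duration_min: int = 60) -> list[float]:
    """Simulate an automated data system sampling every `interval_min` minutes."""
    return [process_temp(t) for t in range(0, duration_min, interval_min)]

coarse = record(15)   # samples at t = 0, 15, 30, 45 -> spike falls between samples
fine = record(1)      # samples every minute -> spike is captured

print(max(coarse))    # 37.0: the excursion never appears in the record
print(max(fine))      # 39.0
```

In the coarse record the process looks perfectly uniform, which is exactly the false confidence Hastings warns against when the sampling frequency is chosen without considering the step’s rate of change.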

Understand your equipment needs

In addition to the types and amounts of data and the way they are collected, the selection of biomanufacturing equipment can have a substantial impact on process robustness. Which aspects are most important can vary from one process to another, according to Gerald Hofmann, head of technology development at Capua BioServices. Moreover, the design attributes to be considered should not be limited to performance during the actual cell-culture or fermentation process. “The hygienic design of the equipment, its maintenance needs, and what is required for an effective cleaning regime must also be well understood,” he says.

Perhaps most importantly, selecting the equipment design that will provide the most robust process cannot be achieved without a proper understanding of the full process requirements, according to Hofmann. “Making sure that these process requirements, and the needs of the user, are also well understood by equipment manufacturers can also have a measurable impact,” he comments. New equipment should therefore be selected to be fit for a defined purpose, in accordance with user requirements and critical process parameters.

When working with existing equipment, it is important to understand the opportunities and limits presented by the design and the potential for any modifications that can make the equipment more fit-for-purpose. Appropriate installation qualification, operational qualification, and process qualification protocols should also be prepared, reviewed, approved, and followed; and a proper maintenance plan should be developed and implemented, according to Hofmann.

Develop an automation strategy

Automation helps to ensure consistency by minimizing human errors and enabling repeatability and reliability of operations from one batch to another. Increasing adoption of automation and PAT in biopharmaceutical manufacturing is intended to reduce drug variability, increase yield, drive down costs, and maximize safety.

As CDMOs, both Fujifilm and Capua BioServices have a unique perspective because they work with a spectrum of processes, including products undergoing their first at-scale batches, decade-old processes being relocated to another facility, and commercial processes with a long history of continuous improvement. “The strategy necessary to ultimately implement automation ideals can often be challenging for processes striving to simply produce clinical batches,” says Hastings. “It is beneficial to find the right balance between manual versus automated operations; continuously evaluating investment versus added value,” adds Mous.

“When installing new processes into our facility, we aim to consider the full lifecycle of the process while also meeting the short-term goals of the project. To do so and overcome some of the challenges with implementation of automation, we design our process development work to simulate what can be both implemented as a manual operation and be transitioned into an automated step,” says Hastings.

FDA’s Manufacturing Science and Innovation Center of Excellence

In response to the growing complexity and globalization of biopharmaceutical manufacturing, in 2017 FDA’s Office of Pharmaceutical Quality (OPQ) established the Manufacturing Science and Innovation Center of Excellence (CoE) “to promote internal and external scientific collaboration in manufacturing science, facilitate research communication and management, and advance OPQ’s research culture and capabilities in manufacturing science” (1). The CoE is intended to build on FDA’s history of research on bioprocessing issues, particularly viral clearance, quality by design, bioreactor control, and unit operation linkage.

The CoE’s mission is to “promote biopharmaceutical science and innovation by addressing cross-cutting science and regulatory issues in bioprocessing through research that supports guidance and policy development,” which it will accomplish by identifying technology gaps in bioprocessing methods and directing research to address them, with a focus on improving robustness and eliminating potential failure modes associated with common unit operations. Some currently perceived gaps include real-time monitoring of protein critical quality attributes, adventitious agent control, viral clearance robustness, and the linkage between upstream and downstream processes.

FDA also believes that the CoE can help address industry competitiveness. In addition, because advances in manufacturing technologies can simultaneously improve efficiency/productivity and create regulatory challenges, the CoE will enable FDA scientists/reviewers to better understand new technologies as they are being developed. As a result, the agency will be better positioned to evaluate their potential impacts on quality and safety.

Reference

1. K. Brorson and S. L. Lee, “OPQ Establishes Manufacturing Science CoE,” PDA Letter, Jul 10, 2017.

Cross-functional teams are essential

Underlying all activities related to the development of robust processes for the production of biologic drug substances is the need for collaboration among scientists and engineers involved in all aspects of a project. “Selection, training, and coaching of the project team, from the operators to the process engineers to the plant engineers to the analytical scientists, is essential to achieving effective processes. Success is not possible without such a multidisciplinary approach,” asserts Mous.

In many cases, finding the right solutions requires a bit of creative thought and collaborative work between the process development teams and manufacturing and/or quality-control operators, agrees Hastings. “We strongly recommend that cross-functional teams spend time in one-another’s area in order to build a solid understanding of flexibility needs, constraints, and long-term opportunities,” he comments.

Through this type of collaboration, it is possible for teams to find the right balance between speed, risk and reward, and cost, according to Mous. “In some cases, a project can be started quickly with a higher risk, while in others, investment in proper preparation upfront is the more appropriate approach,” she explains. Cross-functional teams are also better able to determine whether making facility or process modifications is more practical. “These teams know how to look for the best fit in terms of not only the best fit with the process, but the best fit with the budget,” Mous concludes.

Article Details

Pharmaceutical Technology
Vol. 42, No. 5
May 2018
Pages: 24–29

Citation

When referring to this article, please cite it as C. Challener, “Consistent API Quality Calls for Collaboration,” Pharmaceutical Technology 42 (5) 2018.
