Managing data at the different stages of the lifecycle, linking disparate systems together, and making the right data available to those who need it remain problematic and time-consuming.
The bio/pharmaceutical industry operates under an intricate system of regulations for a good reason: the protection of patient safety. Even so, few would dispute that, in general, the industry lags far behind others in its manufacturing practices and the speed of its process development.
Regulations are often cited as the reason the industry is so risk averse, but this is misleading. Regulatory agencies are not only open to new ideas such as continuous manufacturing; in many cases, they are actively promoting them. FDA's Center for Drug Evaluation and Research (CDER), for example, created an Emerging Technology Program to promote innovative approaches to modernizing the industry, including continuous technologies and novel dosage forms (1).
In 2011, FDA published an updated guidance document, Process Validation: General Principles and Practices, which replaced the previous 1987 version (2). The changes reflect an increased emphasis on a scientific, data-driven approach to process validation and a more holistic view of the product lifecycle, which includes the following stages: process design (Stage 1), process qualification (Stage 2), and continued process verification (Stage 3).
The core of each stage is “the collection and evaluation of data…which establishes scientific evidence that a process is capable of consistently delivering quality products” (2). The guidelines are explicitly aligned with the risk-based approach outlined in International Council for Harmonisation (ICH) Q8(R2), Q9, and Q10, including quality by design (QbD).
Prior to this, process validation was generally considered a commercialization activity and typically involved running three batches to demonstrate that the process was reproducible and capable of delivering consistent product quality. The updated guidelines recommend the use of statistical techniques to demonstrate process capability and emphasize the importance of providing “sufficient statistical confidence of quality both within a batch and between batches” (2). It is well established that, for biologics, the process is the product; therefore, ensuring that the process is well characterized is essential for maintaining product quality.
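To make the statistical language concrete, the sketch below shows one simple way to summarize variation both within and between batches from release-testing data and to compute an overall process performance index (Ppk). It is a minimal illustration, not the approach prescribed by the guidance; the batch names, measurements, and specification limits are invented for the example.

```python
import numpy as np

# Hypothetical release-assay results (e.g., % purity) for three batches.
batches = {
    "B001": [98.1, 97.9, 98.4, 98.2],
    "B002": [97.6, 97.8, 98.0, 97.7],
    "B003": [98.3, 98.5, 98.1, 98.4],
}
LSL, USL = 97.0, 99.5  # illustrative specification limits

# Within-batch view: mean and standard deviation per batch.
for name, values in batches.items():
    v = np.asarray(values)
    print(f"{name}: mean={v.mean():.2f}, sd={v.std(ddof=1):.3f}")

# Overall (between-batch) view: pool all results and compute
# Ppk = min(USL - mean, mean - LSL) / (3 * overall sd).
pooled = np.concatenate([np.asarray(v) for v in batches.values()])
mu, sigma = pooled.mean(), pooled.std(ddof=1)
ppk = min(USL - mu, mu - LSL) / (3 * sigma)
print(f"Overall: mean={mu:.2f}, sd={sigma:.3f}, Ppk={ppk:.2f}")
```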
It has been seven years since these updates were published, so how has the industry changed?
Most companies are using statistical software for design-of-experiments (DOE) and multivariate data analysis. There have been some advances in the application of QbD principles earlier in the development lifecycle as well as continued process verification (CPV) for commercial products. The value of bioprocess scale-down models for development and troubleshooting is recognized by both pharmaceutical companies and regulatory agencies, but the predictive capabilities of these models must be justified with real data. The core challenge of systematically capturing and analyzing data across the development lifecycle remains an obstacle to progress.
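As a small illustration of the kind of DOE workflow such software supports, the following sketch builds a two-level full-factorial design for three hypothetical bioreactor factors and fits a main-effects model by least squares. The factor names and titer responses are invented; real studies would, of course, use measured data and a richer model.

```python
import itertools
import numpy as np

# Three hypothetical factors at coded low/high (-1/+1) levels.
factors = ["temperature", "pH", "feed_rate"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Invented titer responses (g/L), one per run of the 2^3 design.
response = np.array([1.8, 2.1, 1.9, 2.4, 2.0, 2.6, 2.2, 3.0])

# Fit intercept + main-effect coefficients by ordinary least squares.
# (In classical DOE terms, the main effect is twice the coded coefficient.)
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)

print("intercept:", round(coef[0], 3))
for name, c in zip(factors, coef[1:]):
    print(f"{name} coefficient (coded units): {c:+.3f}")
```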
As anyone who has tried to develop or implement a system for managing bioprocess data knows, capturing and analyzing data systematically across the lifecycle is a complex challenge that requires linking a tangled web of relationships across organizational boundaries. Software systems designed for research and development, such as electronic lab notebooks (ELNs), are fundamentally different from systems designed for manufacturing, such as manufacturing execution systems (MES), for good reasons.
In the process design and development stage, scientists need flexibility to make changes as experiments and projects progress and also need the freedom to design and run different types of experiments, such as evaluating a new type of equipment or consumable. By the time the process progresses to process qualification for manufacturing, the emphasis shifts from flexibility to recording specifics such as operating conditions, procedures, and controls.
Finding and reusing data from previous development or manufacturing runs is also a challenge. In many cases the “glue” that links the different process validation stages together consists of Microsoft Excel spreadsheets and PowerPoint or Word documents. While most companies have invested in a variety of systems to manage data at the different stages of the lifecycle, linking those disparate systems together and making the right data available to those who need it is still problematic and time consuming.
Capturing process and analytical data in context needs to happen automatically in both development and manufacturing. Any time this requires extra work, such as manually updating spreadsheets or transferring data between systems, it increases the probability of missing or inaccurate data and compromises data integrity. The last thing companies want is for scientists to painstakingly go through batch records and transcribe data into the appropriate format to support CPV, but this is still happening.
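A minimal sketch of what automatic, in-context capture can replace: instead of re-typing values from batch records, a script reads a structured export and derives CPV trending limits directly. The file name, column names, and individuals-chart limits (mean plus or minus 2.66 times the average moving range) are assumptions for illustration, not a description of any particular vendor's system.

```python
import csv
import statistics

# Hypothetical export from a manufacturing system: one row per batch,
# with the batch ID and a critical quality attribute already in context.
with open("batch_export.csv") as f:
    rows = list(csv.DictReader(f))
values = [float(r["cqa_result"]) for r in rows]

# Individuals-chart limits for CPV trending:
# center line = mean, limits = mean +/- 2.66 * average moving range.
center = statistics.fmean(values)
moving_ranges = [abs(a - b) for a, b in zip(values[1:], values)]
mr_bar = statistics.fmean(moving_ranges)
lcl, ucl = center - 2.66 * mr_bar, center + 2.66 * mr_bar

for r, v in zip(rows, values):
    flag = "ok" if lcl <= v <= ucl else "OUT"
    print(f"{r['batch_id']}: {v:.2f} [{flag}]")
print(f"center={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}")
```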
On paper, implementing a system to effectively automate data collection across the process validation lifecycle sounds simple, yet the reality is anything but.
One of the biggest challenges is standardizing bioprocess data across development stages and technology platforms to make it more accessible. While a single unified data standard is a long way off, initiatives such as Allotrope for analytical data and the BatchML/B2MML implementations of the ISA-88/ISA-95 manufacturing standards are showing progress.
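The sketch below illustrates, at a much smaller scale, the flavor of problem these initiatives address: two systems report the same process parameter under different names and units, and a mapping layer translates both into one shared vocabulary. The system names, field names, and target terms are hypothetical and are not drawn from the Allotrope or BatchML/B2MML specifications.

```python
# Hypothetical raw records from two systems describing the same measurement.
eln_record = {"param": "Temp_degC", "value": 36.8}
mes_record = {"tag": "TIC-101.PV", "value": 310.0, "unit": "K"}

# A simple, hand-maintained mapping into a shared vocabulary; real standards
# (Allotrope, BatchML/B2MML) define such vocabularies far more rigorously.
def to_common(record):
    if record.get("param") == "Temp_degC":
        return {"parameter": "temperature", "value": record["value"], "unit": "degC"}
    if record.get("tag") == "TIC-101.PV" and record.get("unit") == "K":
        return {"parameter": "temperature",
                "value": record["value"] - 273.15, "unit": "degC"}
    raise ValueError(f"no mapping for record: {record}")

print(to_common(eln_record))
print(to_common(mes_record))
```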
Making the vision of a fully digitized process validation lifecycle a reality will require collaboration between biopharmaceutical companies and software vendors. Industry alignment, such as the position paper on CPV produced by the BioPhorum Operations Group, will be key to moving away from highly customized point solutions toward more holistic, integrated solutions from which the entire industry can benefit (3).
The drivers for digitizing the process validation lifecycle are clear. It is a prerequisite for continuous manufacturing, which many see as the next big step for the industry. For products already on the market, knowledge acquired through both experimental and production runs provides a mechanism to troubleshoot production problems as well as to support post-approval changes. For new products, there are significant gains to be realized from speeding up the production of clinical-trial materials, either to “fail faster” or, ideally, to reach the market sooner. This last benefit is the most important of all for the patients who are waiting for new biopharmaceutical therapeutics to become available.
1. S. Lee, “Modernizing the Way Drugs Are Made: A Transition to Continuous Manufacturing,” https://www.fda.gov/Drugs/NewsEvents/ucm557448.htm, accessed June 18, 2018.
2. FDA, Guidance for Industry, Process Validation: General Principles and Practices (CDER, January 2011), https://www.fda.gov/downloads/Drugs/Guidances/UCM070336.pdf
3. BioPhorum Operations Group, “Continued Process Verification: An Industry Position Paper with Example Plan,” https://www.biophorum.com/wp-content/uploads/2016/10/cpv-case-study-interactive-version.pdf, accessed June 18, 2018.