Optimizing Tech Transfer with Advanced Analytics

Published in Pharmaceutical Technology, Volume 45, Issue 2 (February 2021), pp. 31–33.

Sharing process insights across the stages of drug development improves tech transfer.

A pharmaceutical product can take many years to move from drug discovery to approval. Starting in research and development (R&D), significant amounts of data and process information are collected. This information is generated by many departments, each of which may operate in a silo. While the results of the process design are shared, often the underlying data are not, resulting in a loss of knowledge continuity and suboptimal tech transfer.

This problem and other related issues can be addressed through the intelligent application of advanced analytics software to coordinate data sharing and insights between teams and functional areas. This article will explore some examples of how such an analytics application can be used to support knowledge capture and collaboration, from the lab to the commercial manufacturing site.

Gaps in the flow of knowledge

As a drug moves into preclinical and clinical trials, the development team verifies that the process can scale to pilot and ultimately to commercial manufacturing, all without changes to critical process parameters that could impact product quality. Pilot-scale data are usually stored in a process historian and compared to laboratory and process analytical technology (PAT) data, which typically reside in disparate data sources. As a result, it is often difficult to compare the pilot data directly to R&D studies: each data set must be extracted from its respective source and aligned with the others before analysis can begin. Only after this data wrangling can a retrospective analysis of scalability be performed.
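
As an illustration of this data-wrangling step, the short Python sketch below aligns sparse laboratory results with resampled historian signals so they can be compared against R&D studies. The file names, column names, and tolerance are hypothetical placeholders, and any general-purpose analytics tooling could perform the same alignment.

```python
# A minimal alignment sketch, assuming the pilot historian and lab results
# have been exported to CSV files (names and columns are hypothetical).
import pandas as pd

# Pilot-scale historian data: high-frequency process signals indexed by time
pilot = (pd.read_csv("pilot_historian.csv", parse_dates=["timestamp"])
           .set_index("timestamp")
           .sort_index())

# Laboratory / PAT results: sparse measurements with their own timestamps
lab = (pd.read_csv("lab_results.csv", parse_dates=["sample_time"])
         .sort_values("sample_time"))

# Resample the historian signals to a common interval, then pair each lab
# sample with the most recent process readings; the tolerance guards against
# matching samples taken far from any process data.
pilot_1min = pilot.resample("1min").mean(numeric_only=True)
aligned = pd.merge_asof(
    lab,
    pilot_1min.reset_index(),
    left_on="sample_time",
    right_on="timestamp",
    direction="backward",
    tolerance=pd.Timedelta("15min"),
)

# The aligned frame can now be compared against R&D studies, for example by
# summarizing a critical process parameter per batch.
print(aligned.groupby("batch_id")["temperature"].describe())
```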

Finally, if the product makes it through all phases of clinical trials and gets the stamp of approval from FDA or another agency, it enters commercial manufacturing, where the recipe and process setpoints are transferred to internal manufacturing sites or to external contract manufacturers. Failure modes for process safety and quality are evaluated using cause and effect analysis, but for the most part, the manufacturing site is starting from scratch to collect data for process efficiency improvements. In the event of a process deviation, R&D staff may be contacted for additional information on a particular failure mode, but commercial-scale manufacturers are largely unable to mine data or knowledge from the entire development process. By the time a manufacturing deviation occurs, R&D scientists may have moved on to other projects, so the process to get materials, set up analytical methods, and provide additional information is inefficient. There is also an opportunity loss when R&D personnel are not able to work on new products because they must investigate issues with older projects. Despite these difficulties, it is critical to find the root cause of a deviation to prevent repeat deviations.

Data connectivity, integrity, and auditing

To support analyses across all stages of drug development and manufacturing, subject matter experts (SMEs) must be able to easily access data across a product’s full lifecycle, so that laboratory, pilot, and commercial manufacturing teams can perform comparable analyses and use the knowledge captured in tech transfer. It is crucial to minimize time spent cleansing and aligning data sets so these experts can reach insights faster. By connecting these disparate data sets to an analytics application, scientists, developers, and manufacturers can quickly find, explore, quantify, and document results about their processes to support tech transfer.

Data connectivity alone, however, isn’t enough, as data integrity is also of prime importance to guarantee safe and effective drug production. Information security considerations for authentication and authorization are necessary to ensure only designated individuals can access the system and interact with the appropriate data.

Tracking changes to calculations created in an analytics application is another key factor for regulatory compliance. Data administrators must be able to provide evidence that data are used properly, especially when making decisions for releasing a production run or changing parameters during manufacturing.

To support these types of efforts, companies must establish a designated good manufacturing practice (GMP) computing environment to use for production decisions. This environment may be an entirely separate system with its own connections to the data sources. Or, it may exist within the system used for engineering, as long as the system has access control settings to limit user edits of validated data, along with the ability to keep an audit log of changes made to calculations and other analytics configurations. 

In either case, standard operating procedures and user permissions are leveraged to maintain the integrity of the GMP content and to provide SMEs with a different workspace in which to iterate through in-progress analyses. 

Therefore, advanced analytics applications must provide secure connectivity to live, streaming, validated data, with controls for traceability and auditing. Calculations must be transparent and reproducible to provide easy understanding of exactly how the underlying data were processed.

Preparing for commercial manufacturing

Advanced analytics applications improve knowledge capture to support the technology transfer process by making experimental information available to commercial manufacturing personnel and other departments. Knowledge transfer is maximized by connecting to data sources at R&D, pilot, and manufacturing scale to overlay experiments with batches, and by providing tools to capture in-depth process learnings during scale-up.

In an example of a continuous twin-screw granulation process, an advanced analytics application was used to analyze the data from a design of experiments and to build a quality-by-design (QbD) multivariate design space around the process for ultimate use in commercial manufacturing. This model was developed by cleansing the experimental data to align upstream and downstream process signals in time, and then limiting the model inputs to steady-state operation.
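
The sketch below illustrates these two cleansing steps: putting the signals on a common time grid and compensating for residence time so upstream and downstream measurements line up, then keeping only steady-state operation. The signal names, residence time, and variability threshold are illustrative assumptions rather than values from the study.

```python
# A simplified cleansing sketch; column names and thresholds are assumed.
import pandas as pd

# Put all DOE signals on a regular 1-second grid
data = (pd.read_csv("granulation_doe.csv",
                    parse_dates=["timestamp"], index_col="timestamp")
          .sort_index()
          .resample("1s")
          .mean(numeric_only=True))

# 1. Align downstream with upstream signals by compensating for the assumed
#    material residence time through the twin-screw granulator.
residence_time_s = 90  # assumed residence time in seconds
data["granule_moisture_aligned"] = data["granule_moisture"].shift(-residence_time_s)

# 2. Keep only steady-state operation: the rolling variability of the feed
#    rate must stay below a small threshold.
rolling_std = data["feed_rate"].rolling("10min").std()
steady = data[rolling_std < 0.05].dropna(subset=["granule_moisture_aligned"])

# "steady" now holds the cleansed inputs used to fit the design-space model.
```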

These inputs were used in a multivariate regression model to determine their influence on critical quality attributes (CQAs). The multivariate QbD model was then deployed in commercial manufacturing to provide a monitoring view of the process, flagging deviations from the defined quality specifications as they occurred to allow quick remediation (see Figure 1).
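
Continuing from the steady-state frame in the previous sketch, the example below stands in for the multivariate model with a simple least-squares regression and shows how such a model can flag out-of-specification predictions. The input names, the quality attribute, and the specification limits are all hypothetical.

```python
# Illustrative regression and monitoring check; not the model from the study.
from sklearn.linear_model import LinearRegression

# "steady" is the steady-state DataFrame prepared in the previous sketch
doe = steady[["screw_speed", "feed_rate", "liquid_to_solid_ratio",
              "granule_d50"]].dropna()
X = doe[["screw_speed", "feed_rate", "liquid_to_solid_ratio"]]
y = doe["granule_d50"]  # critical quality attribute (assumed)

model = LinearRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_)))  # influence of each parameter

# Monitoring view: predict the CQA from live process data and flag values
# outside the (hypothetical) quality specification for quick remediation.
spec_low, spec_high = 150.0, 250.0  # assumed spec limits, µm

def check_batch(live_inputs):
    """Return rows whose predicted quality attribute is out of specification."""
    pred = model.predict(live_inputs[X.columns])
    out_of_spec = (pred < spec_low) | (pred > spec_high)
    return live_inputs.loc[out_of_spec].assign(predicted_d50=pred[out_of_spec])
```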

Continued process verification through statistical control charting

By proactively monitoring manufacturing processes, pharmaceutical companies can control variation to ensure product quality. Statistical control charting is used to support continued process verification (CPV), ensuring that processes are executed correctly and consistently.

Control limits change depending on which product recipe is being run at the manufacturing site. It is, therefore, important to identify both the parameter to monitor and the associated product campaigns so that averages and standard deviation limits can be calculated for each product, as shown in the example in Figure 2 and sketched below. After the sigma limits are created, run rules can be applied to search for process excursions, and for trends that may provide early warning of excursions.
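
A minimal sketch of the per-product limit calculation described above is shown here, assuming a batch summary table with hypothetical product, batch, and result columns.

```python
# Per-product control limits (mean ± 3 sigma); table and columns are assumed.
import pandas as pd

batches = pd.read_csv("batch_summary.csv")  # one row per batch

limits = (batches
          .groupby("product")["assay_result"]
          .agg(mean="mean", std="std")
          .assign(ucl=lambda d: d["mean"] + 3 * d["std"],
                  lcl=lambda d: d["mean"] - 3 * d["std"]))

# Compare each batch against the limits for its own product
checked = batches.join(limits, on="product")
excursions = checked[(checked["assay_result"] > checked["ucl"]) |
                     (checked["assay_result"] < checked["lcl"])]
print(excursions[["product", "batch_id", "assay_result"]])
```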

Once the logic for the CPV control charts and the desired run rules is defined, it can be applied to any period of time, or even run online to track batch-to-batch variation in near real-time.
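
As a sketch of how such run rules might be evaluated, the function below checks two common rules, a point beyond the sigma limits and eight consecutive points on the same side of the mean, over any chronologically ordered series of results; re-running it over a sliding window of recent batches approximates online use.

```python
# Two illustrative run rules; limits come from the previous per-product sketch.
import pandas as pd

def run_rules(values: pd.Series, mean: float, ucl: float, lcl: float) -> pd.DataFrame:
    """Evaluate simple run rules over a chronologically ordered series."""
    beyond_limits = (values > ucl) | (values < lcl)

    # Eight-in-a-row rule: count consecutive points above or below the mean
    above = values > mean
    run_id = (above != above.shift()).cumsum()  # new id each time the side flips
    run_len = above.groupby(run_id).cumcount() + 1
    same_side_run = run_len >= 8

    return pd.DataFrame({"beyond_limits": beyond_limits,
                         "same_side_run": same_side_run}, index=values.index)
```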

Ongoing process monitoring

To support the pharmaceutical tech transfer process, SMEs must be able to connect to data from many different process, lab, maintenance, and manufacturing sources to perform analyses. Once these data are collected, engineers and scientists can calculate statistical limits, key process indicators, and aggregations using analytics tools. Important trends and metrics can then be assembled into static batch reports or live dashboards that continually refresh with the latest data, enabling ongoing process monitoring in near real-time.
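
As one illustration, the snippet below computes simple per-batch key performance indicators (KPIs) that could feed such a dashboard; the event table and KPI definitions are assumptions made for the example.

```python
# Illustrative per-batch KPI aggregation; source file and columns are assumed.
import pandas as pd

events = pd.read_csv("batch_events.csv", parse_dates=["start", "end"])

kpis = (events
        .assign(cycle_time_h=lambda d: (d["end"] - d["start"]).dt.total_seconds() / 3600,
                yield_pct=lambda d: 100 * d["output_kg"] / d["input_kg"])
        .groupby("batch_id")[["cycle_time_h", "yield_pct"]]
        .mean())

# Re-running this aggregation on a schedule keeps a dashboard current with
# the latest batches.
print(kpis.tail())
```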

These dashboards and reports promote knowledge capture and collaboration across organizations by bringing together multiple analyses, often configured by a team of process experts. These reports can be reviewed by operators, engineers, and managers to ensure each batch is progressing within the limits specified during the drug development and pilot scale-up processes. Using the digital tools that make data visible across the product lifecycle, any observed deviations can be analyzed against lab data to search for similar issues and learnings.

Conclusion

Throughout the product lifecycle, substantial knowledge is gathered about the process through R&D experiments and scale-up batches. Advanced analytics applications enable faster and more effective tech transfer, shortening the time required for handoffs between departments within an organization. During tech transfer, advanced analytics applications can help researchers understand the relationships among variables and determine critical process parameters at laboratory scale, as well as verify scalability and optimize the process as it moves into pilot production. In addition, these learnings can prove critical for monitoring process variability and quality and for performing deviation investigations at commercial scale.

Advanced analytics applications thus empower SMEs to document their analyses, capture information, and mine previous information from colleagues to speed up the process development timeline and improve efficiency at commercial scale.

About the author

Emily Johnston is a senior analytics engineer with Seeq.

Article details

Pharmaceutical Technology
Vol. 45, No. 2
February 2021
Pages: 31–33

Citation

When citing this article, please refer to it as E. Johnston, "Optimizing Tech Transfer with Advanced Analytics," Pharmaceutical Technology 45 (2) 2021.