From Data to Information

Making siloed data accessible across functions and to contract partners is the first step to facilitating continuous improvement and enabling use of artificial intelligence in manufacturing.

Biopharmaceutical manufacturers have often described operations as data rich but information poor. While advanced analytics and sensors collect more data than ever before, much of it may not be used or shared with the operations that need it most to prevent lost batches and quality or compliance problems. The rise in outsourcing has only intensified the challenge.

IDC Health Insights surveyed 126 biopharmaceutical and pharmaceutical executives in the United States and the United Kingdom and found a significant gap between their needs and their strategies for harnessing data (1). More than 98% of respondents said that cross-functional data access was important or very important to their business strategies, and 94% described the ability to apply advanced analytics and/or artificial intelligence the same way. Respondents believed that data access would be crucial to improving overall quality and productivity as well as the return on their R&D investments.

Lack of clear strategy

However, 51% of those surveyed said that they did not have a clear strategy in place to help them reach either of those goals, citing regulatory uncertainty, budget prioritization, and the need for more action from functional operational groups. As Kevin Julian, senior managing director in Accenture’s Life Sciences practice, the survey’s sponsor, commented, “Important insights that could lead to the discovery, development, and delivery of promising new treatments are too often trapped within the functional silos of ... biotechnology companies” (2).

While some pharmaceutical manufacturers are still using paper-based record systems, a growing number are digitizing processes and making more data accessible in the contexts where it is most useful. On a fundamental level, open control systems and a common data structure have enabled this shift, according to a recent report from Rockwell Automation (3), resulting in distributed control systems (DCS) that integrate over unmodified Ethernet and allow two-way communication between enterprise resource planning (ERP) and manufacturing execution systems (MES). Work is underway to improve collaboration in preclinical and quality labs, where good laboratory practices (GLPs), rather than good manufacturing practices (GMPs), drive operations (Sidebar, p. s14). Much progress is being made in the area of batch records and in expanding connections between electronic lab notebooks, laboratory information management systems, MES, and ERP to increase access to information. Augmented reality offers one way to do this at the basic data recording and retrieval level.

In addition, according to Emerson Process Management’s consultant Johan Zebib, quality review management and review by exception are being used with MES (4). Data historians can also be developed as a point of access, a concept that Eli Lilly has leveraged with its contract manufacturers in medical devices (5). Vendors are offering tools that make this task easier, with applications that use machine learning and other facets of artificial intelligence, allowing users to make connections between data points that might otherwise have seemed unrelated.
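
To make the historian-as-access-point idea concrete, the minimal sketch below assumes hypothetical CSV exports from a process historian and a quality system (the file names, column names, and tags are invented for illustration, not any vendor's interface) and uses pandas to line up per-batch process summaries with batch outcomes so that unexpected relationships become visible.

```python
import pandas as pd

# Hypothetical exports: a historian dump of process tags and a quality-system
# export of batch outcomes. File and column names are illustrative only.
historian = pd.read_csv("historian_export.csv", parse_dates=["timestamp"])
quality = pd.read_csv("quality_export.csv")      # one row per batch

# Summarize each batch's process tags (mean temperature, peak pressure, etc.).
batch_summary = historian.groupby("batch_id").agg(
    mean_temp=("temperature_c", "mean"),
    peak_pressure=("pressure_bar", "max"),
    agitation_var=("agitation_rpm", "var"),
)

# Join process summaries with quality outcomes and look for correlations
# between tags and yield that might otherwise stay hidden in separate silos.
combined = batch_summary.join(quality.set_index("batch_id")["yield_pct"])
print(combined.corr(numeric_only=True)["yield_pct"].sort_values())
```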

Augmented batch records

Apprentice.io has developed augmented reality and database applications that allow users to get feedback as they perform their jobs and to compare equipment performance and different batch runs (6). Contract development and manufacturing organizations (CDMOs) are becoming a more important market for the technology, says CEO Angelo Stracquatanio; they use it not only to share data but also for training and troubleshooting in real time. “There are so many silos within manufacturing, and data are not being leveraged at different levels,” he says.

In 2019, the company improved its augmented reality offering with augmented batch recordkeeping products that extend batch connectivity to laboratory information management systems (LIMS) and electronic lab notebooks (ELNs), allowing inputs to be captured for every batch. Users can analyze process data to compare batch runs and implement continuous improvement programs, or analyze specific runs to isolate deviations, Stracquatanio says, leaving an audit trail of data that can be mined to decrease variability.

A growing number of CDMOs are using Apprentice.io’s Tandem remote telepresence tool to collaborate with their clients. The Apprentice System also collects voice, picture, and other types of data to create a rich audit trail. “For example, for a single-use filter, one can scan its bar code. Users now know what filter that is and can create an audit trail for it … and put data together in real time, using a hierarchy of importance to present data in a way that doesn’t overwhelm the user,” says Stracquatanio.
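
As a rough illustration of the kind of audit trail described here (a hypothetical sketch, not Apprentice.io's implementation), a bar-code scan can be captured as a structured event the moment it happens, with an importance field that later decides how prominently it is shown to the user:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEvent:
    """One entry in a batch audit trail; fields are illustrative."""
    batch_id: str
    component_barcode: str
    event_type: str                     # e.g., "filter_scanned", "deviation"
    importance: int                     # 1 = routine, 3 = show immediately
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

audit_trail: list[AuditEvent] = []

def record_scan(batch_id: str, barcode: str, importance: int = 1) -> None:
    # Append a scan event; a real system would write to a tamper-evident
    # store rather than an in-memory list.
    audit_trail.append(AuditEvent(batch_id, barcode, "filter_scanned", importance))

record_scan("B-1042", "SU-FILTER-008871")
# Present the most important events first so the operator is not overwhelmed.
for event in sorted(audit_trail, key=lambda e: e.importance, reverse=True):
    print(event)
```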

Artificial intelligence

Use of advanced analytics and artificial intelligence (AI) is growing in the pharmaceutical industry (Sidebar, p. s15). However, many companies are still at the earliest stages of developing strategies for using it. So far, the technology is farther along in clinical trials and in discovery. Accenture launched INTIENT in May 2019 to focus on discovery, clinical, and pharmacovigilance applications. Amgen has been working with Tata Consultancy Services on a Holistic Lab digital platform, using Dassault Systèmes’ BIOVIA, for process development (7).

However, some vendors are focusing on pharmaceutical manufacturing applications. Quartic.ai, for example, has launched an AI-driven platform to provide feedback to operators and to monitor and improve processes (8). The platform includes a data engine designed to extract data from DCS, quality management systems (QMS), and data historians, as well as a connector that allows disparate software systems to communicate with each other. It was designed to be integrated into existing plants and equipment, but Quartic is also working with a pharma company to embed the platform into a new facility.
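
A connector of this general kind can be pictured as an adapter layer that translates each source system's records into one common schema before they reach the analytics engine. The sketch below is a hypothetical illustration of that pattern, not Quartic's API; the source formats and field names are assumptions.

```python
from typing import Any, Callable

# Common downstream schema: {"source", "tag", "timestamp", "value"}.
def from_dcs(row: dict[str, Any]) -> dict[str, Any]:
    # Assume the DCS export uses 'TagName'/'Value'/'Time' keys (illustrative).
    return {"source": "dcs", "tag": row["TagName"],
            "timestamp": row["Time"], "value": row["Value"]}

def from_qms(row: dict[str, Any]) -> dict[str, Any]:
    # Assume the QMS export records deviations; map them onto the same shape.
    return {"source": "qms", "tag": row["deviation_code"],
            "timestamp": row["opened_on"], "value": row["severity"]}

ADAPTERS: dict[str, Callable[[dict[str, Any]], dict[str, Any]]] = {
    "dcs": from_dcs,
    "qms": from_qms,
}

def normalize(system: str, rows: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Translate raw rows from a given system into the common schema."""
    return [ADAPTERS[system](row) for row in rows]

records = normalize("dcs", [{"TagName": "TIC-101.PV", "Value": 37.2,
                             "Time": "2019-07-01T08:00:00Z"}])
print(records)
```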

Quartic.ai cofounder and CEO Rajiv Anand has an extensive automation and reliability background and previously worked at Emerson. The company’s management team members all come from pharmaceutical and automation backgrounds. “We didn’t want pharma users to feel that they needed to be coders or data scientists,” says vice-president of life sciences Larry Taber.

Levels of digital maturity

The platform is designed to accommodate the fact that every potential user will be at a different level of digital maturity, says Anand. Once legacy data sources have been connected, the artificial intelligence engine can be used to solve a specific problem (e.g., monitoring an asset’s performance for deviations), he adds. Some clients are using it for complex predictive work.
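
Monitoring an asset's performance for deviations, the kind of narrowly scoped first problem mentioned above, can start as simply as flagging readings that drift outside a learned baseline. The rolling z-score sketch below is a generic illustration rather than the Quartic engine; the window size and threshold are assumptions.

```python
import numpy as np

def flag_deviations(values, window=50, threshold=3.0):
    """Flag indices whose reading deviates more than `threshold` standard
    deviations from the trailing-window baseline (illustrative parameters)."""
    values = np.asarray(values, dtype=float)
    flags = []
    for i in range(window, len(values)):
        baseline = values[i - window:i]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma > 0 and abs(values[i] - mu) > threshold * sigma:
            flags.append(i)
    return flags

# Simulated sensor trace with an upset injected near the end.
rng = np.random.default_rng(0)
trace = rng.normal(100.0, 1.0, 300)
trace[280:] += 8.0
print(flag_deviations(trace))   # indices of readings that break the baseline
```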

The company has used its platform in a number of situations, including an effort to monitor and improve fermentation yield in a highly variable process where all critical quality attributes were under control. Quartic extracted data and identified a few key batches, Anand explains, and then built an algorithm to study relationships between the batches, working through eight years’ worth of data and fingerprinting each phase of the process. Ultimately, previously unknown sources of variation were discovered, and work will now focus on learning more about them.
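
One way to picture the fingerprinting step is to reduce each batch (or batch phase) to a small feature vector and examine how batches spread out in that reduced space; outliers and clusters point toward hidden sources of variation. The PCA sketch below is a generic illustration of that idea, not Quartic's algorithm, and the simulated features are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical per-batch features for one fermentation phase
# (e.g., mean dissolved oxygen, peak temperature, phase duration in hours).
rng = np.random.default_rng(1)
batch_features = rng.normal([40.0, 37.0, 18.0], [2.0, 0.4, 1.5], size=(60, 3))
batch_features[55:] += [6.0, 0.0, 4.0]   # a handful of atypical batches

# Project batches onto two components; each batch's coordinates act as a
# compact "fingerprint" of how that phase was run.
pca = PCA(n_components=2)
fingerprints = pca.fit_transform(batch_features)

# Batches far from the centroid are candidates for a variation investigation.
distances = np.linalg.norm(fingerprints - fingerprints.mean(axis=0), axis=1)
print(np.argsort(distances)[-5:])        # the five most atypical batches
```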

The company has also worked with pharmaceutical companies on predictive maintenance. Anand recalls one project designed to baseline the performance of an autoclave. The equipment was modeled, and industrial Internet of Things (IIoT) sensors were used to gather additional vibration and ultrasound data. Once deployed, machine learning models could predict potential failures and trace the source of a failure down to an individual component (e.g., a damaged valve).
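
The machine-learning side of such a project might, in a minimal sketch, train an anomaly detector on sensor features from healthy autoclave cycles and score new cycles against that baseline. The IsolationForest example below is a generic stand-in with assumed feature names, not the model used in the project described here.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical per-cycle features from IIoT sensors:
# [vibration RMS, ultrasound energy, peak chamber temperature].
rng = np.random.default_rng(7)
normal_cycles = rng.normal([0.5, 1.0, 134.0], [0.05, 0.1, 0.8], size=(200, 3))

# Train only on cycles believed to be healthy (the performance baseline).
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_cycles)

# Score new cycles; -1 flags an anomaly worth tracing to a component,
# e.g., rising vibration pointing at a worn valve.
new_cycles = np.array([[0.52, 1.02, 134.2],    # looks normal
                       [0.95, 1.60, 135.5]])   # elevated vibration/ultrasound
print(model.predict(new_cycles))
```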

Predicting quality events

Artificial intelligence could be applied to many other pharmaceutical manufacturing operations, particularly in visualizing end-to-end processes. As data analyst Jonathan Lowe (9) found at one company, more than 100 quality events were being investigated at any one time, stretching staff capacity. A machine learning model was designed to predict which events would take the longest to resolve. The model predicted more than 85% of the severe delays weeks before they happened, allowing quality managers to prioritize tasks more equitably.
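
A model of the kind Lowe describes can be sketched as a binary classifier over features of each open quality event (for example, event type, product line, age at triage, and investigator workload), trained to predict whether resolution will exceed a threshold. The gradient-boosting example below is a hypothetical reconstruction on simulated data, not the model that was actually built.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Simulated historical quality events: [event_type_code, product_line_code,
# days_open_at_triage, open_events_per_investigator]; label = severe delay.
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 4))
y = (X[:, 2] + 0.8 * X[:, 3] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Rank open events by predicted risk of a severe delay so investigators
# can be assigned before the backlog builds up.
risk = model.predict_proba(X_test)[:, 1]
print(np.argsort(risk)[::-1][:10])       # ten highest-risk events
print("holdout accuracy:", model.score(X_test, y_test))
```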

References

1. Accenture, “The Case for Connectivity in the Life Sciences,” Infographic, accenture.com, May 22, 2019.
2. Accenture, “Accenture Introduces Intient,” Press Release, May 16, 2019.
3. B. Sisk, “Analytics Informs Decisions in the Facility of the Future,” Whitepaper, Rockwell Automation, July 2019.
4. J. Zebib, “Review by Exception: Connecting the Dots,” Whitepaper, Emerson Process Management, July 2019.
5. D. Berg, et al., “Data Sharing in a Contract Manufacturing Environment,” Presentation at the OSIsoft Users Conference, 2017.
6. A. Shanley, Pharm Tech Bio/Pharma Laboratory Best Practices eBook 2018 (2) 8–9, November 2018.
7. A. Louie, “Empowering 21st Century Science in Life Sciences,” an IDC Perspectives Paper, idc.com, May 2018.
8. F. Mirasol, “Quartic Introduces AI Platform for Process Manufacturing Operations at INTERPHEX 2019,” biopharminternational.com, April 4, 2019.
9. J.W. Lowe, “Why Biopharma Manufacturing Needs to Leverage Network Optimization Techniques via Data Science,” medium.com, June 19, 2018.

Article Details

Pharmaceutical Technology
Supplement: Outsourcing Resources
August 2019
Pages: s12–s15

Citation

When referring to this article, please cite it as A. Shanley, “From Data to Information,” Pharmaceutical Technology Outsourcing Resources Supplement (August 2019).