Translating Rich Data to Useful Information

July 2, 2019
Agnes Shanley

Agnes Shanley is senior editor of BioPharm International.

Pharmaceutical Technology's In the Lab eNewsletter


The key to continual improvement, and to enabling the use of artificial intelligence in manufacturing, lies in making data accessible across functions and to contract partners.

Biopharmaceutical manufacturers have often described operations as data rich but information poor. While advanced analytics and sensors collect more data than ever before, much of it may not be used or shared with the operations that need it most to prevent lost batches and quality or compliance problems. The rise in outsourcing has only intensified the challenge. 

IDC Health Insights surveyed 126 biopharmaceutical and pharmaceutical executives in the United States and the United Kingdom and found a significant gap between their need for data and their strategies for harnessing it (1). More than 98% of respondents said that cross-functional data access was important or very important to their business strategies, and 94% described the ability to apply advanced analytics and/or artificial intelligence the same way. Respondents believed that data access would be crucial to improving overall quality and productivity, as well as the return on their R&D investments.

However, 51% of those surveyed said that they did not have a clear strategy in place to help them reach either of those goals, citing regulatory uncertainty, budget prioritization, and the need for more action from functional operational groups. As Kevin Julian, senior managing director in Accenture’s Life Sciences practice, the survey’s sponsor, commented, “Important insights that could lead to the discovery, development, and delivery of promising new treatments are too often trapped within the functional silos of ... biotechnology companies” (2). 

While some pharmaceutical manufacturers are still using paper-based record systems, a growing number are digitizing processes and making more data accessible in the right context. On a fundamental level, open control systems and a common data structure have made this possible, according to Rockwell Automation, resulting in distributed control systems (DCS) that enable integration over unmodified Ethernet and allow two-way communication between enterprise resource planning (ERP) and manufacturing execution systems (MES) (3).

Work is underway to improve collaboration in preclinical and quality labs, where good laboratory practices (GLPs), rather than good manufacturing practices (GMPs), drive operations. Much progress is being made in the area of batch records and in expanding connections between electronic lab notebooks, laboratory information management systems, MES, and ERP to increase access to information. Augmented reality offers one way to do this at the basic data recording and recovery level.

In addition, according to Emerson Process Management’s consultant Johan Zebib, quality review management and review by exception are being used with MES (4). Data historians can also be developed as a point of access, a concept that Eli Lilly has leveraged with its contract manufacturers in medical devices (5). Vendors are offering tools that make this task easier, with applications that use machine learning and other facets of artificial intelligence, allowing users to make connections between data points that might otherwise have seemed unrelated.

Apprentice, for example, has developed augmented reality and database applications that allow users to get feedback as they perform their jobs and to compare equipment performance and different batch runs (6). Contract development and manufacturing organizations (CDMOs) are becoming a more important market for the technology, says CEO Angelo Stracquatanio, using it not only to share data, but for training and troubleshooting in real time. “There are so many silos within manufacturing, and data are not being leveraged at different levels,” he says.

In 2019, the company improved its augmented batch recordkeeping products, extending batch connectivity to laboratory information management systems (LIMS) and electronic lab notebooks (ELNs) and allowing inputs to be captured for every batch. Users can analyze process data to compare batch runs and drive continuous improvement programs, or analyze specific runs to isolate deviations, Stracquatanio says, leaving an audit trail of data that can be mined to decrease variability.

A growing number of CDMOs are using Apprentice’s Tandem remote telepresence tool to collaborate with their clients. The Apprentice system also collects voice, picture, and other types of data to create a rich audit trail. “For example, for a single-use filter, one can scan its bar code. Users now know what filter that is and can create an audit trail for it … and put data together in real time, using a hierarchy of importance to present data in a way that doesn’t overwhelm the user,” says Stracquatanio.

Artificial intelligence

Machine learning and artificial intelligence are also moving into pharmaceutical manufacturing applications. The technology is farther along in clinical trials and in discovery. Accenture launched INTIENT in May 2019 to focus on discovery, clinical, and pharmacovigilance applications. Amgen has been working with Tata Consultancy Services on a Holistic Lab digital platform using Dassault Systèmes’ BIOVIA for process development (7).

However, some vendors are focusing on pharmaceutical manufacturing. Quartic, for example, has launched an AI-driven platform to provide feedback to operators and to monitor and improve processes (8). The platform includes a data engine, designed to extract data from DCS, quality management systems (QMS), and data historians, as well as a connector that allows disparate software systems to communicate with each other. It was designed to be integrated into existing plants and equipment, but Quartic is also working with a pharma company to embed the platform into a new facility. Cofounder and CEO Rajiv Anand has an extensive automation and reliability background and previously worked at Emerson. The company’s management team members all come from pharmaceutical and automation backgrounds. “We didn’t want pharma users to feel that they needed to be coders or data scientists,” says vice-president of life sciences Larry Taber.


Levels of digital maturity

The platform is geared to the fact that every potential user will have a different level of digital maturity, says Anand. Once legacy data sources have been connected, the artificial intelligence engine can be used to solve a specific problem (e.g., monitoring an asset’s performance for deviations), he adds. Some clients are using it for complex predictive work (Photo, above, shows the platform in use at a biopharm facility).

The company has applied the platform in a number of situations, including an effort to monitor and improve fermentation yield in a highly variable process where all critical quality attributes were under control. Quartic extracted data and identified a few key batches, Anand explains, and then built an algorithm to study relationships between the batches, clarifying eight years’ worth of data and fingerprinting each phase of the process. Ultimately, previously unknown sources of variation were discovered; work will now focus on learning more about them.

The company has also done work with predictive maintenance. Anand recalls one project designed to baseline the performance of an autoclave. The equipment was modeled, and industrial Internet of Things (IIoT) sensors were used to gather additional vibration and ultrasound information. Once deployed, the machine learning models could predict potential failures and trace the source of a failure down to an individual component (e.g., a damaged valve).

There are many other applications where artificial intelligence could be applied to pharmaceutical manufacturing operations, particularly in visualizing end-to-end processes. As data analyst Jonathan Lowe (9) found at one company, more than 100 quality events were being investigated at any one time, stretching staff capacity. A machine learning model was designed to predict which events would take the longest to resolve. The model predicted more than 85% of the severe delays weeks before they happened, allowing quality managers to prioritize tasks more equitably. 
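The triage approach Lowe describes can be illustrated with a minimal sketch: a classifier trained on features of past quality events, then used to rank currently open events by predicted risk of a severe delay. The feature names, the synthetic data, and the choice of a random-forest model here are all illustrative assumptions, not Lowe's actual implementation.

```python
# Hypothetical sketch of ranking open quality events by predicted delay risk.
# Features, data, and model choice are invented for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic history of 200 closed events:
# [days_open_so_far, n_departments_involved, n_prior_similar_events]
X = rng.integers(0, 20, size=(200, 3)).astype(float)
# Invented labeling rule: events touching many departments tend to drag on
y = (X[:, 1] + rng.normal(0, 2, 200) > 12).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Score the currently open events and surface likely severe delays first
open_events = rng.integers(0, 20, size=(10, 3)).astype(float)
risk = model.predict_proba(open_events)[:, 1]   # P(severe delay) per event
priority_order = np.argsort(risk)[::-1]         # highest predicted risk first
print([int(i) for i in priority_order])
```

In practice the payoff comes from the ranking, not the raw probabilities: quality managers can start the highest-risk investigations weeks earlier, which is how the 85%-of-severe-delays figure in the text becomes actionable.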


References

1. Accenture, “The Case for Connectivity in the Life Sciences,” Infographic, May 22, 2019.
2. Accenture, “Accenture Introduces INTIENT,” Press Release, May 16, 2019.
3. B. Sisk, “Analytics Informs Decisions in the Facility of the Future,” a Rockwell Automation white paper to be published in July 2019.
4. J. Zebib, “Review by Exception: Connecting the Dots,” an Emerson Process Management white paper to be published in July 2019.
5. D. Berg et al., “Data Sharing in a Contract Manufacturing Environment,” a presentation at the OSIsoft Users Conference, 2017.
6. A. Shanley, “Mixed Reality Gains a Foothold in the Lab,” BioPharm Lab Best Practices ebook, Issue 2, pp. 8–9, November 15, 2018.
7. A. Louie, “Empowering 21st Century Science in Life Sciences,” an IDC Perspectives Paper, Document VS43749317, May 2018.
8. F. Mirasol, “Quartic Introduces AI Platform for Process Manufacturing Operations at INTERPHEX.”
9. J.W. Lowe, “Why Biopharma Manufacturing Needs to Leverage Network Optimization Techniques via Data Science,” June 19, 2018.