Advanced analytics and modeling can be used to predict downstream failures, allowing for corrective action before batches are lost.
In the biopharmaceutical industry, quality and consistency are two of the most important attributes for any manufacturing process. In downstream bioprocessing, quality is dictated by separation and purification unit operations such as filtration and chromatography.
Chromatography, the key purification unit operation in biologics synthesis, requires precise monitoring of column integrity and efficiency. Chromatography columns consist of packed resin or media that separate the solution’s components based on chemical or physical properties such as size, charge, hydrophobicity, or affinity. As the chromatography column is cycled, the degradation of the resin ligand or fouling of the column can decrease the consistency and effectiveness of the purification process.
Transition analysis is commonly used to evaluate the condition, degradation, and efficacy of the chromatography column (1–3). Step change transitions in the column input solution, measured by conductivity or ultraviolet (UV) detection, are evaluated and reported as key performance indicators (KPIs). Typically, the height equivalent of a theoretical plate (HETP) and asymmetry (1,2) are the KPIs most frequently used to characterize chromatography column performance.
HETP defines the separation efficiency of the chromatography column, while asymmetry evaluates the normality of each peak to indicate the amount of peak fronting or tailing, either of which can result in reduced product quality and purity. These KPIs are helpful for monitoring column integrity and efficiency because higher HETP values, and asymmetry values that deviate from 1.0, indicate resin degradation and signal potential batch failure. This article describes how data analytics can be used to cleanse input data, conduct transition analysis, create online dashboards, and develop predictive models to trend these KPIs over time. It also highlights alternatives to HETP and asymmetry that can be used to measure the separation efficiency and the normality or tailing of peaks (3).
There are a number of challenges involved with developing transition analysis calculations, however. One problem is data quality. Transition analysis measures subtle changes in data trends. To identify and quantify these changes, the data must be collected at a sufficient frequency and the sensors must be accurately calibrated.
Process data are typically linearly interpolated between stored values, which does not accurately represent a transient state where the step change occurs, particularly when there is additional noise in the data. Calculating accurate KPIs often requires filtering algorithms to more precisely capture the transient shape of the data. In addition, it can be difficult to work with the complex differential equations needed for HETP calculation because they require moment analysis.
Moment analysis requires solving a nested set of equations and relies not only on the column data, but also on the added contextualization of the time period where the transition occurs. Identifying the transition periods automatically from continuously flowing time-series data to perform differential calculations across those time periods is challenging.
Furthermore, a considerable amount of time (i.e., many hours or days) is typically needed to conduct the analysis and alert operators of imminent column failure. Traditionally, these complex calculations have been performed offline using mathematical software, which often results in excessive time to insight. As a result, operators must often react to column failure after it has happened, rather than being able to monitor columns and predict failure before it takes place.
Process analytics can be used to speed up the transition analysis process by allowing operators to connect directly to process data and perform calculations in real time. These data are typically stored in a process historian or an SQL database. Advanced analytics applications can set up live connections to data historians or SQL databases to view data and perform calculations like transition analysis as soon as the data are stored in the database. Live connections to the data enable the application to automatically find transition periods on set criteria, such as a shape or value, and to set KPI calculations that are executed upon completion of the transition period.
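As an illustrative sketch of the "set criteria" approach described above, the following Python snippet flags candidate transition periods wherever conductivity rises by more than a chosen jump within a rolling window. The threshold, window size, and synthetic step-change data are assumptions for demonstration, not values from any specific application.

```python
import numpy as np

def find_transitions(conductivity, jump=5.0, window=25):
    """Return start indices of candidate transition periods: places where
    conductivity rises by more than `jump` (e.g., mS/cm) across a rolling
    window of `window` samples. Thresholds are illustrative assumptions."""
    cond = np.asarray(conductivity, dtype=float)
    starts = []
    i = 0
    while i + window < len(cond):
        if cond[i + window] - cond[i] > jump:
            starts.append(i)
            i += window  # skip past the detected transition
        else:
            i += 1
    return starts

# Synthetic data: a step change from 2 to 12 mS/cm near the midpoint
vol = np.linspace(0.0, 10.0, 200)
cond = np.where(vol < 5.0, 2.0, 12.0)
cond = cond + 0.05 * np.random.default_rng(0).standard_normal(200)
starts = find_transitions(cond)
print(starts)
```

In a production system, the equivalent logic runs continuously against the live historian connection, so each newly completed transition period immediately triggers the downstream KPI calculations.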
A number of pharmaceutical companies have used advanced analytics applications to remove outliers and cleanse conductivity data prior to transition analysis, dramatically improving the effectiveness of HETP trends over time by ensuring data quality. Data must be collected at a sufficient frequency and may require cleansing to remove outliers (e.g., signals generated when a sensor is not in use), or filtering to smooth out any noise in the signal. Transition analysis requires differential equations to quantify the change in conductivity with respect to volume over the entirety of the transition period.
As a result, HETP and asymmetry results can vary significantly depending on the frequency of the data and the type of interpolation used between data points. If data are not handled properly, false column failure alerts may be generated, or actual column failures may be missed.
Chromatography column data requires cleansing prior to transition analysis calculations to remove outliers, focus calculations on only the transition time periods, and smooth the data. While numerous filtering algorithms exist, selecting an appropriate filter that accurately identifies and captures step changes, such as one using the Loess method, is important for transition periods.
These three techniques (removing outliers, isolating the transition time periods, and smoothing the data) make HETP calculations more consistent by focusing on relevant data and removing noise, enabling engineers and scientists to increase the precision and rigor of transition analysis calculations.
The KPIs monitored in transition analysis are expected to have low variation while the column is intact. But noise or inappropriate data sampling frequency can result in significant changes in the transition analysis KPIs that may falsely trigger column failure alerts. Data cleansing is used to enable effective monitoring of column health by reducing the dependence of the KPIs on the data sampling parameters.
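The cleansing steps above can be sketched in Python. Both functions are simplified assumptions rather than the exact algorithms used in commercial analytics applications: a modified z-score filter based on the median absolute deviation drops outliers, and a minimal Loess-style smoother fits a tricube-weighted local line at each point.

```python
import numpy as np

def remove_outliers(y, threshold=3.5):
    """Return a boolean keep-mask dropping points whose modified z-score
    (based on the median absolute deviation) exceeds `threshold`."""
    y = np.asarray(y, dtype=float)
    med = np.median(y)
    mad = np.median(np.abs(y - med)) or 1e-12  # guard against zero MAD
    z = 0.6745 * (y - med) / mad
    return np.abs(z) <= threshold

def loess_smooth(x, y, frac=0.2):
    """Minimal Loess-style smoother: a tricube-weighted local linear fit
    at each point, using the nearest `frac` share of the data."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    k = max(int(frac * n), 3)  # number of neighbors per local fit
    out = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                       # nearest neighbors
        w = (1.0 - (d[idx] / d[idx].max()) ** 3) ** 3  # tricube weights
        sw = np.sqrt(w)
        A = np.vstack([np.ones(k), x[idx]]).T
        beta, *_ = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)
        out[i] = beta[0] + beta[1] * x[i]
    return out
```

In practice the outlier mask is applied first, and the smoother is run only over the identified transition windows so that the step shape, not the idle baseline, drives the fit.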
Transition analysis is most effective when calculations are performed automatically and online, minimizing the delay between data collection and access to calculation results. Results can then be shared with operators, who can take corrective actions, such as regenerating the resin or repacking the column, when HETP or asymmetry values are out of specification, thereby avoiding quality deviations and lost batches. Using an advanced analytics application, these goals can be achieved by performing the transition analysis calculations and displaying the results in an online dashboard (Figure 1).
First, after the data have been cleansed, context is added by identifying the phase transitions, either by using a manufacturing execution system (MES) or by applying analytical techniques to detect all similar data profiles in the conductivity signal. Some chromatography equipment, such as in multicolumn continuous chromatography, contains data from multiple columns. Changes in other signals, such as the differential pressures across each column, can be used by advanced analytics applications to associate transitions with each respective chromatography column.
A similar approach can be used for single column chromatography if the column is cycled numerous times. Both HETP and asymmetry calculations can be performed using the transition periods for each column. HETP is often calculated using moment analysis (3) to describe the change in conductivity over column volume during each transition period. Asymmetry is estimated by comparing the change in column volume between left and right sides of the conductivity transition peak. These calculations can be performed using a formula in an advanced analytics application.
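One common way to implement these KPI calculations is sketched below. It assumes the derivative of the conductivity step with respect to volume is treated as a peak-shaped distribution for moment analysis, and it measures asymmetry as the trailing-to-leading width ratio at 10% of the derivative-peak height; exact formulas vary by site procedure, so these choices are assumptions, not the article's definitive method.

```python
import numpy as np

def _integral(y, x):
    """Trapezoidal integral of y over x."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def hetp_from_transition(volume, conductivity, column_length):
    """HETP by moment analysis of a step transition: the derivative of the
    conductivity step is treated as a distribution in column volume, and
    plate count N = mu^2 / sigma^2, so HETP = L / N."""
    v = np.asarray(volume, dtype=float)
    c = np.asarray(conductivity, dtype=float)
    p = np.clip(np.gradient(c, v), 0.0, None)  # peak-shaped derivative
    area = _integral(p, v)
    mu = _integral(v * p, v) / area                  # first moment
    var = _integral((v - mu) ** 2 * p, v) / area     # second central moment
    plates = mu ** 2 / var
    return column_length / plates

def asymmetry(volume, conductivity, frac=0.10):
    """Asymmetry b/a at `frac` of the derivative-peak height:
    a = peak-to-leading-edge width, b = trailing-edge-to-peak width.
    A symmetric transition gives ~1.0; tailing pushes it above 1.0."""
    v = np.asarray(volume, dtype=float)
    p = np.clip(np.gradient(np.asarray(conductivity, float), v), 0.0, None)
    i_peak = int(np.argmax(p))
    level = frac * p[i_peak]
    left = np.where(p[:i_peak] <= level)[0]
    right = np.where(p[i_peak:] <= level)[0]
    a = v[i_peak] - v[left[-1]]
    b = v[i_peak + right[0]] - v[i_peak]
    return b / a
```

For a clean, Gaussian-shaped transition the asymmetry evaluates to approximately 1.0, and HETP scales directly with the variance of the transition, so a broadening step change over repeated cycles shows up immediately as a rising HETP trend.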
While transition analysis can be performed in other calculation programs, the complexity of the calculations often results in significant delays between when data are generated and when results are found. In addition, the calculation process entails extracting data from historians, inserting the data into a calculation program, and communicating the results as separate steps. Any delays in this process can result in missed column failures, leading to lost or reworked batches, reducing product yield and resulting in millions of dollars in lost product. Advanced analytics applications streamline this workflow by connecting directly to data in leading historians and other databases, performing the calculations automatically as new data are collected, and communicating results through auto-updating dashboards.
Pharmaceutical companies have utilized advanced analytics applications to monitor chromatography column health by creating online production dashboards to monitor HETP and asymmetry. As a result, these companies have realized savings of up to 10 hours per week per unit when performing transition analysis calculations, with further financial benefits gained by applying predictive maintenance to limit unplanned downtime and subsequent decreases in batches produced. Tracking HETP and asymmetry over time or batches enables corrective action prior to column failure.
Advanced analytics applications enable subject matter experts to access process data, define the equations for transition analysis, and build models to forecast predicted HETP and asymmetry values. Predictive maintenance models of time series data utilize a time component to predict usage of the equipment. With transition analysis, an equation can be written to count the total run time of column usage since the last resin regeneration, and to then extrapolate that run time data into the future based on current utilization using formulae in an advanced analytics application. Multivariate modeling techniques, such as regression or principal component analysis, enable extrapolation of the fitted model to forecast future values or equipment maintenance periods. With transition analysis, these models help users predict HETP and asymmetry values based on the run time of the current utilization of the chromatography columns and other process variables such as flow rates or concentrations. Predictive models can be used to forecast an appropriate maintenance window prior to column failure. Performing online transition analysis with predictive modeling can thus reduce downtime, product quality deviations, and lost batches.
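As a minimal sketch of the extrapolation idea, the snippet below fits a linear trend of HETP against cumulative run time and solves for the run time at which HETP is predicted to cross a failure limit. The limit, the linear-degradation assumption, and the sample data are all hypothetical; real models may include additional regressors such as flow rates or concentrations.

```python
import numpy as np

def forecast_maintenance(run_hours, hetp, hetp_limit):
    """Fit a least-squares linear trend of HETP vs. cumulative run time
    and extrapolate to the run time at which HETP crosses `hetp_limit`.
    Returns None if no upward (degradation) trend is present."""
    t = np.asarray(run_hours, dtype=float)
    y = np.asarray(hetp, dtype=float)
    slope, intercept = np.polyfit(t, y, 1)
    if slope <= 0:
        return None  # column is not degrading on this metric
    return (hetp_limit - intercept) / slope

# Hypothetical HETP trend over cumulative run hours since regeneration
hours = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
hetp = np.array([0.030, 0.032, 0.033, 0.035, 0.037])
t_cross = forecast_maintenance(hours, hetp, hetp_limit=0.045)
print(t_cross)  # ~441 h of run time before HETP is predicted to cross the limit
```

Scheduling resin regeneration or repacking somewhat before the predicted crossing time converts a reactive failure response into a planned maintenance window.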
Scientists and engineers can also use alternatives to HETP and asymmetry, such as TransWidth and DirectAf (3). TransWidth is a measure of the separation efficiency or resolution of the peak, calculated from the change in volume across the transition period. DirectAf indicates the normality or tailing of the peak; it is calculated as the average of volume differentials comparing the beginning and end of the transition period at six points per transition, and it can be used in place of asymmetry.
TransWidth and DirectAf have been shown to be less influenced by noise in the data and thus may be more robust metrics for detecting column integrity and efficiency. These techniques have allowed Just–Evotec Biologics to detect resin degradation and replace the resin prior to subsequent batches.
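The descriptions above leave room for interpretation, so the sketch below shows one plausible reading rather than the definitions from reference 3: TransWidth is taken as the volume change between assumed 5% and 95% fractions of the conductivity rise, and DirectAf as the average of trailing-to-leading volume-differential ratios at six assumed matched height fractions.

```python
import numpy as np

def trans_width(volume, conductivity, lo=0.05, hi=0.95):
    """TransWidth as the volume change between `lo` and `hi` fractions of
    the total conductivity rise. Assumes a roughly monotone (cleansed)
    transition signal; the 5%/95% fractions are assumptions."""
    v = np.asarray(volume, dtype=float)
    c = np.asarray(conductivity, dtype=float)
    f = (c - c.min()) / (c.max() - c.min())  # normalized 0-to-1 rise
    v_lo = v[np.argmax(f >= lo)]  # first point reaching `lo` of the rise
    v_hi = v[np.argmax(f >= hi)]
    return v_hi - v_lo

def direct_af(volume, conductivity,
              fracs=(0.05, 0.10, 0.15, 0.20, 0.25, 0.30)):
    """One plausible DirectAf reading: at six matched height fractions,
    compare the volume distance from the 50%-rise midpoint to the trailing
    side against the leading side, and average the ratios. A symmetric
    transition gives ~1.0; tailing pushes the value above 1.0."""
    v = np.asarray(volume, dtype=float)
    c = np.asarray(conductivity, dtype=float)
    f = (c - c.min()) / (c.max() - c.min())
    v50 = v[np.argmax(f >= 0.5)]
    ratios = []
    for q in fracs:
        v_lead = v[np.argmax(f >= q)]         # early, low-conductivity side
        v_trail = v[np.argmax(f >= 1.0 - q)]  # late, high-conductivity side
        ratios.append((v_trail - v50) / (v50 - v_lead))
    return float(np.mean(ratios))
```

Because both metrics are read directly off the normalized rise rather than off a differentiated peak, they avoid amplifying sensor noise through numerical differentiation, which is consistent with the reported robustness advantage.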
1. B. McCoy and M. Goto, Chem. Eng. Sci. 49 (14) 2351-2357 (1994).
2. T.M. Larson, J. Davis, H. Lam, et al., Biotechnol. Prog. 19, 485-492, doi:10.1021/bp025639g (2003).
3. Y. Cui, Z. Huang, and J. Prior, BioPharm International 31 (1) (2018).
When referring to this article, please cite as J. Reckamp, "Using Online Transition Analytics to Predict Chromatography Column Failure," Pharmaceutical Technology 44(4), 2020, pp. 38-41.