Complex Biomolecules Require Analytical Evolution

Published in: Pharmaceutical Technology, July 2021 Issue, Volume 45, Issue 7
Pages: 16–24

The ever-increasing complexity of biotherapeutic molecules presents unique analytical challenges for developers.

The structure and activity of complex biotherapeutic molecules demand sensitive analytical technologies that can characterize these complicated protein structures and monitor their pharmacokinetic behavior, both of which are fundamental to biotherapeutic drug development and production. Relevant technologies are needed to make detailed analytical assessments of structure and function, and further innovation is needed to address the challenges the industry continues to face.

The problem with complex biological molecules

The structure of biological molecules poses interesting challenges to analytical assessment. Elucidating the purity or structure of a complex biological product often requires methods designed specifically for the analyte, multiple analytical assays that leverage cross-modal technologies, and orthogonal analytics to differentiate the analyte from a biosimilar molecule, says William Bakewell, research fellow at PPD Laboratories’ GMP Lab.

A fundamental challenge, says Shawn Fitzgibbons, manager, Catalent Biologics, is the large size of biologic materials and, in turn, the micro-heterogeneity observed with respect to each attribute measured. “Every molecule will have unique characteristics. For example, although characterization of post-translational modifications is routinely addressed, there may be a specific challenge if the molecule has a high level of modifiable residues at, or near, important binding regions of the molecule, or if the molecule is particularly prone to these modifications because of its given formulation,” he says.

Challenges with respect to functional analyses often revolve around proper, timely development of the necessary cells and antibodies, Fitzgibbons adds. Similarly, extractables and leachables assessments require a large panel of extraction and analytical techniques to be undertaken to assure detection, identification, and quantitation of potentially toxic, unknown materials.

Meanwhile, structure–function assessments are often challenging because the amount of material available to run all of the orthogonal methods is limited, says Pedro Morales, director, Catalent Biologics. “For example, it becomes challenging to isolate and analyze variants that are present in only small amounts within the fractionated sample,” he says.

Morales also explains that variants must be extracted and/or purified without introducing additional post-translational modifications (PTMs), exacerbating existing ones, or stressing the material in a way that creates aggregates. He says that the experimental design should include the appropriate controls, analyzed in parallel with the analyte(s) of interest. “It is always challenging to develop a robust functional assay that will ensure a solid structure–function correlation that can be unequivocally determined,” he states.

Jana Hersch, scientific consultant, Genedata, adds that correct prediction of how a complex biological will behave in vivo remains the biggest challenge during the development of complex biotherapeutic molecules. She points out that antibody-drug conjugates (ADCs), bispecifics, and even engineered cell-surface receptors, such as chimeric antigen receptors (CARs), rely on the scientists’ ability to assess how these molecules will behave once inside the patient.

“It is important to predict not just whether they bind with the desired affinity and specificity to their target in vitro, but also that they bind in the right tissues, or that they have the appropriate level of clearance from the body to achieve the desired therapeutic effect and cause minimal harm,” Hersch says. She also explains that many different versions of a complex biological molecule are often tested in iterative pharmacokinetic and pharmacodynamic analyses before the preferred format as well as the final formulation are selected.

Establishing analytical studies

Campbell Bunce, chief scientific officer at Abzena, emphasizes that all drug developers should establish the right set of analytical methods and workflows to assess the developability of each drug candidate at an early stage and ensure that the candidates they take forward carry a reduced risk of inherent liabilities. He recommends a holistic approach that applies design-of-experiment principles to capture a wide range of parameters across several characteristics, such as function, manufacturability, immunogenicity, and safety. “For example, through developability assessment, methods can be applied that will provide insights into the relative stability of the biological molecules and allow them to be ranked against comparators to help select the best lead candidate for development,” he says.

Bunce notes that novel biotherapeutic candidates are often unique, with little prior information to draw on when developing the right assays and comparative assessment parameters, unlike antibodies, for which a significant amount of historical data exists. Certain aspects of novel biologic molecules can nevertheless be pieced together, directed by experience-led trial and error, to generate developability profiles that lead to better outcomes, he explains.

“There may be occasions where liabilities cannot be designed out using protein engineering techniques and thus, need to be managed through different approaches. Consequently, other mitigating options can be evaluated and applied,” Bunce states. Biologics, for example, may be affected by one or more degradation pathways that can be evaluated and addressed through the formulation development process to maximize the stability of the biologic drug candidate.

Because of the complexity of biological molecules, a panel of analytical methods is used to analyze therapeutic proteins for lot release to ensure consistent product quality, safety, and efficacy, adds Gang Huang, senior vice-president, analytical sciences and clinical quality control, WuXi Biologics. Although enormous effort goes into the clone selection and process development stages, a final drug product is still expected to comprise a mixture of hundreds to thousands of variants that differ in PTMs and higher-order structure, he emphasizes. As a result, state-of-the-art analytical methods are used to thoroughly characterize the various PTMs and degradation pathways of these proteins.

Bakewell notes that mass spectrometry (MS)-based analyses are a critical component of determining primary structure for conjugated proteins, monoclonal antibodies (mAbs), and cell and gene therapy (CGT) products. He further explains that intact mass and peptide mapping-based MS methods allow for confirmation of the protein sequence as well as identification of any disulfide bonds, oxidation/deamidation events, glycosylation patterns, and PTMs, all of which are key structural critical quality attributes (CQAs). Meanwhile, information about secondary and tertiary structure can be determined through techniques such as circular dichroism, isothermal calorimetry, size-exclusion chromatography, and multi-angle light scattering.
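
The logic behind sequence confirmation by peptide mapping can be illustrated with a minimal in-silico digest: predict the tryptic peptides and their monoisotopic masses from the expected sequence, then match them against the masses observed on the instrument. The Python sketch below is purely illustrative; the example sequence, "observed" masses, and mass tolerance are hypothetical, and a real peptide-mapping workflow would also handle modifications, missed cleavages, and charge states.

```python
# Minimal, illustrative in-silico tryptic digest and mass matching.
# Residue masses are standard monoisotopic values (Da); the sequence,
# "observed" masses, and tolerance below are hypothetical examples.

RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
    "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
    "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
    "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER = 18.01056  # mass of H2O gained by each released peptide


def tryptic_peptides(sequence):
    """Cleave after K or R unless the next residue is P (trypsin rule)."""
    peptides, start = [], 0
    for i, residue in enumerate(sequence):
        if residue in "KR" and (i + 1 == len(sequence) or sequence[i + 1] != "P"):
            peptides.append(sequence[start:i + 1])
            start = i + 1
    if start < len(sequence):
        peptides.append(sequence[start:])  # C-terminal peptide, if any
    return peptides


def monoisotopic_mass(peptide):
    return sum(RESIDUE_MASS[aa] for aa in peptide) + WATER


# Hypothetical expected sequence and deconvoluted "observed" masses (Da)
expected = "MKTAYIAKQRQISFVK"
observed = [665.3748, 302.1702]  # made-up values chosen to match two peptides
tolerance_ppm = 10.0

for pep in tryptic_peptides(expected):
    theo = monoisotopic_mass(pep)
    matched = any(abs(obs - theo) / theo * 1e6 <= tolerance_ppm for obs in observed)
    print(f"{pep:>8s}  theoretical {theo:9.4f} Da  matched: {matched}")
```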

“Complementary to these protein-level analyses, sequencing of the product’s underlying DNA can be performed by Sanger sequencing, capillary electrophoresis with fragment analysis, and next-generation sequencing (NGS) approaches. The depth to which a product is sequenced—including a product’s expressing and non-expressing elements—and the degree to which sequencing results confirm a product’s identity has yet to be standardized,” Bakewell adds.

Lessons from mAbs

Fortunately, biopharmaceutical analysis methods have improved since the first mAbs were approved. The most significant change over the past 15 years has been the speed and resolution at which bioanalytical tools are employed, says Christopher Colangelo, biopharma business development leader, Agilent Technologies. The development of porous and/or sub-2-µm chromatographic media coupled to ultra-high-performance liquid chromatography (UHPLC), for example, has enabled analytical method runtime to decrease 10-fold, from 30–50 min/sample to 3–5 min/sample. Colangelo adds that a key success in many biopharma research labs today is the use of automation solutions in lab workflows. “For example, successful labs effectively use robotic sample preparation, liquid autosamplers, and automated data analysis tools,” he states.
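
As a rough back-of-envelope illustration of what a 10-fold runtime reduction means in practice (the 96-sample batch and the representative runtimes below are hypothetical, and column re-equilibration and other overhead are ignored):

```python
# Back-of-envelope throughput comparison for a hypothetical 96-sample batch,
# using representative runtimes from the text; overhead is ignored.
samples = 96
for label, min_per_sample in [("conventional HPLC", 40), ("sub-2-µm UHPLC", 4)]:
    hours = samples * min_per_sample / 60
    print(f"{label:18s} {min_per_sample:3d} min/sample -> {hours:5.1f} h per batch")
```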

Significant improvements have also been made in the sensitivity, resolution, accuracy, reproducibility, and specificity of analytical methods used to support the release and stability testing of biological drugs, Bakewell emphasizes. Sodium dodecyl sulfate polyacrylamide gel electrophoresis and isoelectric focusing gel electrophoretic methods commonly used for release and stability in the 1980s have been largely replaced by capillary electrophoresis (CE) methods, for example, Bakewell says. “CE methods are deemed superior to traditional gel electrophoresis methods due to their improved resolution and their high degree of reproducibility and robustness,” he states.

Similarly, column chromatographic separation techniques have evolved and improved over time; for example, the operating pressures that analytical HPLC and UHPLC columns can tolerate have increased, resulting in a concomitant increase in analytical resolution, Bakewell explains. Furthermore, improvements in the quality of HPLC systems and the batch-to-batch reproducibility of HPLC columns have led to better and more reproducible analytical methods, while data analysis for chromatographic separations has evolved from paper integrators to Code of Federal Regulations (CFR) Title 21, Part 11-compliant analytical software, says Bakewell.

“Over time, the variety of good manufacturing practice (GMP)-compliant analytical methods used to evaluate purity, potency, and stability of biopharmaceuticals has expanded and evolved. These now include multiplex and fluorescent plate reader assays, residual assays for process impurities such as host cell proteins and DNA, and in-vitro potency assays rather than animal models,” Bakewell adds. “Improvements in automation that remove analyst subjectivity for compendial methods such as color, clarity, particulate matter, sterility, and endotoxin analysis have increased the reliability of these methods.”

More recently, Hersch adds, advanced protein engineering has given rise to synthetic antibody-like molecules, including bispecifics, ADCs, and CARs with an antibody-derived targeting domain. “The analytical methods associated with modern protein design are now nearly always high-throughput and require automation, with researchers routinely screening large panels of molecules in different formats to select the very best candidates for further development. A large amount of data is generated at each step, and using sophisticated data management solutions that can handle and structure all the associated data centrally across an organization has become the norm in the industry,” Hersch says.

Morales adds that technology, software, and biological knowledge have evolved in such a way that cell-based potency assays are now expected to be more robust and precise by default. “Phase-appropriate validation enables the identification of critical reagents and steps, not only for analytical methods but also for manufacturing process validations. Cell-based development reports are now an expectation to demonstrate that efforts were made to optimize assay precision,” he states.

Fitzgibbons further adds that advances in characterization have occurred within all areas, from improvements in cell-based functional analysis to detection mechanisms in MS-based solutions involving higher-resolution instrumentation. “In several instances, such as in mass spectrometry, data processing solutions have driven evolution where the large amount of data acquired has required powerful software to process the volume of information present in the analysis,” he says.

One hot analytical topic for biopharmaceuticals over the past few years has been sub-visible particle analysis, including particle counts using microflow imaging analysis and high-accuracy liquid particle counters, says Gary Watts, analytics manager at Abzena. “The understanding that ‘particles breed particles,’ potentially leading to drug products failing specifications, has led to an emphasis on early detection to help de-risk the drug development pathway by identifying and removing the likelihood of any issues further down the line,” he explains.

Huang reinforces how analytical advancement has made significant strides for biological drug development since the 1980s, noting that advances in liquid chromatography (LC) and CE have had a significant impact on biopharmaceutical development by improving method robustness and throughput.

“There was a time when scientists had to identify all peaks in a peptide map using Edman degradation—a tedious task with limited sensitivity and robustness—but now the peptide sequence can be easily elucidated via LC–MS. Beyond this, the peptide-based multi-attribute method (MAM), as an emerging technology, has the potential to offer enhanced sensitivity and selectivity for the simultaneous detection and quantification of multiple product attributes, even in the quality control (QC) laboratory,” Huang says.
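
The core calculation behind a peptide-based MAM readout can be sketched simply: for each monitored attribute, relative abundance is computed from the extracted-ion-chromatogram peak areas of the modified and unmodified forms of the affected peptide. The sketch below uses made-up peak areas, attribute names, and acceptance limits purely for illustration, and omits the new-peak detection and system-suitability checks a QC-ready MAM method would include.

```python
# Illustrative relative quantification for a peptide-based MAM readout.
# Peak areas (from extracted-ion chromatograms) and limits are hypothetical.

# attribute -> (modified-form peak area, unmodified-form peak area, limit %)
attributes = {
    "deamidation at a heavy-chain Asn": (1.2e6, 58.8e6, 5.0),
    "oxidation at a conserved Met":     (0.4e6, 39.6e6, 3.0),
    "C-terminal lysine":                (2.5e6, 47.5e6, 10.0),
}

for name, (mod_area, unmod_area, limit_pct) in attributes.items():
    rel_abundance = 100.0 * mod_area / (mod_area + unmod_area)
    status = "pass" if rel_abundance <= limit_pct else "investigate"
    print(f"{name:34s} {rel_abundance:5.2f}% (limit {limit_pct:4.1f}%)  {status}")
```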

For binding assays, Huang says, surface plasmon resonance and biolayer interferometry technologies have become increasingly promising compared with the conventional enzyme-linked immunosorbent assay binding method and have been used in both biologics development and quality control labs.

What the industry still needs

Moving forward, the increasing complexity of biologic modalities in drug development may also lead to analytical needs not yet met by current methods. “The field of complex biologicals is diverse and expanding rapidly,” says Hersch. “Standardization of analytical requirements for full characterization and quality control of these new modalities is, in some cases, still evolving and represents an important unmet need.”

According to Hersch, one unmet need in the industry is quality control analysis for the newest autologous cell therapy approaches, which must be performed quickly to ensure the shortest possible vein-to-vein time for critically ill patients. “To meet this need, researchers may have to rely more on faster NGS-based methods than on more time-consuming traditional in-vivo procedures,” she envisions.

In addition, as biologic modalities become increasingly complex (e.g., bispecific antibodies, gene therapies, cell therapies), there will be a continued need to develop methods for characterizing the active biologic components, says Fitzgibbons. This development may involve analytical techniques that are only now becoming widely used. “As new higher-resolution analytical techniques emerge and more hybridization occurs within instrumentation, new opportunities for characterization present themselves,” he states.

One example of instrument hybridization involves the combination of CE techniques with MS, Fitzgibbons says. With hybridization such as this, identification of size or charge heterogeneity profiles becomes much more straightforward, he notes. He also says that analytical instrumentation that automates processes that were previously undertaken manually offers higher throughput and opens the way for new regulatory requirements for analytical characterization. “Additionally, as the development in viral vector manufacturing progresses, and as new gene therapy offerings are developed, new analytical challenges arise to assure CQAs are understood and controlled,” he adds.

Meanwhile, on-line/real-time monitoring of product attributes has gained increasing interest due to advancements in continuous processing for biologics manufacturing in recent years, says Huang. He adds that a greater level of automation in sample preparation, as well as the application of MS techniques such as peptide-based MAM, which has been used in quality control to simultaneously monitor different product attributes, can greatly enhance real-time monitoring and expedite biologics development.

“One gap that needs addressing is industry coalescence in developing standardized methods and reagents,” adds Colangelo. He points out that many manufacturers still develop fit-for-purpose methods for each biotherapeutic, which leads to a continual reinvention of the analytical methods for every new drug product.

Bunce points out that capacity is a major issue because all providers are looking to access platforms that can express more material in less volume, while maintaining the sensitivity of the process at scale. “Consequently, the industry needs to ensure that the analytical tools and methods are fit-for-purpose and will robustly inform the maintenance of the overall integrity and stability of the biopharmaceutical,” he says.

Bunce also stresses that it is imperative to get the analytical strategy and methodologies right in the early stages to support confidence in the results and mitigate issues with quality downstream. “Paying attention early on is crucial to long term success of the biologic and mitigates risk of correcting issues at later stages of development, which may have an impact both in terms of economics and safety to the patients. Getting things right up front is a mindset to be encouraged in analytical method development to characterize biopharmaceuticals and reduce risk in the drugs that we develop,” he asserts.

Marc Wolman, principal scientist at PPD Laboratories’ GMP Lab, meanwhile, explains that the biopharmaceutical industry has made significant strides in managing logistics and working within critical timelines, as recently evidenced by the successful development, production, analysis, and review of new pharmaceutical products in record time to battle COVID-19. “However, we also saw that drug sponsors, raw material and testing reagent suppliers, CDMOs [contract development and manufacturing organizations], CROs [contract research organizations], and regulatory agencies have the opportunity to implement faster, more-efficient means to develop, produce, analyze, and review new pharmaceutical products, all while maintaining a critical eye on safety and quality,” he states.

For analysis of biotherapeutic product CQAs, rapid microbiological methods, rapid high-throughput screening methods, and rapid release and stability methods, in particular rapid bioassay methods that mimic the in-vivo functionality of the product, are all areas where enhanced turnaround times will be critical to getting new products to market, Wolman emphasizes.

“The ability of the industry to meet rapid manufacturing and testing turnaround times will define the success and availability of promising autologous cell-based therapies,” Wolman adds. For example, to support an autologous cell therapy, a patient’s cells must be harvested and modified, and the modified cells must be rapidly tested for structure, function, identity, purity, and safety-related CQAs before the patient can be infused with the modified cells. “The analytical testing requires coordination of sample shipping and receipt with critical handling, execution of assays and data review, and quality assurance approval of the samples’ testing report,” Wolman explains. “This monumental effort may have a limited timeline of one to two weeks given the short shelf life of the ex-vivo modified cells and the patient’s life-depending need to receive the therapeutic.”

About the author

Feliza Mirasol is the science editor for Pharmaceutical Technology and Pharmaceutical Technology Europe.

Citation

When referring to this article, please cite it as F. Mirasol, “Complex Biomolecules Require Analytical Evolution,” Pharmaceutical Technology 45 (7) 16–24 (2021).