Can macromolecular processes learn from small-molecule experience? Burdened by exploding bioreactor productivity, architects of downstream bioseparation technology are looking into the drug industry's past for inspiration, while small-molecule companies adopt techniques pioneered by biotechnology. (The first of three articles on the current state of separations.)
Small molecules dominate the pharmaceutical industry and account for the great majority of all drug-making purifications, yet research, new tools, and growth tend to focus on the much smaller macromolecular sector. At the same time, bioseparations specialists seek inspiration and desperately needed productivity improvements in small-molecule techniques long considered old hat.
"One of the trends we see is biotechnology moving towards smaller molecules, as people use the laboratory technologies of biotechnology to identify small molecules that can act as gene expression inhibitors or protein modifiers," says Jerold M. Martin, senior vice-president, global technical director, Pall Life Sciences (East Hills, NY, www.pall.com). "The long-term future of biotech may move back into small molecules and incorporate small-molecule technologies ... We're certainly keeping aware of things like chiral separations and are involved in membrane-based techniques for chiral separations, but most of that is the long-term picture. Most of our efforts right now are in the rapidly expanding part of the market opportunity: in large molecules."
Uwe Gottschalk, a Bayer biochemist for 13 years before becoming vice-president of purification technologies at Sartorius's bioseparation group (Göttingen, Germany, www.sartorius.com), is skeptical about the predicted shift back to small molecules. "Nature is very complex and you can't get away without large molecules. The chemists kept saying that we would enter a world of small molecules. And this has not materialized so far. You have to reflect on the complexity of nature, which is why you need antibodies and large molecules and you cannot miniaturize them."
Protein A bead.
"There's a double trend. We talk about certain drugs going to larger and larger capacity, but over the next five years we're going to start seeing more patient-specific therapies," says Martin. "You're going to have drugs that need to be made in multiple smaller batches with slight modifications." For example, the performance of Genentech's (South San Francisco, CA, www.genentech.com) breast cancer treatment, "Herceptin," varies according to the patient's genetic profile. Over time, says Martin, more of the industry will move toward multiple drugs for molecular variants within an indication. Ultimately, some individually tailored biological treatments will require minuscule batches for treating a single patient.
The productivity challenge
Over the past two decades, bioreactor productivity has increased by an order of magnitude.
"When I started in biotech 20 years ago," says Martin, "we were talking about mammalian cell culture densities in the 1–2 × 10⁶ per milliliter range. Now people are talking about working in the 7–8 × 10⁶ and even 10⁷ per milliliter range."
Denser cultures translate into more protein product—and more separation challenges. In the course of just a few years, says Richard Pearce, director of purification solutions at Millipore (Bedford, MA, www.millipore.com), bioprocesses "have gone from less than a gram per liter in some of the drugs that are on the market now, to five grams in some of the newer processes, and some people are talking about producing as much as 10 grams." To achieve these densities, process designers are using more complex media and keeping cells viable for longer periods.
The result, says Pearce, is "a much more concentrated, messy soup. Now the separation challenge has shifted to increasing the capacity and efficiency of the downstream processing area. You have a lot of cell debris, but you also start to have concentrations of proteins high enough to get aggregates. And as they push the cells more, they get more molecule-related impurities. The big challenge used to be removal of virus and DNA and host-cell proteins from the protein of choice. Now we have to remove monomers, dimers, aggregates, and some clipped antibodies. Cells are producing a lot more protein, but they're also producing more and different impurities."
Cost-of-goods does count. In maturing, the pharmaceutical biotechnology industry faces the same increasing pressure on production costs as the rest of the pharmaceutical industry: imminent generic competition, increasing numbers of alternative therapies, and the scrutiny of an increasingly restive public.
Figure 1: The productivity explosion will force process designers to choose between high-volume, low-resolution steps (like the industrial LC column being packed here) and more expensive and complex high-resolution alternatives.
"The 'good old days' in the industry sector are over," says Gottschalk. "Now it's not only time that counts; it's also very much about money. The grace period is over, and we have to deliver on the promises." The "upstream guys" have made huge productivity gains, greatly reducing earlier concerns about an industry crisis in bioreactor capacity. Now the "downstream guys" are under the microscope, and performance is measured in cost per gram. The downstream groups used to account for about half of overall production cost per gram, Gottschalk says. Now, with upstream efficiencies and yield increases, downstream processes account for 70% or more of total production costs.
Some of the costs are breathtaking. At around $10,000 per liter, packing an industrial-size resin column with a Protein A sorbent (for capturing monoclonal antibodies or certain proteins) can cost several million dollars. Still, that accounts for only a fraction of the separation costs.
Figure 2: Single-use units, like this depth filter, can reduce change-out times, eliminate manhandling heavy steel housings, reduce cleaning solvent consumption, and cut cleaning validation.
"One of the biggest costs in biotech, even more than the resin, is the buffer ... For example, over the lifetime of a monoclonal antibody process, the Protein A column contributes about $1–2 per gram of protein, which is not a lot in a manufacturing process costing $200–500 per gram," says Pearce. The buffer used in an entire process, though, contributes $20–30 per gram—and can impose a constraint on capacity.
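Reading Pearce's figures together shows why buffer, not resin, dominates per-gram cost. A minimal sketch (the dollar figures are from the text; the helper function is mine):

```python
# Rough shares of per-gram production cost, using the figures quoted above:
# Protein A column ~$1-2/g, buffer ~$20-30/g, total process $200-500/g.
def share(low, high, total_low, total_high):
    """Return the (min, max) fraction of total cost a line item can represent."""
    return low / total_high, high / total_low

protein_a = share(1, 2, 200, 500)   # 0.2%-1% of total cost
buffer = share(20, 30, 200, 500)    # 4%-15% of total cost

print(f"Protein A: {protein_a[0]:.1%}-{protein_a[1]:.1%}")
print(f"Buffer:    {buffer[0]:.1%}-{buffer[1]:.1%}")
```

Even at the most favorable ends of the quoted ranges, buffer contributes an order of magnitude more per gram than the Protein A medium.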
So the challenge is to deal with increasingly efficient bioreactor batches without letting buffer consumption go through the roof, Pearce says. "For me, that's one of the most interesting separation challenges this industry faces, and it's going to be facing it in the next year or two."
How do you do that? "In a sentence," says Gottschalk, "it's the renaissance of protein biochemistry."
Pearce predicts an interesting tension to watch: the industry will look for lower-resolution separation technologies (whatever they may be) to drive down costs, even as more-complex fermentation products push process developers toward higher resolution to separate product-related impurities (e.g., monomers from dimers).
The big question, Pearce says, is, "How do you process a 10,000-L capacity bioreactor with a 5 g/L yield? You have 50 kilos of antibody then. How do you scale up traditional separation technologies to handle that?"
In general, separation costs increase linearly, but some scale with the volume of the feedstock and others with the total mass of the target molecule. Polishing costs, for example, rise with feedstock volume, with some economies as volumes increase. The ability to capture monoclonal antibodies on a Protein A column depends on the column's binding capacity, so capture cost rises with the amount of monoclonal in the feedstock.
Concentrating 50 kg of antibody from a single 10,000-L batch in one run (without splitting the harvest) would require a Protein A column with a 3-m bed height, Pearce estimates, which is not a practical solution. A current 1.4-m production-scale Protein A column can hold $5 or $10 million worth of medium; packing bigger columns for single-run capture increases both the cost and the risk of losing the whole batch.
"So they put in a smaller column and cycle that column very quickly," Pearce says, "but they have to do the whole batch within a shift. They don't want to have the protein exposed to the environment and risk bacterial contamination for that period of time. So again, you have this balance challenge." One could go to the big column with the big expense and the risk of a big loss, or use a smaller column and split the batch, but multiply the risks of operator error and bacterial contamination. (Split-batch processes also raise regulatory and record-keeping questions about what actually constitutes a batch.)
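The sizing trade-off Pearce describes can be sketched with simple geometry. The dynamic binding capacity below is a placeholder assumption for illustration (real Protein A capacities vary widely with resin and flow rate), not a figure from the article:

```python
import math

def protein_a_column(mass_kg, capacity_g_per_L, diameter_m):
    """Resin volume and bed height needed to capture a batch in one cycle."""
    resin_L = mass_kg * 1000 / capacity_g_per_L
    # Column cross-section, expressed as liters per meter of bed height:
    cross_section_L_per_m = math.pi * (diameter_m / 2) ** 2 * 1000
    return resin_L, resin_L / cross_section_L_per_m

# 50 kg of antibody (10,000 L at 5 g/L) on a 1.4-m diameter column,
# assuming a dynamic binding capacity of 15 g/L resin (illustrative only):
resin, height = protein_a_column(50, 15, 1.4)
print(f"{resin:.0f} L of resin, {height:.2f} m bed height")
```

Splitting the batch over N cycles on a smaller column divides both the resin volume and the bed height by N, which is exactly the balance between capital cost, batch risk, and contamination risk described above.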
The key to progress, says Pearce, is to look backward to long-established techniques for handling high volumes of small molecules, and to figure out how to adapt them to the biomolecular world: "Can you take advances in processing small molecules, which are easier to work with, and then apply those advances to large molecules?"
Two prime candidates are crystallization and two-phase solvent-solvent extraction, techniques with long track records in small-molecule processing but almost unknown in the macromolecular world.
As mainstream pharmaceutical companies delve deeper into biotechnology, says Pearce, the engineering groups in those companies will ask, "Why don't we do it the same way we do small molecules?"
Gottschalk agrees. Because of the massive increases in productivity, we must simplify our systems, he says:
"We have to revisit technology that we know from other areas, like the food and beverage industry or technical enzyme manufacturing, and also technologies that we know from small molecules, like crystallization, extraction, and precipitation. We have to become simpler, not more complicated ... If you look at chromatography today, it's all about managing higher pressures and making systems even more complicated with gradients and continuous systems, and I don't think that this will help us solve the big issues."
Solvent–solvent extraction. Pearce cites work by Millipore's A. Lyddiatt (formerly at the University of Birmingham, UK) and M. Rito-Palomares (now at the Instituto Tecnológico y de Estudios Superiores de Monterrey, in Monterrey, Mexico). They have reported several two-phase extractions for biomolecules, including a generic aqueous two-phase extraction of proteins from a complex mixture (in this case, animal blood), partitioning soluble proteins into a polyethylene glycol (PEG)-rich top phase and cell debris into a phosphate-rich bottom phase, with an overall recovery of 62% (1). The question, says Pearce, is, "Can you build a robust process using that technology?" In other words, it's been proven you can do it, but can you do it reliably on a large scale?
Crystallization. The same questions apply to protein crystallizations.
"First of all, crystallization is well established," says Gottschalk. "It's done with small molecules. It's done with insulin. It is not established with antibodies, but there is a lot of work being performed in the area. At Bayer, experts crystallized a number of antibodies, but on the other hand, so far these steps are not easy to control. The yields are sometimes lousy, and there is still a long way to go."
But the goal is worth working towards. "Crystallization is very smart," says Gottschalk. "You can get rid of all the impurities because crystallization is very specific. And if you identify the right solutions, you can then also crystallize it out of a complex matrix."
There are limits, of course. "If it involves nasty chemicals and toxic substances, such as we would need for crystallography, then crystallization is not acceptable at all" as a production step. Gottschalk says that a systematic, design-of-experiments approach can produce process-scale protein crystallizations that use common buffers.
Researchers at Bayer Healthcare (Wuppertal, Germany), for example, are actively pursuing crystallization as a recombinant aprotinin process step (2).
Though yield and process control remain question marks, "I think that we are getting there," says Gottschalk, "even if it is not yet a standard solution that we can install easily into our process."
Membrane chromatography has clearly arrived, though it is a relatively young technique and is still struggling for acceptance in some quarters.
As with other chromatographic processes, membranes can be applied in two modes: polish (in which the membrane adsorbs impurities, which are then disposed of) and capture or preparative (in which the target molecule binds to the membrane and is later eluted for further processing).
Membrane-chromatographic polishing steps—removing contaminating DNA, viruses, and proteins from monoclonal antibodies and recombinant proteins—have now been incorporated into FDA-approved processes. Martin calls that "a major milestone." At the same time, a surprising number of process designers have trouble accepting membrane chromatography as anything but a filtration step. "Getting conventional chromatography users to accept membrane chromatography into their processes has always been an uphill battle," says John M. Jenco, PhD, senior staff scientist and chromatography applications manager at Pall.
"It's a niche technology," says Gottschalk, "but the niche is not so small, and it's getting bigger." Membrane chromatography excels in polishing applications, where efficiency depends on the amount of mixture flowing through the stationary medium. Membranes offer relatively large pores and high flow rates. Conventional ion-exchange columns have smaller pores and lower flow rates in relation to their total binding capacity. Conventional column polishing requires oversized columns with a lot of unused binding capacity to attain the same throughput. "You're basically designing it for flow rate and not for [binding] capacity. That tends to be fairly expensive," says Martin.
"The higher the dilution is and the larger the target molecule is, the better membrane chromatography is," says Gottschalk. He points out, though, that the technique may not be right for targeting small molecules at high concentrations. "But if you remove endotoxins, DNA, or host cell proteins or any other process-derived impurity in a polishing step in flow-through, then it's a perfect tool with great savings potential," he says.
Preparative challenge. In capture steps, packed column systems have an inherent binding-capacity edge, with higher surface area per unit volume. Membrane applications, especially in single-use or disposable formats, can compensate, in certain situations, with higher flow rates and greater ease of use.
"A conventional chromatography column will always give you a better ratio of bed volume to holdup volume," Jenco says. "It's just inherent in the design of the system."
For resolving closely related species or compounds that elute similarly, resin chromatography will remain the technology of choice, he says. For lower-resolution, bulk-separation steps (whether primary capture or polishing), membrane chromatography offers an attractive option.
The membrane products' greater flow capacities offer particular advantages in the capture of very large molecules. Gottschalk cites blood clotting Factor VIII, a very large protein. Martin points to plasmids and viruses for gene therapy vectors.
Several vendors, notably Pall and Sartorius, have offered single- and multiple-use membrane chromatography cartridges in the 1–500-mL membrane volume range for years. Pall's introduction in April of a 5000-mL membrane volume unit (the "Mustang XT5000" membrane chromatography capsule), rated at flow rates up to 50 L/min and designed specifically for process scale, marks an acceleration of the trend.
"Not only was capacity an issue for protein-type therapeutics, but also the way that particular membrane chromatography capsules would elute," says Jenco. "The ideal is to have your material elute in no more than 3–5 bed volumes."
In developing the large-capacity membrane chromatography cartridge, engineers sought to improve on existing elution profiles and reduce elution volumes to make the system practical for capture as well as polishing.
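For a sense of scale, applying the 3–5 bed-volume rule of thumb Jenco cites to a 5-L membrane volume (my arithmetic, not a vendor specification):

```python
# Elution pool size from the 3-5 bed-volume rule of thumb.
def elution_pool_L(membrane_volume_L, bed_volumes=(3, 5)):
    """Return (low, high) elution pool volumes in liters."""
    return tuple(membrane_volume_L * bv for bv in bed_volumes)

low, high = elution_pool_L(5.0)  # a 5000-mL membrane volume unit
print(f"Elution pool: {low:.0f}-{high:.0f} L")  # 15-25 L
```

Keeping the product in a 15–25-L pool, rather than a far more dilute one, is what makes downstream handling of a captured batch practical.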
"The key ... is not just in the membrane, but in the design and hydrodynamics of the unit the membrane goes into. Until now, chromatography membranes have simply been fitted into more traditional filter-type units. What we've done now is design the unit that the membrane goes into with an eye toward preparative chromatography applications operating in a bind-and-elute mode," says Jenco.
Disposability may be the most visible current trend in pharmaceutical manufacturing. This year's Interphex conference saw the introductions of a host of single-use products in membrane chromatography, depth filtration, and tangential flow filtration (as well as bioreactors and mixing vessels).
"This is really a paradigm shift in the industry," says Gottschalk. "Companies like Xcellerex (Marlborough, MA, www.xcellerex.com) are basing whole processes on disposables, at least for earlier phase manufacture and for clinical development, where you need the flexibility and you need to be able to change very quickly."
In general, single-use products are faster to install and remove from the process line, reducing process downtime. Interchangeable modules improve process flexibility, reduce the chance of operator error, and increase operator safety (less handling of items such as heavy stainless steel housings and less exposure to potentially toxic process products during cleaning). They greatly reduce cleaning time, cleaning fluid and chemical usage, cleaning fluid disposal, and cleaning validation requirements. And by reducing the number of seals and connections assembled by operators at the pharmaceutical plant, they shift the burden of quality assurance back to the separation product manufacturer.
In many cases, as with the Pall process membrane chromatography cartridges or Millipore's newly introduced depth filtration line (the "Pod"), designers freed from the requirements of in-plant assembly have taken the opportunity to reduce internal hold-up volumes and refine fluid flow paths.
Environmental impact. What about the environmental impacts of extending American throwaway culture to the drug plant?
"Environmental impact of disposable manufacturing is pretty simple to assess," says Martin. "Right now, you may have a filter cartridge that goes into a stainless steel housing or reusable resins that go into a column. Disposables replace that stainless steel housing with a polypropylene housing, or the column with a capsule a tenth the size, so the amount of plastic is a very small increment. In a lot of cases, the question of environmental impact is a red herring." Yes, there is a slight increase in the amount of solids for disposal. But there is also a tremendous reduction in the volume of liquids, some toxic, otherwise used for cleaning, rinsing, or equilibration, says Martin: "A ten- to hundred-fold reduction in the disposal of high-salt solutions, caustic solutions, and water (often expensive purified water or water for injection) can make the small increase in disposable solids relatively insignificant."
"Products that we have been successful with for many years are now being requested as disposables," says Gottschalk. "Take crossflow cassettes. These things are so expensive that users normally want to reuse them several times in order to justify the costs. But now the rules of the game are changing, because of the demands for disposables. So we're now talking about larger numbers, with economy of scale on our side. So we're able to provide single-use products at a much lower cost, and sometimes at only a portion of the original cost."
1. M. Rito-Palomares, C. Dale, and A. Lyddiatt, "Generic Application of an Aqueous Two-phase Process for Protein Recovery from Animal Blood," Process Biochem. 35 (7), 665–673 (2000).
2. J. Peters, T. Minuth, and W. Schroder, "Implementation of a Crystallization Step into the Purification Process of a Recombinant Protein," Protein Expr. Purif. 39 (1), 43–53 (2005).