A look at the true cost-drivers of cell-culture production.
Exciting developments have taken place during the past decade, revolutionizing mammalian cell culture and production of monoclonal antibodies (mAbs). Product titers have increased from levels that would have been economically unacceptable—if there had been a workable alternative—to levels considered quite satisfactory. Today's early commercial antibody production processes can deliver 3–5 g/L of bioreactor volume compared with the meager 300–500 mg/L of years ago. Even greater capacities are promised.
This growth would remain impressive even if those who claim the improvement to be 1000-fold got it wrong by a factor of 100. The production processes of the 1990s used purification resins at a capacity of 15–18 g/L column volume. Today, we have capacity that is 3-fold higher, better flow properties, and longer lifetimes. Overall, industry has seen a 10- to 20-fold increase in productivity, which is quite similar to recent cell-culture improvements. In addition, this new capacity can help reduce variable costs to as low as 10–30% of first-generation levels for downstream processes alone. Although the latest resin generation has not been implemented into very large production processes for antibodies yet, it is entering new processes everywhere. Industry may assume, therefore, that current technology will cope for some time to come.
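The fold improvements quoted above can be sanity-checked with simple arithmetic. The sketch below uses midpoints of the ranges given in the text; the midpoint choices themselves are illustrative, not figures from the article.

```python
# Back-of-the-envelope check of the fold improvements quoted above.
# Ranges come from the text; picking the midpoints is an illustrative choice.

old_titer_g_per_l = 0.4      # midpoint of 300-500 mg/L, converted to g/L
new_titer_g_per_l = 4.0      # midpoint of 3-5 g/L
titer_gain = new_titer_g_per_l / old_titer_g_per_l   # roughly 10-fold

old_resin_capacity = 16.5    # midpoint of 15-18 g/L column volume
new_resin_capacity = old_resin_capacity * 3          # "3-fold higher" today

print(f"Cell-culture titer gain: {titer_gain:.0f}-fold")
print(f"Resin capacity: {old_resin_capacity} -> {new_resin_capacity} g/L")
```

The ~10-fold titer gain lines up with the observation that a claimed 1000-fold improvement, even if wrong by a factor of 100, would still be impressive.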
Most experts agree that the designers of manufacturing facilities did not anticipate the recent developments in cell culture that dramatically increased product yields. As a consequence, downstream processing equipment and associated tankage capacity—and to a lesser extent, downstream technology—will constitute a processing bottleneck if product titers continue to increase. But will they? Should they?
The cost-reduction curve flattens out at titers of 3–5 g/L, and there are no significant gains above 6–7 g/L. On the contrary, new problems are created further downstream: manufacturers will need to increase expensive filter areas and move purification processes to new resins. Alternatively, they could scale up, which brings significant capital cost and space constraints. Some manufacturers claim a paradigm shift away from current downstream technology is just around the corner. I disagree, and here's why.
Because increased product titers, alongside new facility construction, have effectively increased manufacturing capacity, many mAb manufacturers are reporting significant overcapacity. This overcapacity adds to the fixed-cost burden on manufacturing costs; in a new facility, fixed costs would normally amount to 60–70% of total cost of goods sold (COGS). Fixes to the downstream process and its variable cost will at best yield cost reductions in the low single-digit percentage range, or will require significant capital for facility refurbishing.
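A rough calculation shows why downstream fixes alone can only deliver single-digit savings. Only the 60–70% fixed-cost share comes from the text; the downstream-consumables share of COGS is an assumption chosen here purely for illustration.

```python
# Illustrative COGS breakdown. Only the fixed-cost share (60-70%) is from
# the text; the downstream-variable share is an assumed example value.

fixed_share = 0.65            # midpoint of the 60-70% quoted above
downstream_var_share = 0.05   # ASSUMED: downstream consumables as share of COGS
retained = 0.20               # new resins cut variable cost to 10-30% of old (midpoint)

savings = downstream_var_share * (1 - retained)
print(f"COGS reduction from downstream fixes alone: {savings:.1%}")  # prints 4.0%
```

Even under generous assumptions, the saving stays in the low single digits because the dominant fixed-cost block is untouched.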
Risks for failure of such an undertaking are not negligible. There are few technical alternatives capable of keeping batch failures low and helping to ensure product and patient safety while also offering a true shift in economy. Existing alternatives are likely to be found by competitors first, and thus available only at high licensing costs.
Such de-bottlenecking hardly deserves the name "paradigm shift," even if achieved with new technical alternatives. The cost problem simply cannot be addressed at the right level. Some may argue that the plasma industry, with its low-priced protein therapeutics intravenous immunoglobulin (i.v.-IgG) and human serum albumin (hSA), has successfully used alternative separation technology for decades to make multiton-scale production possible at low cost. But since the 1980s, the plasma industry has added chromatography steps to its processes to improve previously insufficient yield, purity, and safety. The leader in this industry today, Australian-based CSL, was the first to install a plasma fractionation plant that was based entirely on chromatography.
In addition, the production of insulin, another ton-scale protein drug, has undergone a complete processing renovation: low-yield purification processes involving 15–20 steps of chemical reactions and precipitation have given way to four- to five-step chromatography, with great economic improvement. No one in the biopharmaceutical industry would reasonably want to return to those "good old technologies."
At present, I can see two possible scenarios, very similar in nature, in which the cost of making mAbs becomes prohibitive. In most other situations, cost is not a driver for technology change (1).
In the first scenario, a company runs Phase III clinical-trial manufacturing at full scale (10,000–15,000-L bioreactors or larger) rather than keeping its process scale just large enough to make the amount of product the study requires. The quantities of raw materials and consumables that must be purchased at that scale consequently constitute a significant cost to the company's research and development (R&D) budget. Worse, resins and other consumables cannot be used for their full lifetime.
In the second scenario, a company producing small quantities (up to a few hundred kilograms) in the same size bioreactors faces the same issue: very few batches are run, and the resins are not used for their full lifetime. Pricing of consumables, however, reflects the economic value of the products, in which lifetime is a significant factor. In large facilities, this can get quite expensive.
Will these factors lead to the end of the dinosaurs in biopharmaceutical manufacturing? Or will they serve as the beginning of a single-use age?
Most forecasters agree that very few new antibodies will require large, ton-scale production, or even more than a few hundred kilograms. Product titers of 5 g/L will allow companies to manufacture their annual quantities in bioreactors as small as 500 L, and certainly in 1000 L. This scale issue cannot be ignored much longer. Companies will need to rethink their entire scale-up strategy, including the planning of clinical manufacturing. Facilities of the future could be built with a few different bioreactor sizes, none larger than 2000 L. Most future facility models may also include disposable equipment, where a thorough cost-benefit analysis shows it can play an important role. In some cases, the benefits of disposables are small or nonexistent; individual case studies are necessary. There is also a scale limitation to consider when contemplating disposables.
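The claim that a 1000-L bioreactor suffices at modern titers can be illustrated with a quick estimate. The 5 g/L titer comes from the text; the batch count and overall downstream recovery are assumed values for the sake of the example.

```python
# How far a small bioreactor goes at modern titers. The titer is from the
# text; batches per year and downstream recovery are ASSUMED example values.

titer_g_per_l = 5.0       # g/L, from the text
reactor_volume_l = 1000   # L
downstream_yield = 0.7    # assumed overall purification recovery
batches_per_year = 40     # assumed campaign size

annual_kg = titer_g_per_l * reactor_volume_l * downstream_yield * batches_per_year / 1000
print(f"Annual output: {annual_kg:.0f} kg")  # prints 140 kg
```

Under these assumptions a single 1000-L reactor delivers on the order of a hundred kilograms per year, which is why few products would ever justify the dinosaur-scale facilities.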
Regardless of the presence of disposables, most important in new facility designs will be the responsiveness of the facility to changing demands and its ability to accommodate new products at different scales while addressing cost concerns.
The real paradigm shift will happen when companies move new production away from the dinosaur bioreactor farms, leaving those to the first-generation legacy processes that industry has agreed may be suboptimal but are "best to keep." Technical bottleneck issues and related cost concerns would then disappear at once.
This true "paradigm shift" would represent a fundamental change in manufacturing strategy. This article is not trying to talk industry into keeping the status quo. Rather, it's about looking for improvements in the right places and following the good old 80:20 rule, in which you solve 80% of the problem first and thereby gain 80% of the benefit. Relevant cost reductions are not likely to be found in alternative technologies alone. Suboptimal, inherited facilities and process designs, as well as overdimensioning of production, are today's true cost drivers.
Günter Jagschies, PhD, is a senior director of strategic customer relations at GE Healthcare Life Science Biotechnologies R&D, Björkgatan 30 S-75184, Uppsala, Sweden, tel. +46 18 6120880, fax +46 18 6121863, firstname.lastname@example.org
1. B. Kelley, "Very Large Scale Monoclonal Antibody Purification: The Case for Conventional Unit Operations," Biotechnol. Prog., 23, 995–1008 (2007).