The Future of Downstream Processing

The author reviews the state of downstream processing and considers potential solutions, including the streamlining of full processes and borrowed technologies.
Apr 30, 2011

The biopharmaceutical industry relies on change to make progress in a commercial environment that simultaneously demands higher productivity, higher quality, and lower costs (1). Recombinant protein titers have improved from tens of milligrams to more than 10 grams per liter during the past 20 years, and at the same time, batch volumes have increased so that industry faces the real prospect of batch yields exceeding 100 kg of protein in the next decade (2). Over the same period, regulatory demands have become more onerous, and the pressure to reduce costs has increased as more biopharmaceuticals come off patent and overseas manufacturers begin to take an interest in Western markets (3, 4). Indeed, the unthinkable is increasingly becoming inevitable—biopharmaceuticals will at some point be regarded as commodities, and manufacturing on the ton scale will be necessary for certain products that are required in large, repetitive doses, such as topical antibody formulations.

There is no doubt that progress in the industry has been impressive, but the polished bodywork of bioprocessing hides an engine of despair that groans under the strain of current demands. Most of the productivity gains of previous decades have come from the upstream production phase, where more efficient bioreactors and better media formulations have shared the limelight with intrinsically more productive cell lines, themselves the result of more effective screening technologies for identifying high-yielding clones (5). Downstream processing is now routinely the bottleneck in biopharmaceutical manufacturing because its capacity has not kept pace with upstream production (1). In some cases, a lack of downstream capacity can seriously erode the profitability of a new pharmaceutical product and can even cause its failure. Manufacturers facing shrinking margins and less bountiful product pipelines are therefore looking for ways to increase the efficiency of downstream processing without inflating the cost of goods sold.

Running to stand still

Perhaps the major challenge facing the biomanufacturing industry today is that downstream processing has no economy of scale (i.e., higher titers translate linearly into higher manufacturing costs) (6). Looking back, the first 15 years of biomanufacturing can almost be considered a golden era, in which manufacturers had the luxury of using inefficient processes because the product itself was far more important (3). Most biopharmaceuticals were required in small doses, and demand was low enough to leave plenty of slack in the system. It was also pointless to invest in process efficiency when any tweak or modification would arouse the suspicious eye of regulators. It was better to let sleeping dogs lie and be satisfied with the status quo. In this environment, innovation was considered a burden rather than a bonus.

Inevitably, this relaxed attitude to process efficiency resulted in an immense amount of waste because up to 50% of product batches failed to meet specifications (3). To address this problem, FDA required that processes be designed with quality attributes taken into account (7, 8). The process was no longer merely a means of obtaining the product; the process became part of the product. As the economic screws tightened and demand increased, manufacturers turned to the age-old strategy of scaling up production to achieve cost savings. It was at this point that the industry began to flounder.

Whereas upstream production can be scaled up almost indefinitely by increasing the productivity of cells growing in a bioreactor, downstream processing has limits imposed by physics and chemistry. Downstream processing is driven by the mass of product: making more product per batch requires correspondingly larger volumes of buffer, larger storage tanks and preparation areas, larger filters, and, most importantly, larger amounts of chromatography media. For the production of antibodies (where Protein A resin is typically used in the primary capture step), the costs of scaling up are in some cases greater than the extra revenue made possible by the increased upstream productivity. Manufacturers thus find themselves in the paradoxical situation that there is no longer an economy of scale in manufacturing, but rather an economic depression reflecting the physical limits that constrain the size of the apparatus used in separations (e.g., chromatography columns and the associated piping, skids, and buffer reservoirs). So far, the extra demand has been absorbed by contract manufacturers offering their spare capacity, but this is a short-term measure that cannot cope with the predicted increases in demand as the industry faces hundreds of products in clinical development, all requiring at least pilot-scale manufacture according to GMP (9).
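The mass-driven cost argument above can be made concrete with a back-of-the-envelope model. The sketch below uses entirely hypothetical numbers (the binding capacity, resin price, cycle count, and buffer ratio are illustrative assumptions, not figures from the article) to show why per-batch consumable costs for a Protein A capture step grow linearly with product mass, so doubling the batch yield roughly doubles the cost:

```python
def downstream_cost(product_kg,
                    resin_capacity_kg_per_l=0.03,  # assumed binding capacity (~30 g/L)
                    resin_cost_per_l=10_000.0,     # assumed resin price, USD per litre
                    resin_cycles=100,              # assumed reuse cycles per resin lot
                    buffer_l_per_kg=5_000.0,       # assumed buffer volume per kg product
                    buffer_cost_per_l=1.0):        # assumed buffer cost, USD per litre
    """Rough per-batch consumable cost for a Protein A capture step.

    Every term is proportional to product mass, so the total is linear:
    there is no economy of scale in the consumables.
    """
    resin_l = product_kg / resin_capacity_kg_per_l           # column volume needed
    resin_cost = resin_l * resin_cost_per_l / resin_cycles   # amortized over cycles
    buffer_cost = product_kg * buffer_l_per_kg * buffer_cost_per_l
    return resin_cost + buffer_cost

# Doubling the batch mass doubles the consumable cost.
small = downstream_cost(10.0)
large = downstream_cost(20.0)
assert abs(large - 2 * small) < 1e-6 * large
```

The point of the model is structural rather than numerical: because resin volume and buffer volume both scale with the mass of product, no term in the total shrinks per kilogram as batches grow, which is exactly the missing economy of scale described above.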

How can this productivity dilemma be addressed? With constant scaling up no longer viable, the sector must return to its roots and innovate to succeed. Manufacturers are currently considering three solutions, all inspired in some way by a more encouraging regulatory landscape that rewards rather than punishes innovation: the streamlining of existing processes, the revival of simple technologies already employed in the bulk-chemical industry, and the adoption of innovative technologies from the biopharmaceutical research sector. This article examines each in turn. The innovative technologies have the potential to introduce game-changing processing options into an industry still mired in approaches that were state of the art 20 years ago. On a cautionary note, however, technologies from the research sector can fail, and the rash adoption of new and untested platforms can punish an eager company seeking innovative solutions. This is the new dilemma in downstream processing.