Miniaturization in Pharmaceutical Processing

ePT--the Electronic Newsletter of Pharmaceutical Technology

Pressures to save API are driving formulation developers toward smaller-scale laboratory processes, while pressures to save time put a premium on more-accurate laboratory-scale tools.

As a result, noted the moderator of Monday's "Miniaturization in Pharmaceutical Processing" session, Colin M. Minchom, PhD (vice-president for pharmaceutical development services at Patheon, Mississauga, ON, Canada), formulators are groping toward miniaturized processes that model production-scale processing with increasing fidelity, despite the paucity of publications and rationalized information.

Peter York, PhD (of the Institute of Pharmaceutical Innovation at the University of Bradford, Bradford, UK) opened the session by formulating the formulation challenge: the industry demands robust, scalable formulations to support early clinical trials and maximize the number of screens from a minimum quantity of product, all while reducing time and cost.

Bruno Hancock, PhD (research fellow at Pfizer, Groton, CT), closed the session by saying, "We are using miniaturized tools on a daily basis to predict formulation performance."

According to Hancock, success may require a creative leap in framing questions and examining assumptions. Most of what we know about the behavior of bulk powders has been handed down from the mining and chemical industries, where bulk means a box-car load. Pharmaceutical materials change behavior with changes of scale, and one of the formulator's first tasks is to ask, What makes sense?

For example, said Hancock, hundreds of suppliers sell laboratory-scale formulation equipment of various types, but few devices are explicitly designed to reproduce the key characteristics of production-scale machinery. It is time, said Hancock, to "throw out some of the things we've used in the past" and replace them with rationally scaled-down devices directly derived from the process scale. The result, Hancock noted, need not look or work like its larger cousin, as long as it faithfully mimics the production machine's effects on the drug product.

The "Holy Grail" of this rational approach is, of course, accurate simulation - based on either first principles or well-designed and well-tested empirical models.

To the question of how small formulation-development batches can go, Hancock said the answer depends on the effect the researcher is trying to simulate. It makes no sense to try to model powder flow from the behavior of a single molecule, and the modeler must be conscious of how changes of scale dictate material properties across the continuum of size from the quantum level to bulk tablets. For example, the ratio of surface area to volume increases as the scale shrinks, so surface effects soon dominate. As a rule of thumb, said Hancock, formulators should strive for methods that yield predictable results from samples on the order of magnitude of a single dose. He offered examples, including predictions of powder flow from small shear-cell data and extrapolations of roller-compaction performance from "mini-ribbons."
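
To make the scaling point concrete, the short Python sketch below (an illustrative calculation, not one presented in the session) computes the surface-area-to-volume ratio of idealized spherical particles; for a sphere the ratio reduces to 3/r, so shrinking the diameter a thousandfold raises the ratio a thousandfold.

import math

def surface_to_volume(diameter_um: float) -> float:
    """Surface-area-to-volume ratio (per micrometer) of an ideal spherical particle."""
    radius = diameter_um / 2.0
    surface = 4.0 * math.pi * radius ** 2          # sphere surface area
    volume = (4.0 / 3.0) * math.pi * radius ** 3   # sphere volume
    return surface / volume                        # algebraically equal to 3 / radius

# Illustrative diameters only: from a coarse granule down to a micronized API particle.
for diameter in (1000.0, 100.0, 10.0, 1.0):
    print(f"{diameter:7.1f} um  ->  SA/V = {surface_to_volume(diameter):8.3f} per um")

The ratio climbs from about 0.006 per micrometer for a 1-mm granule to 6 per micrometer for a 1-um particle, which is the sense in which surface effects come to dominate as samples shrink.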

Participants in the symposium were:

Moderators:
Richard Todd Darrington, PhD, Boehringer Ingelheim Pharmaceuticals
Colin M. Minchom, PhD, Patheon

Miniaturization for Pharmaceutical Formulation and Processing - Can Going Small Resolve Big Challenges?
Peter York, PhD, University of Bradford

Miniaturization of Powder and Compact Characterization Techniques: How Small Can We Go?
Bruno C. Hancock, PhD, Pfizer, Inc.

Predicting Process Scale-Up From Laboratory Scale: Principles and Case Studies
Rajeev Garg, PhD, Abbott Laboratories

Is the Small Scale Predictive of Large Scale Performance? - Case Studies From High Shear Wet Granulation and Fluid Bed Processing
Conrad Winters, PhD, Merck & Co.