Speeding Up Formulation Development

Published on: September 1, 2017
Pharmaceutical Technology, Volume 2017 Supplement, Issue 4
Pages: s4-s11

QbD principles and strategic thinking can reduce the time required to optimize formulation. 

To help pharmaceutical and biotech companies improve their operations, FDA has been promoting the use of quality by design (QbD) (1, 2). Much has been written about the QbD concept of a “design space” from a manufacturing process perspective. Little has been published, however, about using QbD in formulation development, including the development of formulation design spaces. 

Formulation studies typically involve the optimization of multiple ingredients including the API, lubricants, binders, and disintegrants. Such optimization is a difficult challenge when a large number of components are involved, which is typically the case. 

An additional challenge is that, in many formulations, the amounts of the components must add up to 100% or some other fixed total. A pharmaceutical formulation, for example, might consist of 15% API, 35% lactose, 45% microcrystalline cellulose (MCC), 4% starch, and 1% magnesium stearate. Because the percentages add up to 100%, increasing or decreasing the level of one or more components requires a corresponding decrease or increase in one or more of the remaining components.
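As a simple illustration of this constraint, the following sketch (in Python, using the hypothetical five-component formulation above) raises the API level and rescales the remaining components proportionally so that the total stays at 100%. The function and the proportional rescaling rule are illustrative assumptions, not a procedure taken from the article.

```python
# Illustration of the mixture constraint: component percentages must sum to 100.
# The formulation below is the hypothetical example from the text.
formulation = {
    "API": 15.0,
    "lactose": 35.0,
    "MCC": 45.0,
    "starch": 4.0,
    "magnesium stearate": 1.0,
}

def set_component(formulation, name, new_pct):
    """Set one component to new_pct and rescale the others proportionally
    so the total remains 100%."""
    others = {k: v for k, v in formulation.items() if k != name}
    scale = (100.0 - new_pct) / sum(others.values())
    adjusted = {k: round(v * scale, 3) for k, v in others.items()}
    adjusted[name] = new_pct
    return adjusted

# Raising the API from 15% to 20% forces the excipients to shrink proportionally.
print(set_component(formulation, "API", 20.0))  # still totals 100%
```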

The statistical design-of-experiments (DoE) approach, which is at the core of QbD, has the flexibility to deal with multiple components subject to constraints of many different types. In the statistical approach, a series of formulations is created and tested in a planned sequence. The levels of the candidate components are varied in a blending design, and the performance of the formulations is measured. A model is fit to the data, and the critical components and their blending characteristics are identified.

Response surface contours are examined graphically and analytically to determine the design space, the region of component levels in which the responses meet specifications. Additional confirmatory formulations are typically tested to verify the model predictions. The statistical approach has many important benefits, as discussed by Montgomery (3), Snee and Hoerl (4), and many others; both formulation understanding and the probability of successful formulation development are greatly increased (4-7). This article discusses some of the issues involved and outlines a strategy for dealing with them. To clarify abbreviations used in the text, MCC refers to Avicel PH105, CP to Hiviswako 104, and PVP to povidone K90.

Thinking strategically

Use of DoE speeds up formulation development, but more is needed and more is possible if one thinks strategically about formulation development. 

Over the years, it has been recognized that experimentation is more effective when it is approached with a strategy in mind. To be effective, any strategy must recognize that the design, or sequence of designs, should match the experimental environment; that experimentation is sequential; and that the DoE tools must be embedded in the strategy and linked and sequenced to guide the experimenter. 

This experience leads to the following principles, which can enhance experimental strategies:

  • Plan ahead; define the series of experiments needed to satisfy the objective of the program.

  • At the beginning, include (or at least consider) all factors (Xs) that may possibly be important. Recall the Pareto effect: most of the variation will be caused by a small subset of the factors, so the important factors will be discovered as the experimentation proceeds and can be tested further in later experiments.

  • Don’t spend all resources on a single experiment; an issue is rarely resolved with just one.

A strategy that uses these principles was developed at DuPont in the 1960s and offered in public workshops beginning in the 1970s. In the case of formulation, this strategy identifies two experimental environments: screening and optimization. The objectives of each phase and the designs used are summarized in Table I.

Briefly stated, the screening phase explores the effects of a large number of ingredients (components) with the objective of identifying a smaller number of components to study further in optimization experiments. Additional screening experiments involving additional factors may be needed when the results of the initial screening experiments are not promising. Sometimes the screening experiment alone solves the problem.


Predictive model development

In the optimization phase, a predictive model of the system is developed and used to find useful operating conditions through response surface contour plots and, perhaps, mathematical optimization. The result is the design space.

The end result of each of these sequences is a completed project. There is no guarantee of success in a given instance, only knowledge that the DoE strategy will “raise your batting average” (8). The strategy used depends on the experimental environment, which includes the objectives of the experimental program, the nature of the components (Xs) and responses (Ys), resources available, quality of the information to be developed, and the theory available to guide the experiment design and analysis. A careful diagnosis of the experimental environment along these lines can have a major effect on the success of the experimental program.

Martinello et al. (9) presented a pharmaceutical tablet study that investigated a formulation of paracetamol, a compound known to have poor flowability (a measure of how well the material flows through the tableting equipment) and poor compressibility. The study involved seven ingredients, varied over the ranges shown in Table II.


Nine responses were measured; for illustrative purposes, the following discussion focuses on the repose angle response. A 19-blend extreme vertices design, shown in Table III, was used to define the formulations to be tested. The design was selected using the D-optimality criterion (a sketch of this type of selection appears after the questions below). As in any screening experiment, it is necessary to know which components are most important, as measured by their effect (positive or negative) on the response. This information enables one to answer the following questions:

  • Can we select a formulation based on the results of this particular experiment?

  • Is additional experimentation needed? If so, which components should be the focus of future experimentation?
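The article does not describe how the D-optimal design was generated, but the underlying idea can be sketched: starting from a candidate list of feasible blends, a greedy exchange repeatedly swaps design points to increase the determinant of X'X for the linear blending model. The candidate blends, design size, and algorithm below are illustrative assumptions, not the procedure used in reference (9).

```python
import numpy as np

def d_optimal_greedy(candidates, n_runs, seed=0):
    """Greedy exchange: pick n_runs rows from `candidates` (blends x components)
    to make det(X'X) large for a linear blending model. A toy sketch of
    D-optimal selection, not a production algorithm."""
    X = np.asarray(candidates, dtype=float)
    rng = np.random.default_rng(seed)
    chosen = list(rng.choice(len(X), size=n_runs, replace=False))

    def log_det(idx):
        M = X[idx].T @ X[idx]
        sign, ld = np.linalg.slogdet(M)
        return ld if sign > 0 else -np.inf

    improved = True
    while improved:
        improved = False
        for i in range(n_runs):
            for cand in range(len(X)):
                if cand in chosen:
                    continue
                trial = chosen.copy()
                trial[i] = cand
                if log_det(trial) > log_det(chosen):
                    chosen = trial
                    improved = True
    return sorted(chosen)

# Hypothetical candidate blends for a three-component mixture (rows sum to 1).
candidates = [
    [0.6, 0.3, 0.1], [0.5, 0.4, 0.1], [0.5, 0.3, 0.2],
    [0.4, 0.5, 0.1], [0.4, 0.4, 0.2], [0.4, 0.3, 0.3],
    [0.3, 0.5, 0.2], [0.3, 0.4, 0.3], [0.5, 0.35, 0.15],
]
print("selected blend indices:", d_optimal_greedy(candidates, n_runs=6))
```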

It is not unusual for a screening experiment to provide the information needed to choose a desirable formulation. Such was the case in this particular study. If additional experimentation is needed, the screening experiment results provide a firm basis for designing the optimization experiment. In this case, a seven-term linear blending model of the following form was developed (Equation 1). 

Y = b1X1 + b2X2 + b3X3 + b4X4 + b5X5 + b6X6 + b7X7

[Eq. 1]

where Y represents the response of interest (in this case, Y = repose angle); X1, X2, …, X7 represent the levels of the components; and the bs are regression coefficients associated with the Xs.
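Because Equation 1 has no intercept, its coefficients can be estimated by ordinary least squares applied directly to the component proportions. The sketch below shows one way to do this with numpy; the blend proportions and response values are placeholders, since the Table III data are not reproduced here.

```python
import numpy as np

# Hypothetical screening data: each row is a blend (proportions of 7 components,
# summing to 1); y is the measured response (e.g., repose angle).
X = np.array([
    [0.20, 0.30, 0.30, 0.10, 0.05, 0.03, 0.02],
    [0.25, 0.25, 0.30, 0.10, 0.05, 0.03, 0.02],
    [0.20, 0.35, 0.25, 0.10, 0.05, 0.03, 0.02],
    [0.20, 0.30, 0.25, 0.15, 0.05, 0.03, 0.02],
    [0.15, 0.30, 0.30, 0.10, 0.10, 0.03, 0.02],
    [0.20, 0.30, 0.30, 0.10, 0.05, 0.04, 0.01],
    [0.22, 0.28, 0.28, 0.12, 0.05, 0.03, 0.02],
    [0.18, 0.32, 0.28, 0.12, 0.05, 0.03, 0.02],
])
y = np.array([38.2, 36.5, 39.1, 37.8, 35.4, 38.0, 37.2, 38.6])

# Equation 1 has no intercept: Y = b1*X1 + ... + b7*X7,
# so least squares is applied directly to the proportions.
b, *_ = np.linalg.lstsq(X, y, rcond=None)
for i, coef in enumerate(b, start=1):
    print(f"b{i} = {coef:.2f}")
```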

Martinello et al. (9) developed equations such as Equation 1 for each of the responses and used the equations to develop an optimal formulation, which, when tested, produced measured responses that were very close to those predicted by the linear blending model. Thus, in this case, additional experimentation was not needed.

Calculating component effects

The principal output of a screening experiment is the set of component effects. The component effects shown in Table IV for the repose angle response were calculated using methods proposed by Cox (10). The linear blending model defined by Equation 1 fit the data with an adjusted R-square value of 82%, a respectable value for a screening experiment, and the overall model gave a statistically significant fit to the data (p < 0.001).
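The Cox approach measures a component's effect by moving that component away from a reference blend (often the overall centroid) along its "Cox direction," with the other components adjusted in proportion to their reference levels so the blend still sums to one. The sketch below illustrates this idea for a fitted linear blending model; the coefficients, reference blend, and component range are hypothetical, and the code is a simplified illustration rather than a reproduction of the calculations in reference (10).

```python
import numpy as np

def cox_direction_point(reference, i, xi):
    """Blend obtained by setting component i to xi and scaling the other
    components proportionally to the reference blend (Cox direction)."""
    x = np.asarray(reference, dtype=float).copy()
    scale = (1.0 - xi) / (1.0 - x[i])
    x *= scale
    x[i] = xi
    return x

def cox_effect(coefs, reference, i, lower, upper):
    """Predicted response change as component i moves from its lower to its
    upper bound along the Cox direction, using a linear blending model."""
    predict = lambda x: float(np.dot(coefs, x))
    return (predict(cox_direction_point(reference, i, upper))
            - predict(cox_direction_point(reference, i, lower)))

# Hypothetical three-component example: fitted coefficients, centroid
# reference blend, and the range of the first component.
coefs = np.array([40.0, 25.0, 55.0])
reference = np.array([1 / 3, 1 / 3, 1 / 3])
print(cox_effect(coefs, reference, i=0, lower=0.2, upper=0.6))
```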

In Table IV, note that Microcel and Aerosil have significant effects on the tablet repose angle. The Aerosil effect is negative and much larger in magnitude than the Microcel effect, which is positive. These effects are also seen in the component prediction profile plots in Figure 1; the effect lines for all components except Microcel and Aerosil have very small slopes, indicating little or no effect.

Interpreting component effects

When evaluating component effects to better understand formulation systems, the following are recommended:

  • Evaluate the component effects using the Cox effect directions to assess the nature and magnitude of the component effects as shown in Figure 1 and Table IV.

  • Study the component effects to identify components with similar effect values, which indicate similar blending behavior.

  • Assess whether similar blending of these components is supported by subject-matter science and business knowledge. Components whose effects are not statistically significant can be treated as having no detectable effect. The response to finding such components depends on the objectives of the experiment; options include setting the component at any desirable level within the range studied or, if zero is at or near the lower end of the range, removing the component from the formulation. In all cases, the selected action should take into account subject-matter science and business knowledge (8).


A formulation optimization case study

Hirata et al. (11) present an optimization case study focused on the development of a three-component sustained-release tablet of chlorpheniramine maleate. The objective of the study was to find a formulation with a release rate > 30 units. The ranges of the three components shown in Table V suggest the use of an extreme vertices response surface design.

These component ranges produce an experimental region that has six vertices (Figure 2). The region is irregular, and two pairs of vertices are close together (points 2 and 3, and points 4 and 5; Table VI). Figure 2 shows the six vertices, the six edge centroids, and the overall centroid, as well as the response surface contours that are discussed later.

The formulation experiment design was made up of the six vertices, the six edge centroids (including those of the two short edges), and the overall centroid, which was tested three times, for a total of 15 blends. The design and release-rate data are shown in Table VI. Five of the design blends (5, 6, 10, 11, and 12) have release rates > 30, which gives the scientist assurance that the objectives of the study can be met, even before any statistical modeling and analysis are done.
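The construction of such a design can be sketched in a few lines of code: enumerate the vertices of the constrained three-component region, form the edge centroids as midpoints of adjacent vertices, and add a replicated overall centroid. The lower and upper bounds below are hypothetical stand-ins for the ranges in Table V.

```python
import itertools
import numpy as np

# Hypothetical lower/upper bounds for the three components; the actual
# ranges are those in Table V and are not reproduced here.
lower = np.array([0.10, 0.05, 0.10])
upper = np.array([0.70, 0.40, 0.60])

def extreme_vertices(lower, upper, tol=1e-9):
    """Vertices of the region {lower <= x <= upper, sum(x) = 1} for three
    components: fix two components at bounds, solve the third from the sum
    constraint, and keep feasible, non-duplicate points."""
    verts = []
    for j, k in itertools.combinations(range(3), 2):
        free = 3 - j - k  # index of the remaining component
        for bj in (lower[j], upper[j]):
            for bk in (lower[k], upper[k]):
                x = np.zeros(3)
                x[j], x[k] = bj, bk
                x[free] = 1.0 - bj - bk
                if lower[free] - tol <= x[free] <= upper[free] + tol:
                    if not any(np.allclose(x, v) for v in verts):
                        verts.append(x)
    return verts

def edge_centroids(verts, lower, upper, tol=1e-9):
    """Midpoints of the polygon edges; each edge lies on one active bound."""
    mids = []
    for i in range(3):
        for b in (lower[i], upper[i]):
            on_edge = [v for v in verts if abs(v[i] - b) < tol]
            if len(on_edge) == 2:
                mids.append((on_edge[0] + on_edge[1]) / 2.0)
    return mids

verts = extreme_vertices(lower, upper)
mids = edge_centroids(verts, lower, upper)
centroid = np.mean(verts, axis=0)
# Candidate design: vertices, edge centroids, and the replicated overall centroid.
design = np.vstack(verts + mids + [centroid] * 3)
print(np.round(design, 3))  # 6 + 6 + 3 = 15 blends with these bounds
```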

Plots of predicted release rate versus component level for each component, sometimes referred to as the prediction profile, are shown in Figure 3. In this figure, CP has a negative effect while MCC and PVP have positive effects. The component linear effects, measured along the Cox axes, show that the CP effect (-16.5 units) dominates the MCC and PVP effects (7.9 and 9.5 units, respectively). In formulation optimization studies, the quadratic model is the model of choice, particularly at the beginning of the analysis process. The three-component quadratic blending model has the form shown in Equation 2:

Y = b1X1 + b2X2 + b3X3 + b12X1X2 + b13X1X3 + b23X2X3

[Eq. 2]

where Y represents the response of interest (in this case, Y = release rate); X1, X2, and X3 represent the levels of the components; and the bs are model regression coefficients.
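Fitting Equation 2 amounts to appending the three pairwise cross-product columns to the component proportions and running a no-intercept least-squares fit. The sketch below illustrates this; the design blends and release-rate values are hypothetical placeholders for the Table VI data.

```python
import numpy as np

def quadratic_mixture_matrix(X):
    """Model matrix for Equation 2: the three component proportions plus
    the three pairwise cross-products (no intercept)."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

# Hypothetical design blends (vertices, edge centroids, replicated centroid)
# and hypothetical release rates; the actual 15-blend data are in Table VI.
X = np.array([
    [0.10, 0.40, 0.50], [0.70, 0.05, 0.25], [0.10, 0.30, 0.60],
    [0.70, 0.20, 0.10], [0.35, 0.05, 0.60], [0.50, 0.40, 0.10],
    [0.10, 0.35, 0.55], [0.70, 0.125, 0.175], [0.525, 0.05, 0.425],
    [0.30, 0.40, 0.30], [0.60, 0.30, 0.10], [0.225, 0.175, 0.60],
    [0.408, 0.233, 0.358], [0.408, 0.233, 0.358], [0.408, 0.233, 0.358],
])
y = np.array([22., 31., 24., 27., 33., 20., 23., 29., 34., 21., 24., 30., 26., 27., 26.5])

M = quadratic_mixture_matrix(X)
b, *_ = np.linalg.lstsq(M, y, rcond=None)
print("coefficients b1..b3, b12, b13, b23:", np.round(b, 2))

# Predicted release rate for a new candidate blend.
new_blend = np.array([[0.55, 0.10, 0.35]])
print("predicted release:", (quadratic_mixture_matrix(new_blend) @ b).item())
```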

Evaluating fit 

The quadratic model (Equation 2) gives a good fit to the data, as judged by the adjusted R-square value of 94%. As expected, there was some significant curvilinear blending involving MCC and CP, and PVP and CP.

The centroid blend was evaluated three times. These replicate blends enable a test of the lack of fit of the model. The test was not significant (p = 0.638), providing further evidence of the adequacy of the quadratic model.
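The lack-of-fit test partitions the residual sum of squares into pure error, estimated from the replicated centroid blends, and lack of fit, and compares the two with an F ratio. A bare-bones version of that calculation is sketched below using hypothetical observed values, model predictions, and replicate labels.

```python
import numpy as np
from scipy import stats

def lack_of_fit_test(y, y_hat, groups, n_params):
    """F test for lack of fit: pure-error SS comes from replicate groups
    (blends with identical component levels); the remaining residual SS
    is attributed to lack of fit."""
    y, y_hat, groups = map(np.asarray, (y, y_hat, groups))
    sse = np.sum((y - y_hat) ** 2)          # total residual sum of squares
    ss_pe, df_pe = 0.0, 0
    for g in np.unique(groups):
        yg = y[groups == g]
        ss_pe += np.sum((yg - yg.mean()) ** 2)
        df_pe += len(yg) - 1
    df_lof = (len(y) - n_params) - df_pe
    ss_lof = sse - ss_pe
    F = (ss_lof / df_lof) / (ss_pe / df_pe)
    p = 1.0 - stats.f.cdf(F, df_lof, df_pe)
    return F, p

# Hypothetical: 15 observed release rates, predictions from a 6-parameter
# quadratic blending model, and one group label per blend (the three
# replicated centroid runs share the label 12).
y     = [22, 31, 24, 27, 33, 20, 23, 29, 34, 21, 24, 30, 26, 27, 26.5]
y_hat = [22.4, 30.6, 24.2, 26.9, 32.8, 20.3, 23.1, 29.2, 33.7, 21.2, 24.1, 29.8, 26.5, 26.5, 26.5]
groups = list(range(12)) + [12, 12, 12]
print(lack_of_fit_test(y, y_hat, groups, n_params=6))
```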

The next step is to construct the response surface contours to identify the formulations that will produce release rates > 30 units. In Figure 2, over the experimental region, CP levels below approximately 0.25 produce release rates > 30. Figure 2 shows contours for release rates of 18, 20, 24, 28, and 32, associated with decreasing CP levels moving from left to right. The curved contour lines result from the curvilinear blending noted above; components that blend linearly produce straight-line contours. The fit of the model was checked by running two confirmation experiments (blends 16 and 17, Table VI), which produced observed vs. predicted release rates of 32.7 vs. 30.0 and 20.7 vs. 20.0, respectively. These accurate predictions added further evidence of the adequacy of the model.
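One simple way to map such a design space numerically is to scan a grid of feasible blends, evaluate the fitted quadratic model at each grid point, and flag the blends that meet the release-rate target; the same grid of predictions can also be passed to a contouring routine (e.g., matplotlib's tricontour) to draw plots like Figure 2. The coefficients and bounds below are hypothetical, so the resulting CP cutoff will not match the approximately 0.25 found in the study.

```python
import numpy as np

# Hypothetical fitted coefficients for Equation 2 (b1, b2, b3, b12, b13, b23)
# and hypothetical component bounds for (MCC, CP, PVP).
b = np.array([28.0, -40.0, 35.0, 15.0, 10.0, 5.0])
lower = np.array([0.10, 0.05, 0.10])
upper = np.array([0.70, 0.40, 0.60])

def predict(x):
    """Quadratic blending model prediction for one blend (x1, x2, x3)."""
    x1, x2, x3 = x
    terms = np.array([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])
    return float(b @ terms)

# Scan a grid of feasible blends and keep those with predicted release > 30.
step = 0.01
passing = []
for x1 in np.arange(lower[0], upper[0] + 1e-9, step):
    for x2 in np.arange(lower[1], upper[1] + 1e-9, step):
        x3 = 1.0 - x1 - x2
        if lower[2] <= x3 <= upper[2]:
            blend = (x1, x2, x3)
            if predict(blend) > 30.0:
                passing.append(blend)

print(f"{len(passing)} grid blends meet the release-rate target")
if passing:
    print("highest CP level among passing blends:", max(p[1] for p in passing))
```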


The right data in the right quantity at the right time

The strategy of formulation development described in this article enables formulation scientists to speed up formulation development by getting the right data in the right amount at the right time. In the process, the formulation scientist’s “batting average” is raised (8).

Proven techniques and available software

The strategy and methodology outlined here have been used successfully and tested over time. They work, and software is available to ease the computational burden and to create graphics that aid in analyzing the data and visualizing the results. In short, the approach seems worthy of consideration by formulation scientists.

References

1. ICH, Harmonised Tripartite Guideline: Pharmaceutical Development, Q8, Current Step 4 Version, Nov. 10, 2005.
2. R.D. Snee, “Building a Framework for Quality by Design,” pharmtech.com, Oct. 2, 2009.
3. D.C. Montgomery, Design and Analysis of Experiments, 8th Edition (John Wiley and Sons, New York, NY, 2013).
4. R.D. Snee and R.W. Hoerl, Strategies for Formulations Development: A Step-by-Step Guide Using JMP (SAS Press, Cary, NC, 2016).
5. R.D. Snee, “Understanding Formulation Studies,” Technometrics, 37 (1), pp. 131-132, 1995.
6. R.D. Snee, “Understanding Formulation Systems-A Six Sigma Approach,” Quality Engineering, 23 (3), pp. 278-286, 2011.
7. R.D. Snee and G.F. Piepel, “Assessing Component Effects in Formulation Systems,” Quality Engineering, 25 (1), pp. 46-53, 2013.
8. R.D. Snee, “Raising Your Batting Average: Remember the Importance of Sequence in Experimentation,” Quality Progress, December 2009, pp. 64-68.
9. T. Martinello, et al., Int. J. Pharmaceutics, 322, pp. 87-95, 2006.
10. D.R. Cox, “A note on polynomial response functions for mixtures,” Biometrika, 58 (1), pp. 155-159, 1971.
11. M. Hirata, et al., “Formulation Optimization of Sustained Release Tablet of Chlorpheniramine Maleate by Means of Extreme Vertices Design and Simultaneous Optimization Techniques,” Chemical and Pharmaceutical Bulletin, 40 (3), pp. 741-746, 1992.

Article Details

Pharmaceutical Technology
Supplement: APIs, Excipients, & Manufacturing 2017
Vol. 41, No. 9
September 2017
Pages: s4-s11

Citation

When referring to this article, please cite it as R. Snee, "Speeding Up Formulation Development," Pharmaceutical Technology APIs, Excipients, & Manufacturing 2017 Supplement (September 2017).

About the Author

Ronald Snee is principal of Snee Associates, LLC, based in Newark, Delaware. He worked at DuPont for 24 years, and then as a consultant for companies that include Tunnell Consulting. He is an adjunct professor in the pharmaceutical programs at Temple and Rutgers Universities. He received his BA from Washington and Jefferson College and his MS and PhD degrees from Rutgers University, and can be reached at Ron@SneeAssociates.com.