This article has been published simultaneously in the April issues of Pharmaceutical Technology Europe and Pharmaceutical
The development of robust and rugged analytical methods is a vital part of the drug development process.1,2 It is essential to ensure that the levels of critical quality attributes (CQAs) present in batches of drug substance or drug product are accurately and precisely quantified, thus assuring patient safety (this paper focuses on low-level impurities in a starting material used in the manufacture of the API). The benefits of reliable methods in a manufacturing environment can be measured in terms of reduced process downtime and reduced overall costs,3 as the number of atypical results and method incidents will be minimised. Furthermore, robust analytical methods form a key part of the control strategy in Quality by Design (QbD).4,5
The goal of robustness testing is to evaluate the effect of small, deliberate changes in method parameters (e.g., temperature, % organic modifier and pH of the mobile phase, which are internal to the written method) on the qualitative and quantitative abilities of the method. Ruggedness testing,6 in contrast, evaluates whether noise factors4 (e.g., different analysts, different instruments and different lots of reagents, which are external to the written method) have an effect on the reproducibility of the method.
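As a sketch of how such deliberate changes are organised, a two-level full factorial design for the three internal parameters mentioned above can be generated with Python's standard library. The factor names and low/high settings below are illustrative assumptions, not values from this study:

```python
from itertools import product

# Hypothetical low/high settings around nominal method conditions
# (values are illustrative, not taken from the article).
factors = {
    "temperature_C": (35, 45),         # nominal 40 degC +/- 5
    "organic_modifier_pct": (28, 32),  # nominal 30 % +/- 2
    "mobile_phase_pH": (2.8, 3.2),     # nominal 3.0 +/- 0.2
}

def full_factorial(factors):
    """Return every low/high combination of the factors (2^k runs)."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

runs = full_factorial(factors)
print(len(runs))  # 2^3 = 8 experimental runs
for run in runs:
    print(run)
```

In practice a fractional factorial design is often preferred, because it screens the same factors in fewer runs at the cost of confounding some interaction effects; the full factorial above simply makes the structure of a two-level design explicit.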
Experimental design approaches (Design of Experiments; DoE) have been applied to analytical work (in particular to determining the robustness of analytical methods) for more than three decades. Validation of analytical methods in late-phase development has traditionally involved performing robustness testing as one of the last activities, after characteristics such as specificity, linearity, range, accuracy, precision and sensitivity have been studied. Although this is the standard approach reported in the literature and in ICH Q2 guidance,7,8 it carries some risks. For instance, method parameters that give rise to non-robust behaviour, and hence poor method performance, may only be identified at the end of the validation study. Moreover, if any part of the method then needs to be redeveloped, it is likely that the whole validation exercise would have to be repeated. In the time-constrained environment of drug development, where speed to market is an important criterion of success, delays of this type could be costly.9
DoE is not always used in the development of analytical methods;10 systematic approaches to HPLC method development, which involve scouting the key components of an HPLC method (e.g., column, pH and organic modifier), are often used.11 DoE can be used to further optimise the method conditions resulting from such scouting work, but this may not always be required.
The effects of method parameters that have not been studied as part of method development are unknown/unquantified until robustness
testing is performed. Therefore, it is desirable to check robustness for such methods in advance of final method validation.
Part of QbD involves identifying the highest-risk parameters that affect method performance.4 Risk assessment tools, such as fishbone diagrams,12 failure mode and effects analysis13 and prioritisation matrices, offer a simple, scientific means of achieving this. A key part of the process involves applying scientific knowledge, past experience and judgement to identify, score and prioritise method risk parameters.
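A minimal sketch of such a prioritisation exercise is shown below. The parameter names, scores and response weights are hypothetical assumptions chosen for illustration; a real assessment would use the team's own scoring scheme:

```python
# Hypothetical risk scores (1 = low impact, 5 = high impact) assigned
# by the method development team for each parameter against two key
# method responses. All values here are illustrative assumptions.
scores = {
    "mobile_phase_pH":      {"resolution": 5, "sensitivity": 3},
    "organic_modifier_pct": {"resolution": 4, "sensitivity": 3},
    "column_temperature":   {"resolution": 4, "sensitivity": 2},
    "injection_volume":     {"resolution": 1, "sensitivity": 4},
    "flow_rate":            {"resolution": 2, "sensitivity": 1},
}

# Assumed relative importance of each response in the overall ranking.
weights = {"resolution": 2, "sensitivity": 1}

def priority(param_scores):
    """Weighted sum of a parameter's scores across all responses."""
    return sum(weights[r] * s for r, s in param_scores.items())

# Rank parameters from highest to lowest weighted risk.
ranked = sorted(scores, key=lambda p: priority(scores[p]), reverse=True)
for p in ranked:
    print(p, priority(scores[p]))
```

The parameters at the top of the ranked list would be carried forward into the robustness design, while low-scoring parameters could justifiably be held at their nominal settings.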
The use of a risk-based approach to decide which parameters should be studied in a robustness exercise has been well documented.4 A variant of the prioritisation matrix has been used in this paper to rank method parameters in order of their likely impact
on key method responses (e.g., resolution of a critical peak pair and sensitivity to detect a key impurity). The data from
the risk assessment have then been used to construct a reduced form of a fractional factorial design or ‘Reduced Method Robustness’