In September 2003, the US Food and Drug Administration (FDA) issued its final guidance document on the scope and applicability of 21 CFR Part 11 - partly in response to industry concerns that the breadth of applicability and the cost of Part 11 compliance had hindered the adoption of new technology. The guidance states that records must still be maintained in compliance with the underlying predicate rules, but that FDA will take a "risk-based" approach to enforcing some of the technical controls of Part 11 (such as validation, audit trails, record retention and record copying).
FDA will also include Part 11 in its formal review of the current good manufacturing practice (cGMP) regulations and exercise greater discretion in taking regulatory action for non-compliance. The intent is to return to its "best practices" regulations (together referred to as GxP), or predicate rule fundamentals, for the interpretation and enforcement of Part 11. These fundamentals involve systems for generating electronic records required in support of the GxPs, which encompass good clinical practice (GCP), good laboratory practice (GLP) and cGMP.
Prominently included in a firm's Part 11 remediation plans should be a risk analysis to account for how various systems that generate regulated electronic records could potentially affect the consumer's safety. Although there are many definitions of "risk," depending on one's industry and perspective, a useful description comes from the ISO/IEC Guide 51:1999: "A combination of the probability of occurrence of harm, and the severity of that harm." Whether applied to Part 11 or to other safety-related aspects of FDA-regulated products, the regulatory perspective for risk should focus on risk to product quality and/or public safety. Such products would obviously include foods and cosmetics, blood products and drugs, medical devices, and any other regulated products that are ingested, consumed by or applied to a living creature (human or animal). When a system generates electronic records that can greatly impact product safety and quality, or the integrity of regulated records, it is considered a "high-risk" system, and the technical controls of Part 11 that protect electronic record integrity would still apply. Otherwise, the system is considered to be "low-risk" and FDA will simply enforce the GxP requirements for protecting record integrity.
For manufacturers of drugs and medical devices, a risk-based approach to protecting product quality and public safety stems logically from the fact that Part 11 was predicated on the GxPs. For example, FDA expects a firm that is subject to GxP to develop a risk evaluation of its product and to then mitigate the identified risks. Identified risks may be addressed by technical fixes that effectively eliminate the risks or reduce the likelihood of occurrence and/or severity of consequences to acceptable levels. Risks for which there are no technical fixes may be addressed by including warnings in the accompanying product labelling. Other residual risks following mitigation may remain so minimal as to intrinsically be left at acceptable levels. Figure 1 lists some of the high versus low risk systems that generate GxP records, according to the recent International Society for Pharmaceutical Engineering (ISPE) white paper submitted to FDA, which addresses the risk-based approach to Part 11 compliance.
In fact, applying a risk-based approach to Part 11 compliance should be nothing new for regulated firms. A similar approach outlined in the quality system regulation (QSR) requires a company to perform a risk analysis of the various record-generating and record-keeping systems that maintain electronic records and/or implement electronic signatures. Such an analysis would also address a system's interactions with other interconnected systems, allowing the company to determine which records carry high-impact consumer safety issues. The firm would then evaluate the effects of identified risks and rank them according to their criticality.
Figure 1 A comparison of high and low risk systems that generate GxP records.
A practical example of applying risk analysis to Part 11 remediation is the use of quality data from a Part 11-compliant database for inclusion in a corrective and preventive action (CAPA) report. A spreadsheet typically generates this report, and whereas the spreadsheet formula should still be validated according to GxP, the overall relative risk to public safety is low. Therefore, the typical Part 11 technical controls (for example, audit trails) would not be required to protect the integrity of the spreadsheet. The spreadsheet itself, however, must be maintained and utilized in a current, validated state and be GxP-compliant.
Alternatively, adverse event reporting and clinical trial data that fall under GCP regulation can have a potentially high impact on public safety and the quality of a regulated product. Programs that analyse and visualize clinical data subsequently have an impact on record integrity. Such systems would be considered high risk and therefore should continue to incorporate the technical controls for Part 11 compliance as well as maintain predicate rule compliance.
In summary, Part 11 remediation has not changed for high-risk, GCP-related systems such as adverse event and case report form (CRF) data management systems; statistical analysis software (SAS); web trial systems; electronic patient diaries; patient randomization; and trial supply labelling systems. Both GCP and Part 11 definitely apply to these high-risk systems. In addition, FDA's Guidance for Industry on Computerized Systems Used in Clinical Trials remains in effect and applies to these systems as well.
Keep in mind that whereas Part 11 is an enforceable law, an FDA guidance document is not a law. Guidance documents present FDA's current thinking on a subject and are only a recommendation on how to proceed in addressing a law's requirements. Guidance documents are not binding on either the industry or the agency.
There are many risk assessment protocols or methodologies available originating from various industries (such as automotive, aerospace, defence and food). It behoves regulated manufacturers to build risk analysis into their quality processes from the start. FDA-regulated firms have commonly utilized several of these methodologies in the past. What follows is a discussion of the most common risk analysis methodologies, which is by no means all-inclusive.
FTA. A fault tree analysis (FTA) is a deductive, top-down method of analysing system design and performance. It involves specifying a "top event" to analyse, then identifying all of the associated elements in the system that could cause that top event to occur. Fault trees symbolically represent the combination of events resulting in the top event.
Events and gates in FTA are represented by graphic symbols such as AND/OR gates. Sometimes certain elements or basic events may need to occur together for that top event to occur. In this case, these events would be arranged under an AND gate, meaning that all of the basic events would need to occur to trigger the top event. If the basic events alone would trigger the top event, then they would be grouped together under an OR gate. The entire system, as well as human interactions, would be analysed when performing a FTA.
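The gate logic above can be sketched in a few lines of code. This is a minimal illustration, not a full FTA tool; the basic events, their truth values and the "data loss" top event are hypothetical examples, not drawn from the article.

```python
# Minimal fault tree sketch: basic events combined under AND/OR gates.
# Event names and values below are illustrative assumptions.

def and_gate(*inputs):
    """The event above an AND gate occurs only if ALL inputs occur."""
    return all(inputs)

def or_gate(*inputs):
    """The event above an OR gate occurs if ANY single input occurs."""
    return any(inputs)

# Basic events observed in the system (True = the event occurred)
power_loss = False
backup_failure = True
sensor_fault = True

# Hypothetical top event "data loss": requires power loss AND backup
# failure together, OR a sensor fault on its own.
data_loss = or_gate(and_gate(power_loss, backup_failure), sensor_fault)
print(data_loss)  # True: the sensor fault alone triggers the top event
```

Real fault trees extend this idea to full system designs, including human interactions, but the AND/OR composition is the same.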
Figure 2 A diagram of FMECA and FMEA analyses.
FMEA and FMECA. Failure mode effects and criticality analysis (FMECA) originated during the 1950s in the military and aerospace industries. Its basic concept is to categorize and rank potential process failures, or critical issues, and then to target the prevention of those critical issues (Figure 2). It is important to prioritize the potential failures according to their risks and then implement actions to eliminate or reduce the likelihood of their occurrence.
Failure modes and effects analysis (FMEA) originated in the 1960s and 1970s and was first used by reliability engineers. FMEA involves documenting and evaluating potential product or process failures, then identifying actions that could eliminate or reduce them. It is a systematic set of group activities built around documenting potential failure modes and their effects on product performance. FMEA is a tool that should identify failures before they occur, identify appropriate risk mitigation measures to prevent or otherwise control the failure, and ultimately improve product and process design.
An assumption is made that all product and process failures (and the actions required to control these failures) are predictable and preventable. Surprisingly, organizations still frequently experience predictable and preventable failures with costly consequences. These failures can lead to product recalls, death or injury, poor quality and unanticipated cost. Although the aerospace and defence industry have used FMEA for decades, FMEA has recently been making significant inroads into the biomedical device industry.
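A common FMEA scoring convention (not described in this article, but widely used) ranks each failure mode by a risk priority number, RPN = severity × occurrence × detection, with each factor scored on a 1-10 scale. The sketch below shows that ranking step; the failure modes and scores are hypothetical.

```python
# FMEA-style prioritization using the common risk priority number
# (RPN = severity x occurrence x detection, each scored 1-10).
# The failure modes and scores below are hypothetical examples.

failure_modes = [
    # (description, severity, occurrence, detection)
    ("Audit trail not captured",      9, 3, 7),
    ("Spreadsheet formula corrupted", 7, 4, 5),
    ("Backup tape unreadable",        8, 2, 9),
]

def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

# Rank failure modes from highest to lowest risk priority number,
# so mitigation effort targets the most critical issues first.
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)

for desc, s, o, d in ranked:
    print(f"RPN {rpn(s, o, d):4d}  {desc}")
```

The ranked list is the input to the mitigation step: the team works down it, eliminating or reducing the likelihood of the highest-RPN failures first.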
HACCP. A common methodology embraced by FDA is hazard analysis and critical control points (HACCP). HACCP received its start in the food arena and was initially developed in the 1960s by the Pillsbury Co., NASA and Natick Labs for the space programme to reduce the need to test the finished packaged product. Pillsbury made a commitment to improve on existing "good quality programmes" by using techniques developed to supply food to NASA's astronauts.
In 1996, the US Food Safety and Inspection Services Task Force (FSIS) developed an HACCP-based regulatory proposal that became the pathogen reduction/hazard analysis and critical control point systems (HACCP) rule. In this rule, FSIS determined that its food safety goal was to reduce the risk, to the maximum extent possible, of foodborne illnesses associated with the consumption of meat and poultry products. This involved ensuring that appropriate and feasible measures were taken at each step in the food-production process at which hazards might enter - developing procedures and technologies to prevent or reduce the likelihood of hazards occurring.
HACCP is made up of seven basic principles (see sidebar "The seven basic principles of HACCP") that enable the production of safe products. This is determined through the analysis of production processes, followed by the identification of all hazards that are likely to occur. Critical points at which these hazards may be introduced into the product are identified, followed by the establishment of critical limits for control at those points.
These prescribed steps are verified and finally methods are established by which the firm and the regulatory authority can monitor how well process control through the HACCP plan is working. Overall, risks are minimized by proper implementation of HACCP. It is understood that implementation of HACCP does not mean the absolute elimination of risks, but rather, one can prevent and reduce hazards to a degree that minimizes the risk to an acceptable level.
Risk-based compliance can analyse computer systems and information handling processes to assess not only risk, but also the cost of converting paper-based information to an electronic format. A good place to start is by performing a system assessment, then plotting your various systems and processes on a simple x-y matrix that measures, from low to high, the risk to security of the data (x-axis) and the cost of remediating (y-axis). Then prioritize those systems and processes needing upgrades or replacement, based on where they fall in the matrix. Computer systems, for example, that fall in the "high data security risk, low conversion cost" area of the matrix could be targeted first for compliance validation.
Having had to address Y2K issues, many organizations have already generated an inventory of all their computer systems and then (hopefully) evaluated them to determine the potential risk in the event of a computer error or failure. Companies with cost considerations and many non-compliant computer systems must, of course, prioritize which ones to remediate first. To sanely tackle this problem, one must estimate the data security risk for each system and the cost of Part 11 validation - then plot that system on a matrix. Systems and processes falling into the high risk category should get top remediation priority.
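The matrix exercise described above can be sketched as a simple sort: place each system in a quadrant by its risk and cost scores, then work through "high risk, low cost" systems first. The system names and scores below are hypothetical examples, and the 1-10 scale and threshold of 5 are assumptions for illustration.

```python
# Sketch of the risk-versus-cost prioritization matrix described above.
# System names, risk scores and cost scores are hypothetical.

systems = [
    # (name, data_security_risk, remediation_cost) on an assumed 1-10 scale
    ("Clinical data management", 9, 3),
    ("Trial supply labelling",   8, 8),
    ("CAPA spreadsheet",         3, 2),
    ("CRM system",               2, 7),
]

def quadrant(risk, cost, threshold=5):
    """Place a system in one of the four cells of the x-y matrix."""
    risk_level = "high risk" if risk >= threshold else "low risk"
    cost_level = "high cost" if cost >= threshold else "low cost"
    return f"{risk_level}, {cost_level}"

# Target "high risk, low cost" systems first: sort by risk descending,
# then by remediation cost ascending.
priority = sorted(systems, key=lambda s: (-s[1], s[2]))

for name, risk, cost in priority:
    print(f"{name}: {quadrant(risk, cost)}")
```

The first entries in the resulting order are the systems to remediate first; low-risk, high-cost systems fall to the bottom of the queue.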
The following is a typical criticality assessment (high to low) for non-clinical laboratory systems (source: Clarkston Consulting, 2003):
1. Systems for quality processes and standard operating procedures (SOPs).
2. Lab spreadsheets and databases for data collection.
3. Systems for other R&D data.
4. Central database for inventory management.
5. Systems for liquids processes.
6. Systems for company financials.
7. Systems for customer relationship management.
8. Systems for packaging.
9. Systems being decommissioned.
21 CFR Part 11 is not going away - FDA intends to enforce it. What has recently changed is the adoption of a narrower scope for the rule, a new understanding of enforcement discretion and the application of a risk-based approach to compliance. The important thing to remember about choosing a risk assessment protocol or methodology for Part 11 remediation is to use common sense. All of the methodologies mentioned in this article share the same basic premise: analyse processes, identify where in the processes the greatest potential for risk lies (to product quality and ultimately to public safety), put in place ways to mitigate those risks and document the entire endeavour. Whether adopting a standard risk assessment methodology such as FMEA or HACCP, or developing one's own, the important thing to remember is that FDA will show enforcement discretion if a well-documented plan is in place and true progress is being made against it.