Computer Validation Master Planning: Validation Strategies

Published in: Pharmaceutical Technology, Volume 2005 Supplement, Issue 6, 1 November 2005

Coordinating validation efforts throughout an organization requires an accurate and timely overview and a validation master plan (VMP).

During the last few years, industry has focused on having validated computer systems. But computer validation is not a new task, and the regulations demanding validated computer systems are not new either. The big questions are always, "How much validation is enough?" and "Which systems do we have to validate?" The answers to both questions lie in planning validation for the organization. Although the answers may sound simple, the solutions are often difficult and tedious to carry out.

Both the European (1, 3) and US regulations (4) require planning of validation activities. Some of the regulations imply the planning of validation activities, while others explicitly refer to planning documents, as in the European legislation, Annex 15. Guiding documents, such as the Good Automated Manufacturing Practice (GAMP) 4 guide (11), set forth requirements for planning documents. Because the European Union (EU) regulations and the GAMP guidance are the most comprehensive on the subject, the interpretations in this article are based on these documents.

Traditionally, planning validation activities has been dealt with on a case-by-case basis by producing validation plans for each system or for each validation project. In order to coordinate validation efforts throughout an organization and to plan activities and budgets at the highest level, management requires an accurate and timely overview. Fulfilling this need demands top-level planning documents, which, in this article, will be called the validation master plan (VMP).

It is worth remembering that planning validation activities and creating a VMP are not exclusive to computer systems, but are meant to be included for all regulated systems and processes in some form. Compliance for all systems should create business benefits as processes are continuously optimized and downtime is reduced.

This article discusses regulatory requirements and gives an interpretation of how they can be implemented by defining an approach to a VMP. This article focuses on computerized systems, but also covers the subject of the VMP from a general perspective. Many of the statements in this article are valid for general validation purposes.

Planning for validation

Validation activities should be planned. This statement makes sense; one would not be able to perform any activity satisfactorily without having a predefined goal and a prescribed way of completing the activity. It also makes sense from a business point of view because resources should be allocated, and validation activities should not get in the way of normal business flow and production (or vice versa, depending on your function and responsibility).

The terminology used in the validation field is not as uniform as one might wish; the terms are very much subject to interpretation. In fact, the author of this article has seen a difference between the way industry terminology is used in the EU and the way it is used in the US. This article attempts to settle one of the arguments by setting some of the definitions straight and by giving interpretations that we can all live with and benefit from.

The US Food and Drug Administration calls for a planning document in its guidance for process validation (9, 7). This planning document has been implemented by industry and is widely used for every validation project or system to be validated. The EU regulations (3) also reflect this planning for validation as an essential part of the regulations, but the system level or project level is only one tier in the organization.

Validation activities should be planned in several layers, and adequate procedures and controls must be in place for each of these layers. In order for any organization to work successfully, strict and effective planning is required on all levels. The goal for an organization must be to have the appropriate level of control for each layer of validation activity. That is why management calls for a tool to create an overview of the validation activities required within the organization in order to plan and prioritize resources. This tool is what in Europe is known as the validation master plan (VMP).

From the literature, it is evident that the term VMP is often used in the US for the planning of validation activities of a specific project or system (15, 10, 14, 12, 18). The EU regulations (3), the Pharmaceutical Inspection Convention/Cooperation Scheme (PIC/S) Guidance (16,17), as well as the GAMP 4 Guide (11), however, use the term for a higher level document covering the entire organization, and this is supported by Kropp as well (13).

Where the US regulations are a bit vague on how companies should document control of their validation planning, the EU regulations and GAMP 4 are much more specific on how to accomplish the expected level of control.

Considering the different layers of control, it seems sensible to develop a document hierarchy that reflects the need for controls in the different layers. Such a model would take into consideration the difference in the level of detail between the layers and in the number of documents within each layer.

The more detail a layer of documents contains, the more documents are needed in that layer to describe all the systems in the organization. This is reflected in the model in Figure 1, in which the triangular shape of the hierarchy shows that one validation plan for a project or system can cover many protocols, and one VMP can cover many validation plans.

Figure 1: Document hierarchy of validation.

In the EudraLex Vol. 4 Annex 15 (3) there is a very good list of what should be included in a VMP, and this is—almost to the letter—reflected in the GAMP 4 (11). The contents of these lists can be seen in Table I.

Table I: Required contents for a validation master plan.

This outlines the VMP as a high-level planning document needed by management to map out activities and resources. The VMP sets the scene for all of these activities by providing a procedural framework within which validation activities should be performed.

The VMP should state how you intend to validate systems, including which methodology and which procedures to use. Furthermore, there should be a plan for validating every relevant system in the organization based on an inventory of systems.

The contents described in the EudraLex are not necessarily required to exist in one document, because with the phrase "or equivalent documents," the regulation gives companies the freedom to document content as they choose.

Organizations then face the challenge of implementing this plan in the most efficient and operative way without invalidating their own quality systems. The danger is that organizations produce a procedural document that is not supplemented or supported by the regular procedure system, and thereby risk overlapping or contradictory procedures.

Validation approach

As outlined above, a very important part of the VMP is the validation policy, or how the organization intends to approach each aspect of a validation. The terminology used can often cause discussion, and, to avoid any misunderstandings, the word "validation" is used in this article to encompass all documented testing activities related to implementing and improving systems, including all qualification activities. Both the EU and US regulations provide definitions of the terms "validation" and "qualification."

The first question to look at is not, however, "how to do it," but rather, "what are we looking at?" In many cases, validation ends up being a paperwork exercise because the scope, requirements, and approach to validation are not adequately defined up front. So, what are we looking at? The important thing is to ensure that business processes run consistently, produce the desired output, and that we are able to trust the consistency of that output. One way of illustrating that is by applying a process model to validation thinking (see Figure 2).

Figure 2: An illustration of some of the factors and subprocesses that influence the overall manufacturing process in a pharmaceutical organization.

Many factors and subprocesses influence the overall manufacturing process in an organization. The goal must be to have an overall process that is validated. No one is able to "eat this elephant" in one bite, and the only way to get this overall process validated is by validating every subprocess and the factors influencing them. This is illustrated in Figure 3.

Figure 3: An illustration of a subprocess and the factors influencing it. The gray square designates a computerized system.

Sometimes a computer is one of the factors influencing a given subprocess; the system supporting the process is then considered a computerized system.

Computers can be qualified separately, as is done when qualifying the platform and hardware and installing the software, but it does not make sense to claim a validated computer system without looking at the process the system supports. How can you prove that the system will give the expected result when used in the process if the process has not been taken into consideration?

The process approach makes it possible to take a risk-based approach to validation. You may be able to cut down on validation efforts in areas less significant to the process.

The risk-based approach is not an easy path to take, and it requires considerable effort up front. It requires a very thorough knowledge of the processes supported by the system, understanding of how different factors influence the process, and familiarity with the behavior of the system during the process.

Learning about systems like this will, in the end, be a valuable experience for the organization. Using this information well is the essence of quality management and will lead to a better quality process. It will open the organization up to continuous process and system improvements, which, in the end, will lead to reduced downtime, and thus will save money. This optimization does not happen overnight, and it must be documented. The trick is to document wisely.

The implementation of systems must be done according to a lifecycle approach, taking the entire lifecycle of the system into consideration. If not, you may find yourself with a system that works beautifully at installation, but ends up killing the process. In other words, systems and processes change over time, and system implementation should be prepared for change.

There are many approaches to lifecycle models. One of the most well known is the V-model proposed by the GAMP Forum (11). The GAMP V-model, in itself, does not fully cover the entire lifecycle of a computerized system. A modification of the GAMP model, however, can solve this problem, as can be seen in Figure 4, where the model reflects the project-like nature of implementation and the cyclical nature of change handling.

Figure 4: A lifecycle model based on the GAMP 4 model. This chart takes the full lifecycle of the system into consideration. It can be seen as a flattened V-model in which the change-control loop is illustrated and can lead to varying levels of requalification. The figure also illustrates that the organization's approach to the entire lifecycle should be reflected in the VMP. The steps of process development and process validation have been included to emphasize that they should be considered in the VMP: although they are usually not part of a specific system or equipment implementation, they usually form the reason for acquiring the system.


The VMP must reflect the organization's approach to all the steps of the lifecycle, including change control. Procedures should be in place for the documentation of every step as well as the documentation describing how the different steps are aligned and work together. This document also should include who, in general, is responsible for each step. The lifecycle approach is not specific to computerized systems and may be applied to any kind of system.

The model in Figure 4 includes an implicit timeline, but it does not reflect that the validation of, for instance, a laboratory system consists of several steps, some of which should be considered separately, whereas others can be done jointly with great benefit.

In the case of a laboratory analyzer, the hardware can usually be tested partly by using the software and vice versa, so it does not make sense to look at the software as something separate and do the same testing twice. The timeline illustration in Figure 5 shows how the different steps can be combined to reduce the validation effort.

Figure 5: A validation project timeline illustrating how the various steps overlap and can be combined with benefit.

A uniform way of handling documentation is beneficial for an organization, and thus documentation format and approach to the various steps in the lifecycle should be addressed.

This may seem like a waste of effort to some, because the documentation can be read no matter what format it is in, but a uniform approach makes future use of the validation documentation much easier: a specific piece of information can always be found in the same place in the documents. Furthermore, a uniform format makes it easier to ensure that all items have been covered, and review becomes significantly easier once the organization has grown accustomed to it.

The documentation should be structured, and the hierarchical structure of the documentation should be described in the VMP. All that has been designed or foreseen in the validation plan for a specific project should be answered and discussed in the validation report. A similar structure should be found on all levels of the documentation hierarchy. The expected flow of documents should follow this hierarchy through the different levels of detail, as illustrated in Figure 6, where the highest level of detail is naturally observed during the execution of validation activities.

Figure 6: Document flow and hierarchy of documents. Documents generated before validation are always directly referred to after validation in a document with a similar level of detail. The validation master plan is the top document and provides a framework for all the other documents.

How much do we really need to do?

Once the approach to validation has been established, the next big question comes up: how much do we really need to do? The answer is, on the one hand, fairly easy: "enough!" But that answer is not very helpful when you have an organization full of systems that all need validating. The more helpful answer would be "that depends," as any consultant would say. This hits the core of the matter in a risk-based approach.

Systems must be assessed in different ways to estimate how critical they are to the organization and the processes and to provide prioritization of the different systems, because everything cannot be done at once. The VMP should describe the different assessment methods that an organization uses to evaluate risk related to systems and to prioritize those systems. The most important of these assessment methods commonly used in industry include:

  • GXP (Good Manufacturing Practice, Good Clinical Practice, Good Laboratory Practice) risk assessment

  • 21 Code of Federal Regulations (CFR) Part 11 assessment

  • GAMP SW/HW Category

  • Business criticality

  • Vendor assessment

  • Gap analysis for existing systems

Each of these will be discussed in more detail below. The important point here is that they all help in giving an impression of how complex and critical a system or part of a system may be. This information should be used to determine how much effort and attention should be given to a specific system.

There is no fixed line to be drawn regarding required documents, but as long as decisions to document systems to a certain level can be justified, the organization has come a long way.

One prerequisite for doing any of these assessments is the identification of a system owner. The system owner is the person with the regulatory responsibility for the process or system and is the person responsible for keeping the system in compliance with regulations.

System assessment

GXP assessment. The GXP risk assessment is used to determine to what extent the system supports a process that is GXP-critical. In other words, the assessment should give a score to all systems that are GXP-relevant and rate them according to criticality or risk. The assessment should be semiquantitative, giving a rough guideline on how important the system is from a GXP standpoint. The following topics should be handled:

  • Which criteria impact product quality?

  • What risk does this pose to the patient?

  • How does the use of computer systems affect this (data integrity)?

  • Can the system be affected directly?

  • Can the system be affected indirectly?

  • What would be the impact if the system failed?

The GAMP guidance (11) gives a good description of how to do this in its Appendix M3.
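To make the idea of a semiquantitative assessment concrete, the following sketch shows one possible scoring scheme. The criteria, weights, answer scale, and example answers are hypothetical illustrations, not taken from the GAMP appendix; each organization must define its own.

```python
# Hypothetical semiquantitative GXP risk score. Criteria and weights
# are illustrative only; an organization must define its own scheme.
GXP_CRITERIA = {
    "product_quality_impact": 3,   # system directly touches product quality
    "patient_risk": 3,             # failure could pose a risk to the patient
    "data_integrity_impact": 2,    # system creates or alters GXP records
    "direct_process_control": 2,   # system directly controls the process
    "indirect_influence": 1,       # system indirectly influences the process
}

def gxp_risk_score(answers):
    """answers maps each criterion to 0 (no), 1 (partly), or 2 (yes);
    returns a weighted sum usable for rough ranking of systems."""
    return sum(GXP_CRITERIA[c] * answers.get(c, 0) for c in GXP_CRITERIA)

# Example: a (hypothetical) chromatography data system
cds = {"product_quality_impact": 2, "patient_risk": 1,
       "data_integrity_impact": 2, "direct_process_control": 0,
       "indirect_influence": 2}
print(gxp_risk_score(cds))  # 15
```

The resulting number has no absolute meaning; it is only useful for comparing systems against each other when prioritizing.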

21 CFR Part 11 assessment. Once the level of GXP risk has been established, the applicability of 21 CFR Part 11 should be considered. Given the final guidance from the FDA on the scope and application of Part 11 (8), the evaluation has become more straightforward than when the rule itself (6) was first published. First of all, companies must establish whether controls should be applied for e-records only, for both e-records and e-signatures, or for neither. The approach is illustrated by the decision tree in Figure 7.

Figure 7: A decision tree used to evaluate the applicability of 21 CFR Part 11.

A similar approach is in order for compliance to EudraLex Annex 11 (2), which basically requires the same as 21 CFR Part 11.

If the outcome of the decision tree ends with a need to comply with 21 CFR Part 11, then the system should be assessed in detail with regard to the controls given in the rule. It should be noted, however, that not all Part 11 controls are needed for all systems.
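The logic of such a screening can be sketched in simplified form. The branch order and return values below are assumptions made for illustration; the actual decision tree in Figure 7 and the FDA guidance should govern real assessments.

```python
def part11_applicability(is_gxp_record, record_is_electronic,
                         electronic_replaces_paper, uses_e_signatures):
    """Simplified, illustrative Part 11 screening logic. Not a
    substitute for the FDA scope-and-application guidance."""
    if not is_gxp_record:
        return "Part 11 not applicable"
    if not record_is_electronic or not electronic_replaces_paper:
        return "Part 11 not applicable (paper remains the master record)"
    if uses_e_signatures:
        return "Apply controls for e-records and e-signatures"
    return "Apply controls for e-records only"

# Example: a GXP system keeping electronic master records, no e-signatures
print(part11_applicability(True, True, True, False))
# Apply controls for e-records only
```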

Focus should be maintained on keeping records secure and trustworthy. Part 11 puts forth some very good tools to do this in many cases, but not in all. So, care should be taken to find solutions that fit the specific purpose of the system and use of the records and signatures. Ultimately, an organization wants to be able to trust the electronic records they use. This should be included as a requirement for the system and, therefore, be proven during validation regardless of whether Part 11 is applicable or not.

GAMP category. In order to evaluate the complexity of any system, it is advisable to use the categories outlined by the GAMP guide (11). The division of systems or software into the five categories gives a good understanding of how complex the software is and, thereby, how much effort should be put into the testing done on the user side. From a system owner or user standpoint, a given system should always be tested in detail, but, depending on the complexity and on the amount of testing done and documented by the vendor, the user will do a variable amount of testing.

For low-complexity standard systems in GAMP category three, only the critical functions used need be tested, and a good deal of documentation can be leveraged from the vendor. For highly complex, custom-made category-five systems, on the other hand, the user must do much more in-depth testing on site to prove the reliability of the system.

The GAMP categories merely define guiding boundaries and should not be used without room for interpretation based on a justified risk assessment. Many systems are difficult to categorize, and the boundaries can be hard to draw, so care should be taken to justify the approach in every case.

The GAMP guidance also has two categories for hardware: one for standard hardware and one for customized hardware. In most cases, systems will consist of a host of standard hardware components, but it is something that should be considered when looking at the complexity of a system.

Business criticality. Just as the GXP risk has an influence on how much effort should be given to a specific system, so the business criticality should have an immense impact on the level of validation testing. Although GXP requirements for validation often regulate the need for validation, the business criticality assessment should actually rule the need for stable and trustworthy systems. If, for instance, a system is crucial to the existence of the organization, the organization had better make sure that the system is trustworthy and actually leads to the expected result, regardless of whether this system is a GXP system or not.

One could argue that the GXP risk is only one of the factors that can contribute to the overall business criticality. Devising a semiquantitative way of measuring business criticality will help in making the decision for the level of validation, but it will also help in prioritizing systems when it comes to both planning validation and planning disaster recovery. The higher the business criticality, the higher will be the need for getting the system up and running again and the higher will be the need for having a solid business continuity plan in case of disaster.

Vendor assessment. The vendor of a specific system has a great impact on the quality and the trustworthiness of the system. Therefore, the ability of the vendor to provide the required service or system should always be assessed. For GAMP categories four and five, this is usually done through a formal audit of the vendor, but there are many other ways of assessing a vendor.

Vendors supplying systems in GAMP category three, for instance, are usually well known suppliers, and their systems have been supplied in large numbers, which means that major bugs have usually been found. The reputation of the vendor in these cases can, to some extent, serve as part of a vendor assessment. Systems should not be bought from vendors who are not trusted by the rest of industry, unless a thorough audit or some other assessment shows that the vendor is capable and has an adequate quality system to support the development process. In order for industry to implement trustworthy systems, those systems must come from trustworthy vendors and developers.

Another perspective in vendor assessment is the possibility to leverage test documentation from the vendor. The vendor should do in-depth testing of the systems it supplies, and the system owner should ensure that the system has been adequately tested: either at the user site, or at the vendor site, or a combination of the two. When documentation of vendor testing is available, either through audit or in other ways, this documentation can be leveraged and used to cut down on the testing otherwise needed by the system owner, including any relevant Part 11 issues, to prove the trustworthiness of the system. There is no need to do things more than once if they are well documented.

Gap analysis for existing systems. When estimating the need for validating existing systems, it is important to assess their existing level of compliance, also called a gap analysis. Most organizations have systems that may not be perfectly up to current standards. These systems should be brought up to current standards, and the assessment of their level of compliance will indicate the degree to which they should be prioritized. However, things that have been tested before need not necessarily be tested again. Most important, though, is an evaluation of whether the systems actually comply with the predicate rules or whether replacement should be considered.

Reducing the workload. No one wants to do any work they do not have to do. From a business point of view, this makes sense because redundant work is a waste of money. Validation efforts should be focused where they really matter. Assessments can be a big help in getting efforts focused to reduce the workload.

Planning should be used to group systems so that any synergies from shared knowledge can be utilized. Testing of the systems should be done in a way that minimizes redundancy in testing. Focus should be on the critical parts of the systems and only they should be tested thoroughly. Less critical parts of the system may be tested less rigorously.

Documentation should be leveraged from the vendor as much as possible. The possibility of doing so depends on the result of the vendor assessment. Ultimately, the system owner must ensure that the system fulfills all requirements for a stable and trustworthy system. The system owner must be able to document the fulfillment of these requirements either by documentation produced in house, by that documentation provided by the vendor, or by a combination of the two.

Putting together the validation master plan

As mentioned earlier, the VMP need not be only one document. It is quite a challenge to put so much information into one document without breaking the boundaries of the documentation and procedure systems already existing in companies today. The information required as part of the VMP can be divided into two parts—a procedural part and a system inventory status and planning part.

Figure 8 illustrates a document structure that could be used to achieve the appropriate level of information without bypassing existing procedure and documentation systems.

Figure 8: The conceptual structure of the documents relating to the validation master plan (VMP) at various levels and the hierarchical structure and dependencies of the procedures and validation plan against the VMP.

The VMP, as a document, should be seen as the framework for validation. It should be a very high level procedural document referring to organization policies, guidelines, and procedures, where applicable, instead of merely repeating the information given elsewhere.

The VMP can be split into several documents reflecting the organizational structure. In some cases, it may be relevant to have the VMP split in a corporate tier, a site tier, and a department tier, but any relevant approach can be used. All these documents must help in providing the same framework for validation activities.

The patchwork approach. One approach to a VMP structure for the organization could be the patchwork approach, as outlined in Figure 9. In the patchwork approach, several types of VMPs are combined to give an overall picture. This is an advantage for organizations where there are big differences in the approaches taken to validate, for instance: computerized systems, general laboratory systems, and production systems.

Figure 9: The patchwork approach to the validation master plan, in which the VMPs refer to procedural documents for regular validation activities and computer system validation (CSV) activities, respectively.

When system owners give input to and receive input from several VMPs, confusion can arise, because systems can fall into more than one category. Multiple VMPs can also become a source of redundant information and be difficult to keep updated at all times.

The all-inclusive approach. One way of overcoming the confusing element of having several VMPs is to combine the different types of systems into one VMP. This is called the all-inclusive approach as illustrated in Figure 10.

Figure 10: The all-inclusive approach, in which the VMP refers to all procedures for validation activities.

Here, the validation approach for each different kind of system is outlined, and references can be found to the different procedures covering the varied validation areas. This approach requires that the organization have a similar approach for each kind of system and that the documentation systems used be the same. The all-inclusive approach is typically much easier to use for system owners because they will have only one source to keep updated and only one framework to work within. Similarly, this approach makes it easier for management to plan all activities in one workflow. The level of detail, however, can be split in several layers, depending on the organizational structure of management.

Using database solutions. An inventory of all systems, including the validation status, identification of system owner, and priority rating, must be generated and maintained as part of the VMP. When starting work on a VMP, the inventory or status list is the starting point. It gives the organization a basis upon which to build the VMP and prioritize their systems. Once the status of all systems is known, the needed validation activities can be planned.

There are many ways of producing this system inventory or status list, but because many organizations still use paper as the main documentation medium, a solution generating a paper list to attach to the procedural part of the VMP should be considered. Keep in mind that maintaining an extensive paper list of all systems within an organization can be quite cumbersome. Many organizations already use a system inventory to create an overview of maintenance activities and similar tasks, and such database solutions can, in most cases, provide the required information for the VMP inventory or status list.

Databases provide strong tools to sort information and to provide an overview of the required data. In order to utilize the information in the database directly, however, a fully validated and compliant database system must be used.
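As a sketch of how a database can support this, the following example builds a minimal system inventory in SQLite and extracts a status list of open items ordered by priority. The table layout, system names, owners, and status values are entirely hypothetical.

```python
import sqlite3

# Minimal, illustrative system-inventory table; real column names and
# status values would come from the organization's own VMP procedure.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE inventory (
    system TEXT, owner TEXT, status TEXT, priority INTEGER)""")
con.executemany("INSERT INTO inventory VALUES (?, ?, ?, ?)", [
    ("LIMS",       "QC manager",    "validated",     2),
    ("MES",        "Prod. manager", "gap analysis",  1),
    ("HPLC no. 7", "Lab owner",     "not validated", 3),
])

# Status list for the VMP: open items only, highest priority first.
rows = con.execute("""SELECT system, owner, status FROM inventory
                      WHERE status != 'validated'
                      ORDER BY priority""").fetchall()
for system, owner, status in rows:
    print(f"{system:12} {owner:14} {status}")
```

Such a query is the kind of sorting and overview capability the paragraph above refers to; the printed list could serve as the paper record attached to the VMP.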

Figure 11: Using a database to provide input into the validation master plan.

One way of lessening the validation requirements for the database solution could be to use the database system merely as an administrative system that produces a paper record, which is then approved as part of the VMP. The database must still be validated in order for the organization to trust the output and use the database for other purposes such as maintenance activities and the like.

Consideration should be given to the fact that the workflow surrounding the VMP can be made considerably easier if the database information can be used directly as electronic records and the VMP simply refers to the database for the always-current and approved version of the inventory.

The database must be updated to produce the inventory at given intervals, and approvals of any changes to the information by electronic signatures are a small additional effort. Furthermore, being able to refer to the electronic version cuts down the need for updating the VMP document because the procedural part of it is not often changed.

The real life scenario. Organizations that do not already have a VMP in place should start by completing an inventory. This shows the organization's compliance status and provides a solid base on which to build the plan; the status inventory is a high-level gap analysis in itself. All systems should be assessed by the different assessment methods listed above to provide an overview of the risk they pose to the organization and its processes, and of the gaps that should be filled. The systems should then be prioritized so that critical systems are taken first. Some kind of combined value, produced as a sum of the scores from the different assessment methods, can be very useful as guidance. Focus should be on the controlled processes, product quality, and patient safety.
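A combined value of this kind can be sketched as a plain sum of assessment scores. The 0-5 scales, the individual scores, and the system names below are hypothetical; a real scheme would be defined and justified in the VMP.

```python
# Illustrative combined priority value: a plain sum of (hypothetical)
# assessment scores, each normalized to a 0-5 scale.
def combined_priority(gxp_risk, part11_exposure, gamp_complexity,
                      business_criticality, compliance_gap):
    return (gxp_risk + part11_exposure + gamp_complexity
            + business_criticality + compliance_gap)

systems = {
    "MES":        combined_priority(5, 4, 5, 5, 4),  # critical, big gaps
    "LIMS":       combined_priority(4, 4, 4, 4, 1),  # critical, few gaps
    "HPLC no. 7": combined_priority(3, 2, 2, 2, 3),  # easy to solve
}

# Highest combined value first: critical systems are taken first.
for name, score in sorted(systems.items(), key=lambda kv: -kv[1]):
    print(name, score)
```

A simple sum weights all assessments equally; an organization could equally well justify a weighted sum that emphasizes patient safety and product quality.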

Where can we get the best return when focusing our validation efforts? Critical systems should be taken first! New systems should always be validated at implementation and, therefore, all else being equal, new systems should get priority over legacy systems. After the critical systems, the less critical but easy-to-solve systems should be addressed; these deliver the most compliance benefit per dollar. The prioritization continues until all systems have been ranked.
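The combined-score prioritization described above can be sketched in a few lines. The system names, the three assessment scales, and the bonus for new systems are all hypothetical illustrations of summing assessment values into one ranking figure; the article does not prescribe any particular weights.

```python
# Illustrative prioritization sketch: each system is scored by several
# assessment methods. The scales (1 = low, 3 = high) and the new-system
# bonus are hypothetical choices, not taken from any guideline.
systems = {
    #              (patient_safety, product_quality, process_impact, is_new)
    "LIMS":        (3, 3, 2, False),
    "ERP":         (1, 2, 3, False),
    "New MES":     (2, 3, 3, True),
    "Spreadsheet": (1, 1, 1, False),
}

def combined_score(safety, quality, process, is_new):
    """Sum the assessment values; new systems get a bonus so that,
    all else being equal, they are validated at implementation."""
    return safety + quality + process + (1 if is_new else 0)

# Most critical systems first; the easy-to-solve tail follows later.
prioritized = sorted(
    systems, key=lambda name: combined_score(*systems[name]), reverse=True
)
print(prioritized)
```

The resulting ordered list is exactly the kind of input management needs when mapping the prioritized systems onto the calendar and available staff in the next planning step.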

The next step is to plan specific actions. Management should take a look at the calendar and the available staff and see when activities can be performed in the prioritized order. It is very important that deadlines be set for all systems. It is even more important that the deadlines be realistic!

FDA and other authorities expect companies to show progress according to the planned deadlines. On the other hand, the milestones should reflect the risk associated with the system. For instance, it is not acceptable to have an extended deadline for a system that can pose a great risk to patient safety.

Figure 12: Conceptual illustration of the ongoing process of keeping the validation master plan updated.

Deadlines should be reasonable and reachable. Once the planning is in place, the plan should be kept updated as the work progresses. Any new systems should be included in the plan. This means that the VMP, especially the inventory and planning elements, is a living document that must be updated regularly. It also means that the planned deadlines might change as the world around the organization changes.

Keeping the validation master plan up-to-date

Every activity related to validation in an organization should be in accordance with what is outlined in the VMP. Activities should be planned according to the VMP's milestones, and progress should be reported to management so that the information required for running the business is always available.

The VMP is primarily a management tool. In order to function effectively, it should be kept updated and should reflect the status of systems at all times. The VMP could be said to have a lifecycle of its own, as illustrated in Figure 12.

Whenever a new system is planned or implemented, it should be included in the VMP. The appropriate validation should be performed according to a validation plan and protocols. After successful validation, the VMP should be updated to reflect the validated status of the system.

Whenever a system is decommissioned, the VMP should be updated accordingly. Whenever changes to systems or processes are implemented, the change control system should ensure that the VMP is updated, as needed, to reflect the change and ensure that the appropriate validation is performed.

Management needs this overview to plan and schedule daily business and validation activities together. The VMP should reflect what is important to the organization, based on the system assessments mentioned earlier. So, if the organization's profile changes, and another product line suddenly becomes the focus area for the business, then the VMP should reflect this change, and systems should be reprioritized accordingly.

Conclusion

Validation activities should be planned according to the regulations, and this should be done on several levels. It is important for management to maintain an overview of the validation activities in order to plan these and other activities in the organization. The VMP provides a tool for management to get this overview and to set the framework within which validation activities should be performed and planned in more detail.

There are many ways of putting together a VMP. The approach this article recommends is an all-inclusive one, in which a single VMP covers all the different types of systems in an organization. Being able to rely on system performance allows management to plan business activities, and applying documented risk management through the VMP turns compliance into a business benefit.

Michael Schousboe works on the Quality Support Validation team at Novo Nordisk A/S, tel. +45 2232 7336, michael@schousboe.com

References

1. European Agency for the Evaluation of Medicinal Products, EudraLex Volume 4—Medicinal Products for Human and Veterinary Use: Good Manufacturing Practice (EMEA, London, UK, Oct. 2003).

2. EMEA, EudraLex Volume 4—Good Manufacturing Practice, Medicinal Products for Human and Veterinary Use: Annex 11—Computerized Systems (EMEA, London, UK, Oct. 2003).

3. EMEA, EudraLex Volume 4—Good Manufacturing Practice, Medicinal Products for Human and Veterinary Use: Annex 15—Qualification and Validation (EMEA, London, UK, Oct. 2003).

4. US Food and Drug Administration, 21 CFR 211—Current Good Manufacturing Practice for Finished Pharmaceuticals (FDA, Rockville, MD, Apr. 2005).

5. FDA, 21 CFR 820.3—Quality System Regulation, Definitions (FDA, Rockville, MD, Apr. 2005).

6. FDA, 21 CFR Part 11—Electronic Records and Electronic Signatures (FDA, Rockville, MD, Apr. 2005).

7. FDA, CPG7132c.08—Process Validation Requirements for Drug Products and Active Pharmaceutical Ingredients Subject to Pre-Market Approval (FDA, Rockville, MD, Mar. 2004).

8. FDA, Guidance for Industry. Part 11, Electronic Records; Electronic Signatures—Scope and Application (FDA, Rockville, MD, Aug. 2003).

9. FDA, Guideline on General Principles of Process Validation (FDA, Rockville, MD, May 1987).

10. A. Gamal, "Validation Master Planning: A Practical Guide for Development," J. Valid. Technol. 5 (2), 118–121 (1999).

11. GAMP Good Automated Manufacturing Practice: Guide for Validation of Automated Systems in Pharmaceutical Manufacture, Version 3 (March 1998); Version 4 (December 2001).

12. R. W. Koops, "Process Validation of Synthetic Chemical Processes for the Production of Active Pharmaceutical Ingredients (APIs)," J. Valid. Technol. 8 (2), 102–114 (2002).

13. M. Kropp, "Computer and Software Validation Lifecycle Planning," J. Valid. Technol. 9 (4), 298–311 (2003).

14. D. Morusca and M. Cupryk, "A Compliant Distributed Control System—A Framework to Manage Documentation Expectations," J. Valid. Technol. 9 (4), 319–337 (2003).

15. B. Mullendore, Technical Guide: Computer Validation Master Planning (Institute of Validation Technology, Royal Palm Beach, FL, 1999).

16. Pharmaceutical Inspection Convention and the Pharmaceutical Inspection Cooperation Scheme (PIC/S), Good Practices for Computerized Systems in Regulated "GXP" Environments (PIC/S, Geneva, Switzerland, July 1, 2004), www.picscheme.org/index.htm, > Publications > Recommendations (accessed Oct. 19, 2005).

17. PIC/S, Recommendations on Validation Master Plan, Installation and Operational Qualification, Non-Sterile Process Validation, Cleaning Validation (PIC/S, Geneva, Switzerland, July 1, 2004), www.picscheme.org/index.htm, > Publications > Recommendations (accessed Oct. 19, 2005).

18. D.S. Tracy and R.A. Nash, "A Validation Approach for Laboratory Information Management Systems," J. Valid. Technol. 9 (1), 6–14 (2002).