The Validation and Implementation of a Chromatography Data System

December 1, 2002
Marian Mutch

Nichola Stevens

Jayne Bradley

Terry Thompson

Pharmaceutical Technology Europe

Volume 14, Issue 12

This article describes the approach used to upgrade a chromatography data system. The upgrade was required to meet the current regulatory requirements for good laboratory and good manufacturing practice. Compliance with 21 CFR Part 11 for electronic records and electronic signatures was also a major consideration. The procedures used for this project followed the software development life cycle (SDLC) and involved the co-ordination of personnel from the software vendor, the user department, information technology and quality assurance.

The Atlas 2000 chromatography data system (CDS), developed and supplied by Thermo LabSystems (Altrincham, UK), was implemented at Covance as an upgrade to the current CDS, Multichrom Version 2.0. Multichrom runs on a VAX minicomputer cluster using VMS as the operating system and, although it has proved a robust and reliable CDS for several years, it was thought necessary to upgrade the software. Multichrom was built using 'old technology' that was not adequate to meet the current and future expectations of Covance, and its validation was considered inadequate to meet modern standards. Also, because of the introduction of the US Food and Drug Administration (FDA) regulation 21 CFR Part 11 for electronic records and electronic signatures in August 1997,1 compliance with this regulation became a priority.

The introduction of a major new system into the company presented logistical challenges - the system needed to be implemented in a compliant manner following software development life cycle (SDLC) principles, whilst ensuring seamless integration into the operational areas to minimize the effect on ongoing projects (Figure 1). To deal with these issues, it was necessary to identify key personnel from across the site, to secure high-level commitment to the entire validation process and to plan a phased, department-by-department implementation.

Phase 1 of the project involved implementation into the Pharmaceutical Analysis Department, which included validation of the major functionality of the application. This phase encompassed the installation qualification (IQ), operational qualification (OQ) and performance qualification (PQ). The documentation for the IQ and OQ was produced by the software supplier in conjunction with the company's own Atlas validation team.

Figure 1: Overview of the software development life cycle.

As Phase 1 of the validation was progressing, the company introduced its own system life cycle, the Covance System Life Cycle (CSLC). Phase 2 of the project involved the roll-out of the software into the remaining five user departments and was achieved using the CSLC templates. The CSLC was an adaptation of the traditional SDLC that met the specific requirements of Covance. Phase 2 of the project also served to capture the validation of additional functionality not addressed in Phase 1. Phase 3, which is ongoing at the time of writing, involves validating the migration of data from Multichrom into Atlas and the formal retirement of the Multichrom system.

Life cycle approach

Validation, implementation and management of computer systems using SDLC principles have been well documented.2-7 This approach has become standard practice for the validation of computerized systems used in the pharmaceutical industry. Covance began to implement this life cycle approach with the validation and implementation of the Analyst software (PE Sciex; Applied Biosystems, Foster City, California, USA),7 using QualifyPlus (Version 3), a generic system of documentation specifically designed for the validation of CDSs.8 The project for the qualification and implementation of Atlas was based on that earlier validation. Figure 2 provides an overview of the overall validation process.

Identifying the key players and planning the project

The implementation project involved personnel from the vendor working alongside personnel from various departments at the implementing company: the user department (Pharmaceutical Analysis), information technology (IT) and quality assurance (QA). The demands of the business meant that it was necessary to have this application installed, validated and running in the live environment to meet tight deadlines, while having minimal impact on the progress of current studies. For these reasons it was necessary to ensure that management and all the key players were fully committed to the project and that resources would be available when required to complete the project on schedule. Careful planning was therefore essential, and detailed project plans were used throughout. The roles and responsibilities of the key personnel are highlighted in Table I.

Figure 2: Validation process overview.


The starting point for any vendor-supplied software project is the assessment of the available systems against the user requirements. This may include a vendor audit, which encompasses IT and regulatory requirements. With the advent of 21 CFR Part 11, such an audit became a prerequisite before a final decision could be made.

Requirements gathering and system evaluation

For the replacement of Multichrom, three systems were evaluated: Atlas (Thermo LabSystems), Millennium32 (Waters Corporation, Milford, Massachusetts, USA) and Chromeleon (Dionex Corporation, Sunnyvale, California, USA). Supplier assessment questionnaires were sent out to each of the vendors as part of this process. The responses to the supplier assessment questionnaires were retained as part of the SDLC documentation.

In conjunction with the responses to the supplier assessment questionnaires, evaluations of each CDS were conducted. Each CDS was made available on-site for a week, giving users hands-on experience of the application. Each application was then objectively assessed by the users, who completed a checklist allocating scores for the following criteria:

  • general appearance

  • ease of use

  • amount of training required

  • ease of implementation

  • instrument control

  • handling of mass spectrometry data

  • electronic record and signature implementation (21 CFR Part 11)

  • connection to a laboratory information management system (LIMS)

  • exporting data

  • compatibility of data produced with Multichrom data.
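A checklist evaluation of this kind is usually tallied by weighting each criterion and summing the scores per vendor. The short sketch below illustrates the idea; the weights, scores and vendor names are invented for demonstration and are not the figures from the actual evaluation.

```python
# Illustrative tally of a vendor-evaluation checklist.
# Weights and scores are hypothetical, not the actual evaluation data.

CRITERIA_WEIGHTS = {
    "general appearance": 1,
    "ease of use": 2,
    "instrument control": 3,
    "21 CFR Part 11 support": 3,
    "compatibility with Multichrom data": 3,
}

def weighted_total(scores):
    """Sum each criterion score (1-5) multiplied by its weight."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

vendor_scores = {
    "Atlas":    {"general appearance": 4, "ease of use": 3,
                 "instrument control": 4, "21 CFR Part 11 support": 4,
                 "compatibility with Multichrom data": 5},
    "System B": {"general appearance": 4, "ease of use": 4,
                 "instrument control": 4, "21 CFR Part 11 support": 4,
                 "compatibility with Multichrom data": 2},
}

for vendor, scores in vendor_scores.items():
    print(vendor, weighted_total(scores))
```

Weighting business-critical criteria (such as data compatibility) more heavily is one way to make the checklist reflect the business priorities described below, rather than user preference alone.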

The system user evaluations did not identify a clear 'winner', so the final decision was based on business criteria, including compatibility with existing hardware and minimum potential disruption to the operating department during roll-out. On these criteria, Atlas became the CDS of choice: the cost of implementation was low because the existing Multichrom infrastructure hardware could be reused, and Atlas's compatibility with Multichrom data meant that migration would be simpler. Another contributory factor in favour of Atlas was Covance's experience of many years with the vendor's CDS and LIMS products.

User acceptance testing (IQ/OQ)

After the choice and purchase of Atlas as the required CDS, the key members of the validation team received training from Thermo LabSystems in the use of the software. For the user acceptance testing, a development environment was set up with access restricted to the validation team. The development environment was used to assess the functionality of the application and to allow for the preparation of the test logs. The application was then installed into an environment in which IQ/OQ/PQ and test logs were executed.

Table I: The key participants and responsibilities.

An overall validation plan was generated that outlined the activities to be completed during the user acceptance phase. This included an introduction to the project's aims, the validation scope, definitions, a system overview, the roles and responsibilities of the project team, the validation strategy and methodology, and reporting and administrative procedures.

The operating environment comprised a Dell PowerEdge 4400 server with two Pentium III 866 MHz processors, 1 GB of memory and five 18 GB disk drives configured as RAID 5 (striping with parity), with 4 GB of disk space allocated to the system drive and 65 GB to the Atlas system and data. The system is supported by an uninterruptible power supply (UPS 2200) and is located in a secure computer room, with access restricted to authorized personnel.

The operating system is Windows NT Version 4.0 with service pack 6a, and the following software is co-installed: Norton AntiVirus, Internet Explorer 5.5 and NetWare Client Version 4.80. The ARCserve open file agent is installed to enable the backup of open files.

Table II: Tests for functionality.

Installation and operational qualification were performed by Thermo LabSystems personnel, who also produced the IQ and some of the OQ procedures and documentation. The IQ phase provided formal assurance that the Atlas system (including software, hardware and Chromservers, the chromatography data acquisition devices for Atlas) had been installed correctly. The activities undertaken during this phase were fully documented and comprised:

  • documenting the hardware (servers and clients)

  • documenting the Chromservers

  • executing the IQ tests.

The IQ testing of the Atlas server and clients was successfully completed before the OQ and user acceptance testing (PQ) were started.

The OQ phase ensures that the installed system works as specified by the vendor and that sufficient documentary evidence exists to support this. This phase was mainly undertaken by Thermo LabSystems using the Atlas Validation Toolkit, and the activities were fully recorded according to vendor documentation. In addition to the standard OQ tests, Covance ran separate OQ tests to check the capacity of the Chromservers and the ability of Atlas to divert acquisition to another server (the Pick Up Server); this was required because, in the event of a failure of the live Atlas server, data recovery is essential. Capacity testing, the ability to cope with workbooks containing a large number of injections of samples with multiple named analytes, was also performed as part of the OQ.

Table III: Tests for instruments.

All the documentation produced by the vendor during the IQ and OQ phases was reviewed by a user representative and the IT group. It was also fully audited by the QA department.

User acceptance testing (PQ)

The user acceptance testing is concerned with the assurance that the functionalities specified in the user requirements specification (URS) are available and operate as specified. Table II summarizes the test scripts produced for this phase of the validation.

Instrument tests

In addition to the above functionality testing, qualification test logs were produced and executed for the individual instruments that would be operated in conjunction with Atlas. The instrument qualification was executed using a holistic approach. The test logs were designed to test that communication between the instruments, the Chromservers and Atlas could be established and maintained. Tables III and IV summarize the tests performed.
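A test log of this kind essentially records, for each instrument and test step, the expected and observed outcome together with the tester and the date of execution. The sketch below shows a minimal record of that shape; the field names and example values are illustrative, not those of the actual Covance logs.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TestLogEntry:
    """One executed step of an instrument qualification test log.
    Field names and values are illustrative, not from the actual logs."""
    instrument: str
    test: str
    expected: str
    observed: str
    tester: str
    executed_on: date

    @property
    def passed(self):
        # A step passes when the observed result matches the expectation.
        return self.observed == self.expected

entry = TestLogEntry(
    instrument="HPLC pump A",
    test="Chromserver communication established and maintained",
    expected="signal acquired",
    observed="signal acquired",
    tester="MM",
    executed_on=date(2002, 3, 14),
)
print(entry.passed)
```

Recording expected and observed results side by side, rather than a bare pass/fail tick, is what allows a reviewer or QA auditor to verify each step independently.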

Table IV: Individual instruments tested for use with the CDS.


Upon successful completion of the validation, release certificates were produced. The results of the validation effort at Covance, including the IQ/OQ results, were summarized in the Phase 1 summary report, which concluded that the application met the user requirements and was therefore fit for its intended purpose. Atlas was subsequently implemented into the Pharmaceutical Analysis Department. This implementation involved the following steps:

  • installation of the Atlas client on each user PC, with IQ testing and full documentation of this step

  • setting up the individual user accounts

  • production of the relevant standard operating procedures (SOPs)

  • training of the users.

Validation documentation

A full suite of validation documentation was generated by this project, prepared by the user representatives, IT or the vendor. The documents were reviewed by QA to ensure that regulatory and procedural requirements had been met. The complete SDLC validation package comprised:

  • a URS

  • acquisition documentation: quotation, capital justification and special purchase requisition

  • product documentation as deemed necessary

  • vendor release documentation

  • a completed supplier assessment questionnaire

  • a validation plan

  • IQ documentation

  • OQ documentation

  • additional OQ test scripts

  • PQ documentation

  • a test plan (user acceptance plan)

  • acceptance test logs

  • a URS traceability matrix

  • a validation summary report

  • user manuals

  • training material

  • SOPs.
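Within such a package, the URS traceability matrix simply maps each user requirement to the acceptance test script(s) that verify it, so that any untested requirement is easy to spot. A minimal sketch of the idea follows; the requirement IDs and script names are invented for illustration.

```python
# Map each URS requirement to the acceptance test scripts that cover it.
# Requirement IDs and script names are hypothetical.
traceability = {
    "URS-001 electronic signatures": ["PQ-07"],
    "URS-002 audit trail":           ["PQ-08", "PQ-09"],
    "URS-003 LIMS export":           [],        # not yet covered
}

# Any requirement with no associated test script is a validation gap.
uncovered = [req for req, scripts in traceability.items() if not scripts]
print(uncovered)  # ['URS-003 LIMS export']
```

Maintaining the matrix in both directions (requirement to script, and script to requirement) also makes it straightforward to assess the impact of a change to either side during later phases.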


Conclusion

This article illustrates the procedures involved in the successful validation and implementation of a chromatography data system as a replacement for a legacy system. The project was run according to SDLC principles and the processes were documented at each stage. Resources from various departments across the company were required and, given the tight deadlines imposed, careful planning and teamwork were essential from the start to achieve a successful outcome.

The phased approach adopted for this validation meant that the system hardware could be fully qualified and the software validated for the functionalities required by the Pharmaceutical Analysis Department. In practice, these included the major components of the application, allowing the department-specific functions to be validated in Phase 2. The phased approach also meant that the implementation into the first user department could be achieved on schedule and with minimum disruption to project timelines, which made the subsequent deployment into the rest of the site a shorter and more manageable project.


Acknowledgements

The authors would like to thank the many users involved in the validation and implementation of the CDS and Thermo LabSystems' Pathfinder services group for the installation and operational qualification documentation. The authors would also like to thank Bob McDowall of R D McDowall Consultancy Services for the QualifyPlus documentation, on which part of this project was based.


References

1. "21 CFR Part 11, Electronic Records; Electronic Signatures; Final Rule," Federal Register 62(54), 13429-13466 (1997).

2. "Technical Report No. 18, Validation of Computer-Related Systems," PDA J. Pharm. Sci. Tech. 49(S1), S1-S17 (1995).

3. "Technical Report No. 33, Validation and Qualification of Computerized Laboratory Data Acquisition Systems," PDA J. Pharm. Sci. Tech. 53(4), 1-12 (1999).

4. R.D. McDowall, "Chromatography Data Systems III: Prospective Validation of a CDS," LC-GC Europe 12(9), 568-576 (1999).

5. R. Chamberlain, Computer Systems Validation for the Pharmaceutical and Medical Device Industries (Alaren Press, Inc., Libertyville, Illinois, USA, 1991).

6. "FIPS PUB 101, Guide for Lifecycle Validation, Verification, and Testing of Computer Software" (June, 1983).

7. T.D. Thompson et al., "The Prospective Validation of an MS Data System Used for Quantitative GLP Studies," LC-GC Europe 14(11), 687-694 (2001).

8. QualifyPlus Generic Chromatography Data System Validation Package, Version 3 (McDowall Consulting, Bromley, UK).