IT Infrastructure Qualification and System Validation: IT Vendor Perspectives

Published in: Pharmaceutical Technology, January 2, 2007, Volume 31, Issue 1

During the past decade, the pharmaceutical industry has increased its use of information technology (IT) in research and development, production, and commercialization of pharmaceutical products. IT systems must be operated and maintained within a compliance-oriented framework to minimize risks; maximize safety, security, and the integrity, accuracy, and reliability of information; and maintain product quality. For IT vendors and service providers, meeting requirements for qualification and validation calls for substantial investments in capability, expertise, and resources. The author discusses the implications, challenges, and solutions in managing IT infrastructure qualification and validation in an FDA-regulated environment, particularly at vendor sites.

Pharmaceutical companies are leveraging information technology (IT) to reduce the cost and time to discover, manufacture, and market their products. The industry faces significant pressures to bring low-cost, high-quality products and therapies to market despite rising development costs, including rising costs to comply with complex regulations. Remediation of IT systems as a result of nonconformances to quality and compliance requirements has significantly contributed to this cost escalation.

Outsourcing IT-related work across the value chain is proving to be an effective strategy for reducing costs. Increasing regulatory pressures are driving companies to focus more on quality and compliance, which are becoming the key business drivers for necessary investments in the right technologies. Companies are investing more in IT to improve operational efficiency, enable business innovation and transformation, and at the same time reduce compliance costs and effort. Improving efficiency and productivity thus has become a primary driver for outsourcing and offshoring IT work to vendors and external service providers.

As a result of investments in new tools and technologies, pharmaceutical companies are bound to see the cost of meeting regulatory compliance requirements increase. Emerging trends suggest strong investments in bioinformatics, clinical trial management systems, application development and maintenance, upgrades from legacy to modern systems, electronic regulatory submissions, sales force automation, content and document management, and scientific and technical communications. Validation and qualification thus become critical to quality requirements. Regulatory compliance then becomes a top management priority, and pharmaceutical companies are increasingly aligning their IT strategy with regulatory compliance needs.

Regulatory compliance: industry perspectives

To meet the challenging demands in IT, pharmaceutical companies are engaging IT vendors and service providers to benefit from offshore services and end-to-end solutions. For example, Six Sigma improves productivity, digitized processes help reduce cycle time, and outsourced network and infrastructure management and application maintenance can boost efficiency and enable business innovation. With safety and quality as the prime imperatives, companies are required to adhere to various regulations that govern the manner in which IT is used in product development and manufacturing. Good clinical practices, good laboratory practices, and current good manufacturing practices (collectively known as GxP) such as Computerized Systems Used in Clinical Trials, General Principles of Software Validation, the 21 CFR Part 11 rule on electronic records and signatures, Computerized Drug Processing, and CGMP Applicability to Hardware and Software are legally binding (1–4). In addition, the International Conference on Harmonization (ICH) combines the regulatory authorities of the United States, Europe, and Japan in efforts to achieve greater harmonization in the interpretation and application of technical guidelines and requirements for product registration.

Positive strategies include devising best practices to avoid noncompliance in IT systems, standardization, streamlined business processes, and a risk-based approach toward validation of information systems. Mature pharmaceutical companies view regulatory compliance as not only a legal requirement but also an ethical requirement and a part of good business practice. A culture of quality and continued compliance essentially ensures that risks are minimized and quality is not compromised, which in turn yields lasting business value and builds investor loyalty and faith.

Given the complex regulatory pressures, pharmaceutical companies are expected to spend more on regulatory compliance needs in the future. Chief information officers and chief financial officers are being asked to demonstrate business value to justify IT investments. It is difficult to measure the return on investment from regulatory compliance activities because it is almost impossible to prove the consequences of noncompliance when they have been successfully averted. Nonetheless, one of the major challenges for the industry is to adopt cost-effective processes and methodologies to achieve and sustain compliance.

Challenges for IT vendors

The pharmaceutical industry has started to accelerate the outsourcing and offshoring of IT work such as application development and maintenance to vendor companies. By outsourcing, companies expect to reap substantial cost savings in business areas such as IT-enabled drug discovery; clinical data management; IT infrastructure support, application development, and management; and scientific communications. Besides benefiting from cost savings as a result of so-called labor arbitrage, such companies expect their resources to focus more on innovation-led core business operations by having IT vendors look after and worry about IT-related needs.

Pharmaceutical companies expect IT vendors to help them achieve their regulatory compliance–related goals at low costs. An ability to deliver cost-effective and efficient IT services that meet regulatory-compliance requirements is the key for vendors and external service providers to win IT outsourcing contracts from companies.

IT vendors must view regulatory compliance as a critical lifeline in successful execution, management, and delivery of IT and business solutions and services to pharmaceutical companies. Of late, pharmaceutical companies have started evaluating and selecting IT vendors on the basis of their credentials in compliance services, which include focus, proven capability, and experience in compliance areas such as managing validation, qualification, and data privacy. They consider such elements as critical differentiators when awarding IT outsourcing contracts to vendors.

IT vendors and service providers also are being exposed to regulatory compliance pressures. FDA expects IT vendors that provide services to pharmaceutical companies from vendor sites to comply with regulations and prepare for future FDA audits. The intent is to increase the monitoring of project operations at vendor sites. Therefore, compliance is a critical requirement for vendor and service provider sites.

Vendors that develop and maintain IT systems for regulated companies from their sites must be aware of their responsibilities as prescribed in FDA's Compliance Policy Guide (CPG) on Vendor Responsibility, which makes vendors "liable, under the Food, Drug and Cosmetic (FD&C) Act, for any violation attributable to intrinsic defects in the hardware and software" (1). According to the guide, "Vendors may incur liability for validation, as well as hardware/software maintenance performed on behalf of users."

IT infrastructure qualification and system validation are among the most critical requirements in regulatory IT compliance. One of the most important challenges for a vendor is to manage its IT infrastructure and systems in a qualified and validated state to meet necessary regulatory requirements for software development and maintenance at vendor sites.

Understanding applicable FDA regulations

FDA regulations, as published in documents such as Computerized Systems Used in Clinical Trials or the General Principles of Software Validation, dwell largely on software applications and do not directly mention IT infrastructure.

Title 21 of the Code of Federal Regulations (CFR) Part 11 (Electronic Records and Electronic Signatures) is an umbrella regulation covering all predicate rules for good clinical practices, current good manufacturing practices, and good laboratory practices. Part 11 mentions computerized systems as well as software applications such that any computerized system in its entirety is subject to the regulation. FDA defines a computer system as: "a functional unit consisting of one or more computers and associated peripheral input and output devices, and associated software, that uses common storage for all or part of a program and also for all or part of the data necessary for the execution of the program; executes user-written or user-designated programs; performs user-designated data manipulation, including arithmetic operations and logic operations; and that can execute programs that modify themselves during their execution. A computer system may be a stand-alone unit or may consist of several interconnected units."

A computerized system is defined as a unit that includes hardware, software, peripheral devices, personnel, and documentation such as manuals and standard operating procedures (SOPs). It is based on an infrastructure made up of data centers, servers, workstations, routers, switches, firewalls, applications, and protocols.

The 21 CFR Part 11 rule suggests that "any decision to validate computerized systems, and the extent of the validation, takes into account the impact the systems have on its ability to meet predicate rule requirements." The effect these systems may have on the accuracy, reliability, integrity, availability, and authenticity of required electronic records and signatures must be considered. Further, the rule states that "even if there is no predicate rule requirement to validate a system, in some instances it may still be important to validate the system." Although this rule does not seek to establish "legally enforceable responsibilities," the industry witnessed cases in 2005 in which FDA issued warnings to regulated firms citing noncompliance with electronic records and electronic signatures (ER/ES) requirements, specifically pointing out a lack of validation in a computer system as a grave risk.

The entire IT infrastructure is therefore influenced by 21 CFR Part 11 and falls within the ambit of regulatory scrutiny. Qualification of IT infrastructure thus becomes essential to the validation of computerized systems. IT infrastructure houses and sustains validated systems; the purpose of infrastructure qualification is to safeguard reliability, security, and business continuity. IT infrastructure, if not maintained in a demonstrable state of control and qualification, may affect the validated status of GxP applications or electronic record systems that depend on the infrastructure.


Understanding IT infrastructure qualification

When outsourcing work to IT vendors, one of the foremost points that pharmaceutical companies look for is whether the vendor company has facilities that are capable of meeting FDA requirements for qualification and validation. Companies expect that infrastructure, third-party, or proprietary tools used in project execution are qualified and validated to meet prescribed FDA guidelines. And vendors are asked to demonstrate evidence that they can develop, operate, and maintain IT infrastructure and systems in a validated and qualified state.

In a GxP environment, qualification of IT infrastructure and validation of applications are required. Qualification ensures that desktops, platforms, servers, networks, routers, switches, and so forth are maintained with controlled and repeatable processes. In many cases, qualification extends beyond IT infrastructure such as networks to include processes, procedures, day-to-day operations, and personnel. Nonetheless, the level of effort required to qualify IT infrastructure should be proportional to the complexity and value of the information assets it supports and the risk it poses.

FDA considers software validation to be "confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled" (5).

In a GxP scenario, validation is the process of demonstrating that any "entity" used in the process of drug development is suitable for its intended use and performs the functions that are expected of the entity. Computer system validation provides documented evidence substantiating the performance of the entity in question.

For drug-regulated systems such as software that generates toxicology study reports, validation is mandated, and IT infrastructure that supports such a software application must be qualified. Companies make a business decision to validate systems based on the risks attached to those systems: risks that can affect product quality, data integrity and reliability, information security, and other critical elements in the drug development business process. Many companies validate all systems and infrastructure in the belief that doing so is good business practice and the right thing to do; it generates goodwill in the industry, builds confidence and faith with regulatory agencies and inspectors, and fosters a culture of quality. This approach can become a benchmark in and of itself, set high standards in the industry, and become a best practice that yields long-term tangible benefits for the company and its customers.

Validation applies to applications and software, whereas qualification applies to IT infrastructure such as platforms, operating systems, networks, and so forth. The primary goal of qualification is to ensure utmost quality in the performance and operations of a system. Qualification processes are required to maintain qualified infrastructure, which will accommodate the operation of validated computer systems. The qualification stages for IT systems and infrastructure, as defined by FDA, are:

  • installation qualification; that is, a system is installed and configured to specifications. This is the process of establishing documentary evidence that a respective system is installed and configured to the manufacturer's specifications.

  • operational qualification; that is, a system performs according to specifications and requirements (intended use). This is the process of establishing documentary evidence that a respective system functions and operates according to a provider's (manufacturer, developer, or vendor) design intentions and throughout its intended operating range.
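The installation qualification step described above lends itself to automation. The following is a minimal, illustrative sketch (not an FDA-prescribed tool) of an IQ check that compares the components found on a system against an approved installation specification and records a pass/fail verdict for the IQ report; all component names and versions are hypothetical.

```python
"""Illustrative installation-qualification (IQ) check.

Compares installed components against the approved installation
specification. Component names and versions are hypothetical.
"""

# Approved specification: component -> required version.
APPROVED_SPEC = {
    "os": "RHEL 4.0",
    "database": "Oracle 10g",
    "backup-agent": "2.1",
}

def run_iq_check(installed: dict) -> dict:
    """Return a verdict per component plus an overall result."""
    results = {}
    for component, required in APPROVED_SPEC.items():
        found = installed.get(component)
        if found is None:
            results[component] = "FAIL: not installed"
        elif found != required:
            results[component] = f"FAIL: found {found}, expected {required}"
        else:
            results[component] = "PASS"
    # Overall verdict: every component must pass.
    results["overall"] = (
        "PASS" if all(v == "PASS" for v in results.values()) else "FAIL"
    )
    return results

if __name__ == "__main__":
    # Example: one component deviates from the approved specification.
    verdict = run_iq_check(
        {"os": "RHEL 4.0", "database": "Oracle 9i", "backup-agent": "2.1"}
    )
    for item, status in verdict.items():
        print(f"{item}: {status}")
```

In practice, the output of such a check would be printed, signed, and filed as an installation verification record, with any deviation routed through change control.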

Controlling processes and systems is the key element in qualification and validation. Demonstrating evidence of such control in the form of current and approved documentation that is strictly followed is the key to success in meeting evaluation and success criteria.

Nonconformance may threaten data integrity. Compromising the integrity and security of clinical, health, and drug-related data that exists on systems could lead to bad decision-making, which in turn could pose a serious risk to consumer and patient safety. Moreover, a lack of infrastructure qualification may lead to unreliable data, data corruption, missing system controls, no control over risk factors, an inability to rely on results, and disqualification and rejection of a system's regulated data during an inspection.

The sidebar "Excerpts from FDA Warning Letters" provides some insight into the areas that were closely examined during site audits in 2004.

Qualification: vendors must plan and invest

Blind adherence to mere checklists, misinterpretation of regulatory requirements, and poor and inadequate documentation can lead to a failure to meet compliance objectives effectively.

The following are a few questions that prospective clients generally ask vendors in requests for information or while on due diligence visits:

  • Does your company have a defined infrastructure qualification and system validation policy? Can you develop and maintain our systems to meet validation requirements?

  • Is the infrastructure (data centers, servers, routers, switches, operating systems, databases) at the vendor site qualified?

  • Do you have SOPs for predicate rules, data privacy, change management, physical and logical security, and business continuity planning?

  • Do you have compliance audit mechanisms?

IT services companies that are ISO- or CMM Level 5–certified have well-defined and documented IT processes and procedures for software development and infrastructure management and maintenance. Their lack of tailored procedures and guidelines for meeting GxP-critical FDA compliance requirements, however, may pose a problem. Many IT vendors do not have defined processes for computer system validation and infrastructure qualification. There may be inadequacies in computer system validation and IT infrastructure qualification documentation, and certain specific processes required for validation and qualification may not be well enforced.

Excerpts from FDA Warning Letters

During prospect engagement discussions, vendors should be ready to answer questions about qualification and validation. They must recognize the importance of investing in qualified IT infrastructure and validated systems, which may call for significant business decisions. In the long run, such investments will yield good results for vendors in terms of repeat business from customers and a sustained competitive advantage. They will also help vendors showcase regulatory compliance prowess to pharmaceutical clients. Chief information officers and chief financial officers of vendor companies will have to plan for such investments in IT quality, particularly for drug-regulated industries.

The ideal solution for vendors to meet compliance requirements in a cost-effective way would be to adopt a risk-based approach. This would call for a risk assessment, gap analysis, and remediation plan. The risk-based assessment must map to FDA regulations about software validation and predicate rules on GxP and 21 CFR Part 11.

Risk-based qualification of IT infrastructure

Qualification traditionally has been a time-consuming, document-intensive, and costly process for pharmaceutical companies as well as IT vendors. Of late, the pharmaceutical industry has begun to approach qualification from a risk-based perspective, with the sole objective of cutting non–value-adding processes that do not impact system reliability, quality, or data integrity in any way.

A risk-based approach to qualification and validation processes involves implementing a validation regime based on the risks posed to systems. It also helps in right-sizing the level and effort required for qualification and validation purely on the basis of risk, criticality, and potential business and regulatory impact. Risk-based qualification involves identifying, understanding, evaluating, controlling, and monitoring the risks that IT infrastructure poses to software applications that support the drug development process or a business process supporting product development. Such risks associated with the IT infrastructure could directly or indirectly affect product quality, safety, data, information, business processes, and so forth. The size and complexity of the IT infrastructure and its influence on quality or business processes are important elements in such a risk assessment. Vendor companies should publish qualification policies that identify the methods to be used to qualify IT infrastructure, facilities, and equipment such as data centers, networks, servers, platforms, and desktops.

Risk-based qualification and validation help manufacturers determine the coverage, level, and effort on the basis of elements, including but not limited to:

  • the complexity of the IT infrastructure;

  • the type of the equipment and assets it supports (e.g., custom-built, off the shelf);

  • the effect and risk to applications that are involved in pharmaceutical product development;

  • the impact on business processes;

  • the impact on product quality, data and records integrity, and safety;

  • elements such as data privacy and intellectual property.

The risk-based approach emphasizes the need to focus on critical features and processes that affect product quality. The key is to apply knowledge and experience, basing decisions on the intended use of the infrastructure and computer systems relative to regulatory and business needs. All rationale and justification for decisions must be documented and approved.
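The risk-based assessment described above can be sketched as a simple scoring exercise. The following is an illustrative example only: the factors mirror the list above, but the weights, thresholds, and effort tiers are hypothetical assumptions, not values prescribed by FDA or any industry guide.

```python
"""Illustrative risk-based qualification scoring.

Factors follow the article's list; scores, thresholds, and
effort tiers are hypothetical assumptions for illustration.
"""

# Each factor is scored 1 (low) to 3 (high) by the assessment team.
FACTORS = (
    "infrastructure_complexity",
    "equipment_and_asset_type",
    "gxp_application_impact",
    "business_process_impact",
    "product_quality_and_data_integrity_impact",
    "privacy_and_ip_exposure",
)

def qualification_effort(scores: dict) -> str:
    """Map factor scores to a qualification effort tier."""
    total = sum(scores[f] for f in FACTORS)
    if total >= 14:
        return "full qualification (IQ + OQ, complete documentation set)"
    if total >= 9:
        return "standard qualification (IQ + OQ, reduced documentation)"
    return "minimal controls (documented rationale, periodic review)"

if __name__ == "__main__":
    # A server hosting a GxP clinical-data application scores high.
    clinical_server = {f: 3 for f in FACTORS}
    print(qualification_effort(clinical_server))
    # A payroll server with no regulatory impact scores low.
    payroll_server = {f: 1 for f in FACTORS}
    print(qualification_effort(payroll_server))
```

Whatever the scoring scheme, the article's point stands: the rationale behind each score and the resulting tier must be documented and approved, since the assessment itself becomes part of the qualification record.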

Scenarios on validation and qualification in an outsourcing environment

For example, consider a scenario in which a vendor provides IT services to a pharmaceutical company from its offshore development center. The offshore development center houses a data center, workstations, a network, servers, and so forth. Consider a server that is used to store clinical and health data. This server would be an open system; that is, access is controlled by personnel who do not own the data that the server holds. Because the server holds clinical and health-related data (which may include privacy data as well), it becomes a regulated server, which must be managed and operated according to regulatory guidelines on validation. If the facilities were inspected by an external agency or a regulatory authority such as FDA, the vendor would need to prove that this server is qualified—that is, that it meets its intended use according to the principles of software validation as mandated by FDA and that it is being managed and operated in a controlled environment. A detailed risk assessment accompanied by relevant supporting qualification documentation would be imperative to demonstrate that the server is qualified.

Now consider a scenario in which a vendor company is planning to support applications such as clinical trial management systems, laboratory information management systems, or manufacturing control systems. In these cases, the IT infrastructure that supports these applications must be qualified. In another case, if the systems and applications supported by the IT infrastructure are nonregulated in nature (for instance, a simple payroll application that has no product quality or regulatory impact, has minimal safety implications, and poses no major risks), then the approach to qualification and validation will be entirely different. Such a system may not require validation at all, provided there is a well-defined and approved documented rationale based on risk factors. Equipment, facilities, and processes that pose no risks to product quality will not come under regulatory scrutiny at all and would require very low effort in risk-control methods and in qualification.

Nonetheless, any exceptions and deviations from the risk-based approach must be documented and approved after performing adequate business impact and system criticality analysis. Some important elements to be considered in a typical risk assessment are listed in Table I.

Table I: Risk assessment factors.

At a corporate level, vendors can publish computer-system validation and infrastructure-qualification policies as the governing documents, which will apply to GxP-critical projects. To demonstrate technical, procedural, and administrative controls within vendor infrastructure setup, the documentation for qualification and validation should be complete and comprehensive. Typically, this documentation would include:

  • a documented and approved risk-assessment report detailing what risks the infrastructure (e.g., a server) poses to systems and applications that can directly impact drug product quality and safety and data integrity;

  • a qualification and validation plan or protocol that contains the purpose and scope, roles and responsibilities, coverage, periodic review procedures, maintenance procedures, administration procedures, documentation, and so forth;

  • network requirements, design, and testing documentation;

  • mandatory documentation, including installation SOP, physical and logical security SOP, backup and restoration SOP, disaster recovery SOP, business continuity SOP, change control SOP, and document management SOP;

  • installation qualification and operational qualification documents;

  • installation verification records that prove that components of the IT infrastructure were installed and configured properly and in accordance with specifications and procedures and that there were no deviations from standard installation;

  • operating procedures and instructions for those who run the infrastructure;

  • administration procedures;

  • training SOP or manuals;

  • qualification reports;

  • infrastructure retirement and revalidation procedures;

  • data management, record retention, and archiving procedures;

  • audit trail procedures.

A platform- and operating-system-based approach to qualifying infrastructure ensures a higher level of standardization throughout the life cycle and minimizes overlap in documentation as well.

For nonqualified infrastructure that already supports GxP applications, retrospective qualification is recommended. Retrospective qualification, however, may not always be equivalent to prospective qualification. When qualification is performed retrospectively, the best strategy is to divide it into four phases: system inventory, gap analysis and risk assessment, remediation, and implementation.

Develop a system inventory. This inventory includes networked software and hardware, with information about site locations and a list of the people responsible for developing and maintaining the inventory. Identify applicable information systems that are used to create, modify, retrieve, archive, transmit, and submit data to regulatory authorities. Include documentation of the system inventory.

Conduct gap analysis and risk assessment. Perform gap analysis and risk assessment on the basis of "as is" and "to be" scenarios. Review processes, procedures, and current documentation in accordance with GxP guidelines.

Have a remediation plan. Classify infrastructure systems according to business risks. Identify issues and corrective actions. Perform impact analysis and set the timelines for remediation of infrastructure systems and documentation.

Perform implementation. Implementation covers activities such as revising procedures; establishing necessary security, administrative, and procedural controls; adding audit-trail capability; keeping documentation current; training personnel; and continued monitoring.
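The first two phases above, inventory and gap analysis, can be sketched in code. The following is a minimal, illustrative model, assuming a simple record per system; the fields, system names, and classification rule are hypothetical, and a real inventory would carry far more detail (versions, owners' roles, interfaces, data classifications).

```python
"""Illustrative system inventory and gap analysis for
retrospective qualification. Fields and rules are hypothetical.
"""

from dataclasses import dataclass, field

@dataclass
class SystemRecord:
    """Phase 1: one inventory entry per system."""
    name: str
    site: str
    owner: str
    gxp_impact: bool   # does it support a GxP application?
    qualified: bool    # current qualification status
    gaps: list = field(default_factory=list)

def classify(inventory):
    """Phase 2: flag GxP-impacting systems that lack qualification,
    producing the input to the remediation plan (phase 3)."""
    remediation_queue = []
    for system in inventory:
        if system.gxp_impact and not system.qualified:
            system.gaps.append("retrospective qualification required")
            remediation_queue.append(system)
    return remediation_queue

if __name__ == "__main__":
    inventory = [
        SystemRecord("clinical-db-01", "offshore DC", "IT Ops", True, False),
        SystemRecord("payroll-01", "offshore DC", "HR IT", False, False),
        SystemRecord("lims-01", "offshore DC", "Lab IT", True, True),
    ]
    for system in classify(inventory):
        print(system.name, system.gaps)
```

Only the clinical database lands in the remediation queue: the payroll system has no GxP impact, and the LIMS server is already qualified, which is exactly the prioritization the risk-based approach calls for.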

Qualification must not be a fixed, one-time process. The challenge is to always keep the infrastructure and computer systems in a qualified and validated state. The SOPs that govern the usage of the systems must be kept current, understood by all the stakeholders, and adhered to at all times during the system lifecycle. Periodic reviews of systems and documentation must be carried out to ensure that there are no gaps and that the systems are always in a state of compliance and fit for business use.

Once a qualification package has been approved and documented, it can be referenced in all validation packages of individual software applications. This approach will offer the benefit of not having to redo validation for diverse applications that are supported by this IT infrastructure.

The cost of retrospective qualification is usually five to six times that of prospective qualification, and the average cost of remediating a single computer system can run to thousands of dollars. A risk-based approach can prove to be a cost-effective way to address compliance requirements. It optimizes expenditure and can decrease the average costs of qualification and validation significantly.

Best practices

Best practices will emerge as vendors execute projects out of their offshore development centers or vendor sites and face compliance audits. Regular quality compliance audits will help establish criteria to streamline and standardize qualification processes. Incorrect interpretation of regulatory requirements can impede the correct approach to qualification and validation. Vendors must not lose sight of FDA policies mandated for implementation, as well as frequent updates to regulations and draft guidance documents. IT vendors also should observe trends in the industry and learn how pharmaceutical companies are devising good practices in pharmaceutical engineering. In this way, vendors will understand what currently works and what must be discarded. Good documentation practices, if followed, will assist in implementing best practices across projects.

The frameworks for compliance are readily available to be leveraged, but awareness of the trends, best practices, and approach is inadequate. For most big vendors that are CMM and ISO certified, all the relevant quality processes and systems are in place. After all, FDA regulations such as the General Principles of Software Validation are drawn from established industry standards on software engineering. The only perceptible differences are the scope, perspective, and the increased emphasis and rigor regarding risk assessment, security, testing, traceability, configuration and change management, verification, audit trails, and so forth. Conformance and compliance thus become non-negotiable. Nonetheless, it also is beneficial not to allow innovation to suffer in the name of regulations. Doing what the regulations prescribe is imperative, but being able to continuously innovate and cultivate best practices based on experience and knowledge will help reduce non–value-adding effort.

Vendors should focus on developing competency in working in regulatory compliance frameworks. Awareness of industry and FDA terminology is a must. Vendors must appoint infrastructure qualification and validation consultants to lead and drive the effort in installation qualification and computer system validation. Quality groups must be sensitized to the rigors and trends in quality systems and processes in the pharmaceutical industry.

Setting up validation and qualification services teams will be fruitful. Such teams could include qualification and validation leads, information system personnel, quality control representatives, internal auditors, regulatory compliance experts, and technical writers.

Conclusion

Consequences of noncompliance can be disastrous for any pharmaceutical company. For vendors, noncompliance could impact business value. Noncompliance at vendor sites could lead to a loss of business opportunity. Vendors should be sensitive to regulatory requirements and also be flexible in aligning their information technology and quality processes to that of regulatory expectations.

Vendors must understand the expectations of the industry and regulatory bodies and have provisions within their company policies to tailor, customize, and even refine specific IT processes to address regulatory compliance requirements. Good preparedness for audits and inspections and a thorough understanding of risk-control mechanisms hold the key to success. Vendors and service providers must prepare for the future: regulations will become more complex and strict, and, given the continued vigilance of FDA and other regulatory agencies in monitoring compliance in IT systems, only compliance- and quality-focused vendors will lead. The best way forward is to confidently assess gaps in compliance, remediate systems and documentation, and adopt a risk-based approach to compliance. A continued state of compliance is a reward in itself, and the benefits are immeasurable.

For vendors to gain IT outsourcing contracts from pharmaceutical companies, they must demonstrate control of IT infrastructure and systems and produce evidence in the form of current documentation. Demonstrating evidence of qualification and validation is an important criterion in vendor evaluation and assessment. Continued investment in building compliant frameworks and regimes, in terms of managing IT Infrastructure and systems in a validated way, will help vendors sustain competitive business advantage.

IT vendors can create value for both their clients and themselves by defining a strategy for regulatory compliance, particularly for pharmaceutical IT services, aligning it with their overall IT strategy, understanding what is required, and embracing a culture of continued compliance and quality.

Siddhartha Gigoo is a manager at Tata Consultancy Services Limited (TCS) C-56, Phase 2, Dist. Gautam Budh Nagar, Noida-201305, Uttar Pradesh, India. siddhartha.gigoo@tcs.com or sgigoo@yahoo.com

*To whom all correspondence should be addressed.

Submitted: May 15, 2006. Accepted: Sept. 7, 2006

Keywords: GMP compliance, information technology, qualification, validation

References

1. US Food and Drug Administration, Compliance Policy Guides Manual, Human Drugs, "Computerized Drug Processing; Vendor Responsibility," CPG 7132a.12, Sub Chapter 425, 1985.

2. FDA, Guidance for Industry: Computerized Systems Used in Clinical Trials, 1999.

3. FDA, General Principles of Software Validation; Final Guidance for Industry and FDA Staff, 2002.

4. Code of Federal Regulations, Title 21, Food and Drugs, Part 11, "Electronic Records; Electronic Signatures–Scope and Application."

5. FDA, Glossary of Computerized System and Software Development Terminology, 1995.