Siegfried Schmitt, PhD, is Vice President Technical at Parexel International, Siegfried.Schmitt@parexel.com.
An assessment can identify the critical systems and the gaps in compliance based on intended use, says Siegfried Schmitt, vice president technical, Parexel.
Q. We recently performed a detailed assessment of our computer systems for compliance with the applicable regulations, especially 21 Code of Federal Regulations Part 11 (1). The aim was to remedy critical systems first, if any gaps were found. However, almost all systems ended up in the critical category and many of these have compliance gaps. We do not have the resources to address all of these now, so how can we prioritize?
A. Your approach of performing a review and applying a risk assessment (RA) is commendable and correct. As you did not provide details of the criteria in your RA or of how you weight them, my answer has to be generic; this is, however, a common issue when it comes to computer systems.
The principles of computer systems validation (CSV) are not dissimilar to those of process validation. To validate a process, you need qualified equipment; similarly, for CSV, you need a qualified infrastructure. Once this is in place, you can validate the software applications for their intended business purpose. This allows you to focus your compliance assessment on the software rather than the infrastructure.
The next step is to perform a straightforward triage (see Figure 1).
Next comes the risk assessment, and in order to do this logically and in a compliant manner, we should follow regulatory guidance. FDA guidance states, “The extent of validation studies should be commensurate with the risk posed by the automated system” (2). Guidance from the Medicines and Healthcare products Regulatory Agency (MHRA) in the United Kingdom states, “Validation effort increases with complexity and risk (determined by software functionality, configuration, the opportunity for user intervention and data lifecycle considerations)” (3). The key word here is complexity. When you look through your inventory of computerized systems, you should be able to at least distinguish simple systems from complex ones, such as a Karl Fischer titrator from a chromatography data management system.
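The complexity-and-risk principle above can be sketched as a simple ordering rule. This is an illustrative sketch only: the guidance documents do not prescribe any scoring model, and the system names and two-factor score below are hypothetical assumptions.

```python
# Hypothetical sketch: rank an inventory so that complex, GxP-critical
# systems receive validation attention first, per the MHRA principle that
# "validation effort increases with complexity and risk".
from dataclasses import dataclass

@dataclass
class System:
    name: str
    complexity: int    # assumed scale: 1 = simple (titrator), 3 = complex (CDS)
    gxp_critical: bool # does the system impact product quality or data integrity?

def validation_effort_rank(systems):
    """Order systems by GxP criticality first, then by complexity."""
    return sorted(systems, key=lambda s: (s.gxp_critical, s.complexity), reverse=True)

inventory = [
    System("Karl Fischer titrator", complexity=1, gxp_critical=True),
    System("Chromatography data system", complexity=3, gxp_critical=True),
    System("Canteen menu app", complexity=1, gxp_critical=False),
]

for s in validation_effort_rank(inventory):
    print(s.name)
```

A real RA would, of course, use the firm's documented criteria and weightings; the point of the sketch is only that the ordering logic should be explicit and reproducible.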
The regulations demand that your validation is for the intended use of the computerized system. The intended use should be documented or referenced in the system inventory. All too often, companies fail to document the intended (i.e., actual) use of the system. For example, an electronic quality management system may have many modules, but you may only be using the documentation management and the training module. When you assess your system for compliance, make sure that you assess these modules specifically, instead of a whole range of installed, yet not used, modules.
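Scoping the assessment to what is actually used can be expressed as a simple set intersection. The module names below are hypothetical; the point is that the documented intended use, not the installation record, defines the assessment scope.

```python
# Hypothetical sketch: assess only the modules covered by the documented
# intended use, not everything that happens to be installed.
installed = {"documents", "training", "deviations", "audit", "supplier"}
intended_use = {"documents", "training"}  # documented, actual use

modules_to_assess = installed & intended_use
print(sorted(modules_to_assess))
```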
Now you should have been able to identify the critical systems and the gaps in compliance based on intended use. It is generally possible to categorize the remediation effort for eliminating these gaps into short-, medium-, and long-term activities. A short-term solution may be to stop using a particular instrument; a medium-term solution may be an upgrade of the software or a revalidation exercise; and a long-term measure could be the replacement of a piece of automated equipment.
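The short-/medium-/long-term categorization can be captured as a simple grouping of identified gaps by remediation horizon. This is a sketch under assumptions: the system names and the action-to-horizon mapping below merely mirror the examples in the paragraph above and are not a regulatory requirement.

```python
# Hypothetical sketch: bucket remediation actions by the horizons the
# column describes (stop use = short-term; upgrade or revalidate =
# medium-term; replace equipment = long-term).
REMEDIATION_TERMS = {
    "stop using instrument": "short-term",
    "software upgrade": "medium-term",
    "revalidation": "medium-term",
    "replace equipment": "long-term",
}

def plan(gaps):
    """Group (system, action) pairs by remediation horizon."""
    buckets = {"short-term": [], "medium-term": [], "long-term": []}
    for system, action in gaps:
        buckets[REMEDIATION_TERMS[action]].append((system, action))
    return buckets

gaps = [
    ("Balance #3", "stop using instrument"),
    ("CDS", "software upgrade"),
    ("Legacy autoclave controller", "replace equipment"),
]
print(plan(gaps))
```

Such a plan makes the prioritization visible to management and auditors, which is where resource-constrained remediation programs usually need the most support.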
Coming back to your question, you may want to review your approach to assessing your systems and verify that you really understand each system’s intended use, based on the above recommendations. This should leave you with a much more manageable task at hand.
1. CFR Title 21, Part 11, Electronic Records; Electronic Signatures.
2. FDA, Data Integrity and Compliance with Drug CGMP, Questions and Answers, Guidance for Industry (CDER, CBER, December 2018).
3. MHRA, ‘GXP’ Data Integrity Guidance and Definitions (MHRA, March 2018).
Volume 44, Number 9
When referring to this article, please cite it as S. Schmitt, "Computer Systems Validation–a (Un-)Manageable Task?," Pharmaceutical Technology 44 (9) 2020.