Implementing Data Integrity Compliance in a GLP Test Facility

Consider how to apply ALCOA+ to a building management system in non-clinical laboratories.

In a laboratory that uses a web-based building management system to run equipment, critical records from this system must comply with good laboratory practices (GLPs) with respect to data integrity.

GLPs and data integrity

GLPs, as described by the Organisation for Economic Co-operation and Development (OECD), “promote the quality and validity of test data used for determining the safety of chemicals and chemical products” (1) and promote the mutual acceptance of these data between the OECD member states. GLP principles “are required to be followed by test facilities carrying out studies to be submitted to national authorities for the purposes of assessment of chemicals and other uses relating to the protection of man and the environment” (1).

The GLP principles should be applied to the management and handling of test items investigated in the context of preclinical safety studies. Because GLP covers processes leading to the release of data, data integrity principles must be followed, as is discussed in OECD Series #1 (1).

OECD Series #1 was originally written for data generated and recorded mainly on paper or magnetic media; as a result, its definition of data (“raw data”) was general and stated only in principle. The growth of data sets generated and recorded by computerized systems and automated solutions, however, created the need for new definitions and new guidance on the management of data for topics such as computerized system validation, e-archives, electronic records, and electronic signatures. These topics were addressed in US 21 Code of Federal Regulations (CFR) Part 11 (2), OECD GLP Series #15 (3), and OECD GLP Series #17 (4), which have become reference documents for the pharmaceutical industry regarding data integrity (see Figure 1 for a timeline).

OECD GLP Series #17 provides the current definition of data (raw data): “a measurable or descriptive attribute of a physical entity, process or event. The GLP principles define raw data as all laboratory records and documentation, including data directly entered into a computer through an automatic instrument interface, which are the results of primary observations and activities in a study and which are necessary for the reconstruction and evaluation of the report of that study” (4).

OECD is working on a draft guidance document on data integrity to make data integrity concepts more explicit and to clarify issues raised by the earlier monographs (e.g., #15 and #17). This new guidance aims “to promote a risk-based approach to the management of data, which includes data risk, criticality, and lifecycle” (5), based on the key concepts contained in the ALCOA+ acronym, according to which data in GLP studies should be:

  • Attributable (to the person and system generating the data)
  • Legible and permanent
  • Contemporaneous
  • Original record (or certified true copy)
  • Accurate
  • Plus (+): complete, consistent, enduring, and available.

Case study: building management system

Building management systems (BMSs) are automation systems for controlling and monitoring buildings, premises, and facilities, covering fire detection, anti-intrusion, access control, ventilation, equipment (e.g., freezer, refrigerator, and incubator probes), and environmental conditions (e.g., temperature, humidity, pressure, and air changes). A BMS can be considered to cover the first three levels of the ISA-95 automation pyramid (6). Level 0 is the field level (e.g., sensors, probes, actuators): the BMS captures signals from the field and sends them to the control level, or it actuates signals coming from the control level. Level 1, the control level, contains programmable logic controllers (PLCs), each formed by a central unit, input/output units, and a programmable unit. The PLCs execute the control program, computing outputs (e.g., acting on valves) from the received inputs (e.g., temperature sensors). Level 2 is the supervisory level, with a human-machine interface. The supervisory level is based on a client-server model: a main server collects data from the PLCs and stores them in a database, while a web server hosts the BMS web application that connects to the database and satisfies client requests. Users access the system from several devices (e.g., office laptops, tablets, and dedicated desktop workstations located in the labs) using a web browser; they can change settings on the PLCs (e.g., setpoints) and monitor data coming from field devices. These levels communicate with each other using predefined protocols.
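
To make the supervisory level more concrete, the following is a minimal sketch, in Python, of how a supervisory-level service might poll values from the control level and store them as timestamped records in a database. Every name here (the read_plc_value stand-in, the table layout, the tag names) is a hypothetical illustration rather than part of any specific BMS product; a real installation would use the site's qualified protocol drivers (e.g., BACnet, Modbus, or OPC UA) and its validated database.

```python
import random  # used only to simulate field values in this sketch
import sqlite3
import time
from datetime import datetime, timezone


def read_plc_value(tag: str) -> float:
    # Stand-in for the real PLC protocol driver (e.g., a BACnet, Modbus,
    # or OPC UA client); here it simply returns a simulated temperature.
    return 37.0 + random.uniform(-0.5, 0.5)


def store_reading(conn: sqlite3.Connection, tag: str, value: float) -> None:
    # Store the value with a UTC timestamp so the reading stays
    # attributable to a specific equipment tag and a specific instant.
    conn.execute(
        "INSERT INTO readings (tag, value, recorded_at_utc) VALUES (?, ?, ?)",
        (tag, value, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()


def main() -> None:
    conn = sqlite3.connect("bms_history.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings ("
        " id INTEGER PRIMARY KEY,"
        " tag TEXT NOT NULL,"
        " value REAL NOT NULL,"
        " recorded_at_utc TEXT NOT NULL)"
    )
    while True:
        for tag in ("incubator_01_temp", "freezer_03_temp"):
            store_reading(conn, tag, read_plc_value(tag))
        time.sleep(60)  # poll once per minute


if __name__ == "__main__":
    main()
```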

The major advantages of using a web-based application, illustrated in Figure 2, are:

  • Minimal information technology (IT) department maintenance effort. IT only has to ensure compatibility between the web browser and the BMS web application; installation and upgrade of software on client computers are avoided.
  • Personnel save time by accessing the BMS from personal computers in the office or from home when connected to the company network. If a computer is stolen or damaged, data are not at risk because they are stored in the BMS database.
  • PLC maintenance is simplified using web-based diagnostics.

Data integrity compliance

The BMS’s functionalities make it suitable for the pharmaceutical industry to monitor the storage and environmental conditions of test items, reference items, test systems, and specimens generated during study conduct, as well as to prevent uncontrolled access to the facilities. The BMS manages critical records, which are the object of health authorities’ study-related inspections.

The data (raw data and metadata) that make up these critical records include:

  • Regulated data automatically captured from qualified instruments (equipment and environmental conditions, and access control)
  • Study personnel manual data entries
  • Data trends (e.g., temperature, carbon dioxide concentration, relative humidity, pressure, air changes)
  • Alarms and events related to critical data
  • Reports
  • Audit trails.

How can these critical records be made compliant with regulatory requirements? To answer this question, it can be helpful to introduce the concept of data governance, described as “the sum total of arrangements to ensure that data are complete, consistent, and accurate throughout their lifecycle” (5).

Data governance control strategies should be implemented to achieve data integrity. According to a World Health Organization draft guideline (7), controls may be:

  • Technical (system functionalities and configurations)
  • Procedural (GLP process-based procedures implementations)
  • Organizational and behavioral (quality and data integrity culture promoted by test facility management).

When implementing the BMS in a lab, the authors used the following best practices to reflect ALCOA+ principles.

Attributable (A). This principle requires that data are attributed to the person or system that generates or modifies data, and that the data are attributable to the study.

The technical implications are that the following should be set up in the BMS:

  • Access control configuration (individual identification and password)
  • Roles and permission configuration according to job title
  • Audit trail configuration: data and actions attributable to a specific individual (see the sketch at the end of this subsection)
  • Audit trail: user management
  • Report configuration: attributable to the person generating it
  • Data trends configuration: trends attributable to specific equipment
  • Filtering records according to the period in which the study was conducted.

BMS management procedures should include:

  • Authorization process flow to use the system approved by test facility management
  • Audit trail (periodic and for critical changes) reviews.
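
As an illustration of the attribution items above, the following is a minimal sketch of what a single audit-trail entry could contain and how it might be appended to a trail that is never overwritten. The field names and the flat-file format are assumptions made for this sketch; a real BMS defines its own audit-trail schema and normally stores it in its database.

```python
import json
from dataclasses import asdict, dataclass
from datetime import datetime, timezone


# Hypothetical audit-trail entry: every change is linked to an identified
# user, a timestamp, the affected object, the old and new values, and a reason.
@dataclass(frozen=True)
class AuditEntry:
    user_id: str        # who performed the action (individual account)
    action: str         # what was done (e.g., a setpoint change)
    target: str         # which equipment or parameter was affected
    old_value: str      # value before the change
    new_value: str      # value after the change
    reason: str         # why the change was made
    timestamp_utc: str  # when the change was recorded


def append_audit_entry(path: str, entry: AuditEntry) -> None:
    # Append-only: existing entries are never modified or deleted.
    with open(path, "a", encoding="utf-8") as trail:
        trail.write(json.dumps(asdict(entry)) + "\n")


append_audit_entry(
    "audit_trail.jsonl",
    AuditEntry(
        user_id="jdoe",
        action="setpoint_change",
        target="incubator_01_temperature_setpoint",
        old_value="37.0",
        new_value="36.5",
        reason="study protocol amendment",
        timestamp_utc=datetime.now(timezone.utc).isoformat(),
    ),
)
```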

Legible and permanent (L). This principle requires that data are readable throughout the data lifecycle.

The technical implications are that the following should be set up in the BMS:

  • System generating human-readable records
  • System generating accurate and complete copies of records in other formats
  • Configuration of automatic report generation
  • Audit trail available and convertible to a human-readable format (see the sketch after this list)
  • Data not overwritable
  • Audit trail tracing old/new values
  • Easily visible critical parameters via graphical configuration.
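
As a sketch of the human-readable conversion mentioned in the list above, the following fragment exports audit-trail entries (stored one JSON object per line, as in the earlier audit-trail sketch) to a CSV file that a reviewer can open directly. File and field names are assumptions carried over from that sketch; the export is a copy for review, and the original trail is left untouched.

```python
import csv
import json

FIELDS = ["timestamp_utc", "user_id", "action", "target",
          "old_value", "new_value", "reason"]


def export_audit_trail(source_path: str, csv_path: str) -> None:
    # Convert the append-only trail into a human-readable CSV copy.
    with open(source_path, encoding="utf-8") as src, \
         open(csv_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.DictWriter(dst, fieldnames=FIELDS)
        writer.writeheader()
        for line in src:
            entry = json.loads(line)
            writer.writerow({field: entry.get(field, "") for field in FIELDS})


export_audit_trail("audit_trail.jsonl", "audit_trail_review.csv")
```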

Contemporaneous (C). The expectation is that data are recorded when the work is performed. 

The technical implications are that the following should be set up in the BMS:

  • System clock (date, time, and time zone) locked against any unauthorized change
  • Clock synchronization with a qualified Network Time Protocol (NTP) server (see the sketch after this list)
  • Audit trail recording correct time stamps
  • Time stamped reports.
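
A minimal sketch of these timekeeping controls is shown below: records are timestamped in UTC, and the server clock's offset from an NTP source is checked against an acceptance limit. The server name, the one-second limit, and the use of the third-party ntplib package are assumptions made for illustration; the qualified NTP server, the acceptance criterion, and the alarm handling are site-specific.

```python
from datetime import datetime, timezone

import ntplib  # third-party package, assumed to be installed for this sketch

MAX_ALLOWED_OFFSET_S = 1.0        # illustrative acceptance criterion
NTP_SERVER = "ntp.example.local"  # placeholder for the qualified NTP server


def current_utc_timestamp() -> str:
    # Timestamps recorded in UTC are unambiguous across time zones.
    return datetime.now(timezone.utc).isoformat()


def check_clock_offset() -> None:
    # Compare the local clock with the reference NTP server.
    response = ntplib.NTPClient().request(NTP_SERVER, version=3)
    if abs(response.offset) > MAX_ALLOWED_OFFSET_S:
        # A real system would raise an alarm and record it in the audit
        # trail rather than simply printing a message.
        print(f"Clock offset {response.offset:.3f} s exceeds the limit")
    else:
        print(f"Clock offset {response.offset:.3f} s is within the limit")


if __name__ == "__main__":
    print("Record timestamp:", current_utc_timestamp())
    check_clock_offset()
```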

Original (O). Original data are the first capture of information or a certified ‘true copy’.

The technical implications are that the following should be set up in the BMS:

  • Raw data and metadata not overwritable
  • Data folder configuration: records cannot be deleted
  • Automatic database backup (see the sketch after this list)
  • Server snapshot. 
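
The following is a minimal sketch of an automatic backup of the history database used in the first sketch: the SQLite online-backup API writes a dated copy, and a SHA-256 checksum is stored alongside it so the copy can later be verified as accurate and complete. Paths and naming are illustrative; in practice, the backup schedule and retention are defined in the service level agreement with IT and the supplier.

```python
import hashlib
import sqlite3
from datetime import datetime, timezone
from pathlib import Path


def backup_database(source: str, backup_dir: str) -> Path:
    # Write a dated copy of the database and a checksum file next to it.
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    target = Path(backup_dir) / f"bms_history_{stamp}.db"
    target.parent.mkdir(parents=True, exist_ok=True)
    src, dst = sqlite3.connect(source), sqlite3.connect(target)
    try:
        src.backup(dst)  # consistent online copy via the SQLite backup API
    finally:
        src.close()
        dst.close()
    checksum = hashlib.sha256(target.read_bytes()).hexdigest()
    target.with_name(target.name + ".sha256").write_text(
        f"{checksum}  {target.name}\n", encoding="utf-8"
    )
    return target


print("Backup written to", backup_database("bms_history.db", "backups"))
```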

Procedures should cover:

  • Identification of critical records based on risk approach and data process flow definition
  • Backup and restore service level agreement with IT function and supplier
  • Disaster recovery.

Accurate (A). Accuracy requires that records are error-free and that any edits are documented.

The technical implications are that the following should be set up in the BMS:

  • Complete audit trail configuration (who, when, why, what)
  • Field level equipment calibration
  • Infrastructure qualification.

BMS management procedures should include:

  • System validation
  • Audit trail (periodic and for critical changes) reviews.

Plus (+). These principles require that a complete set of data (including relevant metadata) is present; that data are self-consistent (e.g., through the application of good documentation practices); that data are kept in a durable, permanent, maintainable form throughout the entire data lifecycle; and that data are available and accessible for review or inspection purposes throughout the retention period. Compliance with the GLP archive process (e.g., long-term availability and readability of records) is required.

The technical implications are that the following should be set up in the BMS:

  • Automated data archive (see the sketch after this list)
  • Qualified archive server.
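
As a sketch of what an automated archiving step could look like, the following fragment copies study-related records to an archive location and writes a manifest with checksums and the archiving date, so that completeness can be verified throughout the retention period. The study identifier, file names, and archive path are hypothetical; the actual archive server, media, and retention rules are governed by the facility's archive SOPs.

```python
import hashlib
import json
import shutil
from datetime import datetime, timezone
from pathlib import Path


def archive_study_records(study_id: str, record_paths: list[str],
                          archive_root: str) -> Path:
    # Copy the records into a per-study archive folder and write a manifest.
    archive_dir = Path(archive_root) / study_id
    archive_dir.mkdir(parents=True, exist_ok=True)
    manifest = {
        "study_id": study_id,
        "archived_at_utc": datetime.now(timezone.utc).isoformat(),
        "files": [],
    }
    for path in record_paths:
        source = Path(path)
        target = archive_dir / source.name
        shutil.copy2(source, target)  # preserve file metadata where possible
        digest = hashlib.sha256(target.read_bytes()).hexdigest()
        manifest["files"].append({"name": source.name, "sha256": digest})
    (archive_dir / "manifest.json").write_text(
        json.dumps(manifest, indent=2), encoding="utf-8"
    )
    return archive_dir


archive_study_records("STUDY-0001",
                      ["audit_trail_review.csv", "bms_history.db"],
                      "glp_archive")
```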

BMS management procedures should include:

  • System periodic review
  • Data retention and archive standard operating procedures (SOPs).

For all ALCOA+ requirements, behavioral implications include the following:

  • Test facility management (TFM) promoting quality culture based on data integrity
  • TFM promoting investigation and analysis
  • TFM enabling visibility of errors and misconduct
  • TFM providing appropriate resources to ensure data governance (5)
  • Quality assurance unit conducting audits to determine GLP and data integrity compliance
  • Study personnel trained on data integrity principles
  • Study personnel trained on system-specific SOPs
  • TFM promoting risk-based approach
  • Assessment of the site’s data integrity maturity level.

It is clear that system functionalities and technical and procedural implementations are crucial and represent key tools for achieving data integrity, but they are not enough. Because process knowledge and human factors play important roles within data governance, behavioral implications must also be considered. A risk-based approach to the system and process in scope is of paramount importance to harmonize the GLP requirements with the available technical resources and constraints. This approach results in a dynamic relationship between the technical solutions of the system and the satisfaction of regulatory provisions and requirements. Because this relationship is dynamic, close collaboration among several functions is needed; the quality assurance unit, test facility management, study directors, and the technical experts who configure the system and perform activities on it must continuously interact to sustain compliance.

References

1. OECD, Series on Principles of Good Laboratory Practice and Compliance Monitoring Number 1 (1997).
2. US 21 CFR Part 11, Electronic Records; Electronic Signatures (1997).
3. OECD, Series on Principles of Good Laboratory Practice and Compliance Monitoring Number 15 (2007).
4. OECD, Series on Principles of Good Laboratory Practice and Compliance Monitoring Number 17 (2016).
5. OECD, Draft Advisory Document of the Working Group on Good Laboratory Practice on GLP Data Integrity (2020).
6. ISA, ANSI/ISA-95.00.01-2000, Enterprise-Control System Integration (2000).
7. WHO, Draft Working Document for Comments: Guideline on Data Integrity (2019), Working Document QAS/19.819/Rev.1 (2020).

About the authors

Simone Cossari is a PhD student (University of Turin), simone.cossari@external.merckgroup.com; Erica Aloisio is data integrity specialist; Valerio Capirone is engineering and technology specialist; Raffaele Lasala is quality auditor; and Stefano Simonato is automation specialist, all at RBM S.p.A, Ivrea, Italy, an affiliate of Merck KGaA, Darmstadt, Germany.

Article Details

Pharmaceutical Technology
Vol. 45, No. 8
Aug. 2021
Pages: 30–34

Citation

When referring to this article, please cite it as S. Cossari, E. Aloisio, V. Capirone, R. Lasala, and S. Simonato, “Implementing Data Integrity Compliance in a GLP Test Facility,” Pharmaceutical Technology 45 (8) 2021.