Data Integrity Considerations for Vendor-Generated Data Associated with Analytical Testing

Published in Pharmaceutical Technology, 2021 Outsourcing Resources Supplement, Volume 2021 Supplement, Issue 3, Pages s16-s20 (August 2021).

Sponsors should consider best practices for maintaining data generated during sample analysis and instrument maintenance.

This article summarizes a pharmaceutical industry consensus viewpoint of the current regulations regarding data integrity as applied to analytical test data generated to support regulatory activities. In particular, the focus will be on data generated by external personnel (e.g., a third-party laboratory or instrument service technician), hereafter referred to as a vendor.

There are considerations for data integrity (1–4) when data are generated by a vendor. For example, analytical instrumentation (e.g., chromatographic, spectroscopic, spectrophotometric, thermogravimetric, electrochemical, or microscopy instrumentation) utilized in the pharmaceutical industry generates data not only during analytical sample analysis but also during routine or preventive maintenance, such as instrument calibration or qualification, and during troubleshooting associated with repairs. For preventive maintenance or repairs, data may be generated by company personnel, hereafter referred to as the sponsor, or by vendors. The authors will refer to vendor-generated data, whether generated using the sponsor’s instrumentation or the vendor’s own instrumentation, as “outsourced data”.

Although routine testing may rely on well-established processes that ensure data integrity, pharmaceutical companies depend on many vendors, as well as in-house personnel, for instrument support, and the maintenance/calibration/qualification process may be fully paper based, fully paperless, or a combination of the two (i.e., hybrid documentation). As such, the approaches taken to ensure data integrity during these activities may vary and should be assessed on a case-by-case basis. In addition, the possible modes of remediation may evolve as instrument manufacturers and testing/maintenance vendors develop their approaches and capabilities.

Within this article, data are defined as electronic and/or paper records generated during GxP testing (e.g., release or stability testing) as well as during analytical instrument calibration and/or qualification activities. Outsourced (or third-party) data are data generated on qualified/validated instruments by outside personnel. These include:

  • Data generated by vendors when visiting a sponsor site and generating electronic and/or paper records that are created and saved outside of the sponsor’s network and support regulated activities (e.g., lot release, clinical stability, etc.)
  • Data that support regulated activities generated by a vendor other than at the sponsor site; these data include electronic and/or paper records
  • Data generated by secondary vendors contracted by a primary vendor, which may include electronic and/or paper records (e.g., in the case of hybrid systems).

All expectations should be clearly described in the written agreement (e.g., maintenance contract, quality agreement) in any of these situations.

Data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate (ALCOA). Data management, retention, and archiving should be managed based on documented risk, including the format of the data. If electronic, dynamic data are available, these data or a complete, certified true copy thereof should be maintained.

Dynamic data are formatted to allow for interaction between the user and the record content. With dynamic data, a user may be able to reprocess using different parameters or modify formulas/entries that will alter a calculation result. If it is not possible to maintain the electronic, dynamic data record, a complete static representation of all data including metadata, audit trails, etc., must be maintained. Static data (data that are fixed and allow little or no interaction between the user and the record content) may allow for a more streamlined approach than that required for dynamic data. For instance, because there is no need to monitor changes in data processing or in reported results, in many cases it may be appropriate for the third party to supply only a printed or static electronic report (e.g., a PDF file) of the reported data for archive and retention. For data that were generated in static format, a static archived representation is appropriate if it is a true copy (including all relevant metadata) of the data. In all cases, the original data must be completely reconstructable.
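
To make the true-copy concept concrete, the following minimal Python sketch (illustrative only; the file names are hypothetical) shows one common verification approach: computing a SHA-256 checksum of a static report when it is created and comparing it against the archived copy on retrieval.

```python
import hashlib
from pathlib import Path

def sha256_checksum(path: Path, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 digest of a file in fixed-size chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical file names: hash the exported static report at creation,
# store the digest alongside the archive, and re-verify on retrieval.
original = sha256_checksum(Path("lot_12345_report.pdf"))
archived = sha256_checksum(Path("archive/lot_12345_report.pdf"))
assert original == archived, "Archived copy does not match the original"
```

A matching digest demonstrates that the archived file is bit-for-bit identical to the original; any other change-detection mechanism agreed between sponsor and vendor would serve the same purpose.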

The various scenarios under which outsourced data may be generated and managed are summarized in Table I and discussed in detail below. In each case, key requirements and points to be considered are given. For all three scenarios, the vendor should be approved by the sponsor, and there should be an agreement in place for the services provided. The necessary details of the agreement vary depending on the service provided and the scenario.

Outsourced data scenarios

Scenario A: Vendor-generated data on sponsor instruments using the information technology (IT) infrastructure of the sponsor.

In cases where data are acquired and stored by the vendor through the sponsor’s standard workflows and on the sponsor’s technology infrastructure, the sponsor’s standard data integrity policies and procedures should apply. The accounts and roles utilized by the vendor should be unique, configured to ensure attributability, and specific to the work being performed. The sponsor should ensure that the vendor possesses the appropriate training required for access per the sponsor’s applicable standards and should retain documentation of this assurance.
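
As an illustration of gating vendor access on documented training, the following minimal Python sketch (not from any cited guidance; the record fields are hypothetical placeholders for the sponsor’s own access-management procedures) checks that a request names a unique technician, a scoped role, and current, documented training before access is granted.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class VendorAccessRequest:
    # Hypothetical fields; the actual prerequisites come from the
    # sponsor's own access-management procedures.
    technician_id: str            # unique, named account (attributability)
    role: str                     # scoped to the work being performed
    training_current_until: date  # expiry of the required training
    training_record_ref: str      # pointer to the retained documentation

def may_grant_access(req: VendorAccessRequest, today: date) -> bool:
    """Grant access only to a uniquely identified technician whose
    required training is documented and still current."""
    return bool(req.technician_id) and bool(req.training_record_ref) \
        and req.training_current_until >= today

# Example with placeholder values.
req = VendorAccessRequest("vendor.jsmith", "instrument_qualification",
                          date(2026, 1, 31), "TRN-0042")
print(may_grant_access(req, date.today()))
```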

Scenario B: Vendor-generated data on sponsor instruments using the IT infrastructure of the sponsor and non-standard processes.

This scenario pertains to situations where the instrument hardware, firmware, or software is being accessed or utilized in a non-routine manner (different than it would typically be used to collect or process data). Examples could include using diagnostic mode or calibration mode, accessing the software outside of the network, saving in a different file location, or using a hybrid paper/electronic process. A risk assessment and mitigation strategy should be considered to ensure compliance with ALCOA principles.

It is recognized that some of the previously mentioned examples may require non-standard access. The accounts and roles utilized by the vendor should be unique, configured to ensure attributability, and specific to the work being performed. The sponsor should ensure (through a written agreement, direct training, or other written process) that the vendor has appropriate technical and good documentation training per the sponsor’s applicable standards and should retain this documentation. Data review should follow the sponsor’s standard practices and may include a defined, documented risk-based approach for reviews of vendor-generated data. System-level reviews should include reviews of vendor account access. Vendor-generated data on sponsor instruments should align with the sponsor’s standards for segregation of duties. Controls should be in place to ensure data integrity is appropriately managed (an illustrative access-provisioning sketch follows the list below). In particular:

  • Access needs to be traceable.
  • Specific and unique user roles, permissions, and passwords for access are required.
  • Where appropriate, access should be provided on a temporary basis and subsequently rescinded after a defined period.
  • Controls for remote access, especially by vendors, should be evaluated to determine the effectiveness of physical controls and to ensure appropriate data and system protection.
  • Data security (including the ability to delete/modify data) should be subject to appropriate controls (technical, administrative, or procedural); consider accidental loss or corruption of sponsor data.
  • Changes in the data retention path (relative to that in the qualified configuration) should be subject to change management.
  • Reviews of vendor data (including the audit trail, if applicable) should be defined based on documented risk.
  • Vendors should be granted access at the lowest level that allows the performance of necessary duties.
  • In cases where administrative access is required for the vendor to perform their duties, their ability to make critical changes (such as deleting, editing, or modifying data without detection) should be controlled through technical, administrative, or procedural means.
  • Hybrid paper/electronic processes may be captured in the vendor agreement, but in any case, should be documented and assessed for data integrity risks.
  • Deletions or modifications of data or metadata (e.g., dates) should not obscure previously recorded information and should be attributable.
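
The following minimal Python sketch illustrates several of the controls above in combination: a unique, time-limited vendor account granted at a defined privilege level, with the grant recorded in an append-only log. All names, roles, and the in-memory log are hypothetical stand-ins for a sponsor’s actual access-management and audit systems.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical store: this in-memory list stands in for a protected,
# append-only audit record of access grants.
audit_log: list[dict] = []

def provision_vendor_account(user: str, role: str, hours: int, granted_by: str) -> dict:
    """Create a unique, time-limited vendor account at a defined privilege
    level and record who granted it, to whom, and until when."""
    now = datetime.now(timezone.utc)
    account = {
        "user": user,                         # unique, named account (attributable)
        "role": role,                         # lowest level that permits the task
        "expires": now + timedelta(hours=hours),
        "granted_by": granted_by,
    }
    audit_log.append({"event": "access_granted", "at": now, **account})
    return account

def is_active(account: dict) -> bool:
    """Access lapses automatically after the defined period and must be
    re-granted rather than silently extended."""
    return datetime.now(timezone.utc) < account["expires"]

# Example: an eight-hour diagnostics session for a named vendor technician.
acct = provision_vendor_account("vendor.jsmith", "instrument_diagnostics", 8, "qa.reviewer")
```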

Scenario C: Vendor-generated data stored on the IT infrastructure of the vendor.

One of the critical elements of data integrity for data generated by a third party is where the raw and reported data are stored and to whom the data are accessible. Regardless of where the data reside, they should be retained and archived in accordance with a quality agreement (or equivalent) and in compliance with regulatory requirements. These written agreements should establish sponsor expectations and vendor responsibilities related to data integrity controls for good manufacturing practice (GMP) or good laboratory practice (GLP) records, as well as how communication and auditing of such records should take place. For the vendor, the agreement should ensure the following:

  • Vendor procedures are in place to ensure that for all data, whether paper or electronic, at a minimum the ALCOA requirements are met.
  • Vendor testing records are reviewed by the vendor to ensure compliance with all procedures, specifications, and regulatory requirements.
  • Vendor investigations and vendor internal auditing include data integrity (e.g., calculations, quality of procedures/processes to uncover data integrity issues, documented training with a focus on data integrity and responsibilities for the vendor).

The sponsor should agree to regularly update, review, and communicate the following to the vendor:

  • Any updates to procedures reflecting data integrity practices that support a quality environment between sponsor and vendor
  • Audit/investigation results and reviews of vendor data integrity quality metrics.

The sponsor should clearly communicate that sponsor audits of the vendor will include a focus on data integrity elements and practices, and the sponsor should ensure that audits include assessment and evaluation of data integrity controls in place.

The written agreement should additionally delineate the record retention responsibilities of the two parties and any handoffs between vendor and sponsor at specific milestones. In cases where data are collected using software that the sponsor does not have, the vendor should retain e-records and the software necessary to make them human-readable (including metadata).

A primary concern is that of security of the raw data. Data stored on the IT infrastructure of a third party are inherently less under the control and protection of the sponsor. In cases where the third party retains the original data, it is critical that appropriate expectations and responsibilities are clearly defined in a quality agreement (or equivalent). These agreements should include the following considerations:

  • Because the sponsor may, in some cases, receive only reports of final data (e.g., from sample analysis or from calibration, maintenance, and qualification), an audit for selecting an external vendor should challenge the process from raw data generation through to distribution of final reports to ensure the accuracy and reliability of raw data generated by the vendor. This audit should include a review of the mechanisms used to generate and distribute data summaries and reports.
  • Transparency around the vendor’s IT infrastructure (including the subcontracting of IT infrastructure to cloud-based providers) should be defined in a written agreement. This agreement should include notification of the unauthorized accessing of sponsor data (e.g., through a hacking attack).
  • Responsibilities, including clearly defined ownership and retention requirements/schedules, should be clearly described in the written agreement.
  • Where data are transferred between the sponsor and the vendor site, the written agreement should specify how this is done to ensure adherence to ALCOA+ principles, including management of true copies (an illustrative transfer-verification sketch follows this list).
  • All electronic data (or certified true copy thereof) should be retained and include a means to retrieve/read these data (including metadata such as audit trails, etc.). In particular, the vendor must retain the ability to read data from retired instruments through the retention period of the data (as defined by the sponsor).
  • Electronic data should be maintained in their original format unless otherwise defined/agreed upon with the sponsor.
  • Timing expectations for data retrievability should be clearly described in the written agreement.
  • The written agreement should define who is responsible for backup and archiving features. Any incidents with or changes to data archival, backup, or restoration should be clearly documented in accordance with appropriate procedures and reported to the sponsor within an agreed time period.
  • The written agreement should include expectations on confidentiality of all disclosed information. The written agreement should define who in the vendor organization has access to sponsor data.
  • The written agreement should define the requirements for data review. This should include review against ALCOA+ principles (e.g., metadata such as audit trails, where applicable) and system-level reviews.
  • Any vendors using cloud service providers should be assessed to ensure infrastructure controls are in place, including infrastructure and services for change control, system backup/restore, and data archiving processes.
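
As one illustration of verifying true copies during transfer, the following minimal Python sketch assumes the vendor supplies a simple checksum manifest (a hypothetical CSV with filename and sha256 columns) alongside the data package; the sponsor recomputes each checksum on receipt and flags any mismatch.

```python
import csv
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 digest of one transferred file."""
    h = hashlib.sha256()
    h.update(path.read_bytes())
    return h.hexdigest()

def verify_transfer(manifest: Path, data_dir: Path) -> list[str]:
    """Compare each received file against the checksum the vendor
    recorded at export; return the names of any missing or altered files."""
    failures = []
    with manifest.open(newline="") as f:
        for row in csv.DictReader(f):  # hypothetical columns: filename, sha256
            received = data_dir / row["filename"]
            if not received.exists() or file_digest(received) != row["sha256"]:
                failures.append(row["filename"])
    return failures

# Hypothetical paths; a clean transfer returns an empty list.
problems = verify_transfer(Path("manifest.csv"), Path("received_data"))
```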

Additional considerations for data generated during instrument calibration, qualification, repair, and troubleshooting (applicable to Scenario A, B, or C).

Data integrity controls for analytical instruments and equipment may be challenged during internal or regulatory authority audits and inspections. It is a regulatory expectation (1–4) that the integrity of supporting instrument data is robust and that risks to these data have been adequately mitigated. Some examples where data integrity controls (including appropriate change control) are needed include raw data from calibration and qualification, control of standards (e.g., reference standards, externally calibrated test probes) used during calibration, and management of internal and vendor documentation.

Several aspects of the data lifecycle underpin the accuracy of analytical data. Requirements include demonstration that an instrument can produce accurate results and is under adequate system controls. Adequate system controls relate to documented procedures that support initial qualification, periodic calibration and maintenance, instrument repair and troubleshooting (including change control), and periodic review of the calibration/qualification status of the instrument.

Analytical instrument qualification and/or initial calibration is critical to ensuring that data generated on an instrument are accurate (ALCOA). Data generated using systems other than the one under qualification or calibration (e.g., a thermocouple used to calibrate a chromatography column heater) should be subject to the same controls as other data. Where initial validation/qualification is performed before data integrity concerns have been mitigated, consideration should be given to the potential data integrity risks of the qualification testing. In general, data generated during calibration, qualification/validation, or maintenance activities should comply with all the foregoing requirements. However, there may be scenarios where data are generated outside of the normal workflow. In these cases, deviations from the normal workflow should be evaluated for risk and documented appropriately, particularly for risks that may affect the attributability or accuracy of the resultant data.
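
To illustrate how calibration data might be captured with the attributes ALCOA requires, the following minimal Python sketch records a single calibration reading, including the instrument, the reference standard used, the operator, and a timestamp, and evaluates it against an acceptance tolerance. The field names and the ±0.5 °C tolerance are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class CalibrationPoint:
    """One calibration reading captured with the attributes needed for an
    attributable, contemporaneous record. All field names are hypothetical."""
    instrument_id: str
    reference_standard_id: str  # e.g., a calibrated test probe
    setpoint: float
    measured: float
    performed_by: str
    recorded_at: str

def check_point(setpoint: float, measured: float, tolerance: float,
                instrument_id: str, standard_id: str,
                operator: str) -> tuple[CalibrationPoint, bool]:
    """Record the reading and evaluate it against an acceptance tolerance."""
    point = CalibrationPoint(
        instrument_id, standard_id, setpoint, measured, operator,
        datetime.now(timezone.utc).isoformat(),
    )
    return point, abs(measured - setpoint) <= tolerance

# Hypothetical tolerance of ±0.5 °C for a column-heater setpoint of 40 °C.
point, passed = check_point(40.0, 40.2, 0.5, "HPLC-07-heater", "PROBE-123", "vendor.tech01")
```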

Conclusion

While all three scenarios, as well as many permutations of them, are possible and may be managed, Scenarios A (vendor-generated data on sponsor instruments using IT infrastructure of the sponsor) and C (vendor-generated data stored on the IT infrastructure of the vendor) are strongly preferred from a compliance and risk perspective. If Scenario B (vendor-generated data on sponsor instruments using the IT infrastructure of the sponsor and non-standard processes) is employed, a risk assessment and mitigation strategy should be considered to ensure compliance with ALCOA principles. In any case, the considerations described in this article should be evaluated to achieve compliance.

Acknowledgements

This manuscript was developed with the support of the International Consortium for Innovation and Quality in Pharmaceutical Development (IQ, www.iqconsortium.org). IQ is a not-for-profit organization of pharmaceutical and biotechnology companies with a mission of advancing science and technology to augment the capability of member companies to develop transformational solutions that benefit patients, regulators, and the broader research and development community.

References

1. MHRA, ‘GXP’ Data Integrity Guidance and Definitions (London, UK, March 2018).
2. FDA, Guidance for Industry: Data Integrity and Compliance With Drug CGMP Questions and Answers (Rockville, MD, December 2018).
3. World Health Organization, Guideline on Data Integrity (Geneva, Switzerland, October 2019).
4. Pharmaceutical Inspection Convention/Pharmaceutical Inspection Co-operation Scheme (PIC/S), Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments (Geneva, Switzerland, November 2018).

About the authors

Thomas Cullen*, thomas.f.cullen@abbvie.com, and Cliff Mitchell work in Analytical Research and Development at AbbVie Inc. (North Chicago, IL); Julie Lippke and Joseph Mongillo both work in Analytical Research and Development at Pfizer Inc. (Groton, CT); Koottala S. Ramaswamy works in Global Development Quality at Merck & Co., Inc. (West Point, PA); and Thomas Purdue works in Quality at Boehringer Ingelheim Pharmaceuticals Inc. (Ridgefield, CT); all authors are members of the IQ Consortium.

*To whom all correspondence should be addressed

Article Details

Pharmaceutical Technology
Supplement: Outsourcing Resources
August 2021
Pages: s16-s20

Citation

When referring to this article, please cite it as T. Cullen, C. Mitchell, J. Lippke, J. Mongillo, K.S. Ramaswamy, and T. Purdue, “Data Integrity Considerations for Vendor-Generated Data Associated with Analytical Testing,” Pharmaceutical Technology Outsourcing Resources (August 2021).