The Importance of Quality Data for Regulatory Submissions

Published in: Pharmaceutical Technology, Quality and Regulatory Sourcebook, March 2024 eBook, Volume 2024 eBook, Issue 3
Pages: 35–37

Regulatory submissions are a critical step in bringing new drugs and medical interventions to market, the success of which heavily relies on the quality and fitness of the data presented to the regulatory bodies.

Regulatory submissions are a critical step in bringing new drugs and medical interventions to market, and the success of these submissions heavily relies on the quality and fitness of the data presented within the submission package to regulatory bodies. Recent FDA analysis shows that 32% of study data in submissions had significant issues with data conformity. If a submission is rejected due to non-conformance with study data requirements, it does not progress to the FDA Electronic Submissions Gateway, nor does it enter FDA electronic document rooms, which is where the official FDA review process begins. Of the new molecular entity (NME)/investigational new drug (IND) applications that successfully pass the study data conformance screening, only 50% are approved by FDA on their first submission. There is then a median delay of 435 days to approval following the first unsuccessful submission, which in turn postpones the availability of crucial new medications to patients. The stringent requirements set forth by regulatory bodies such as FDA and the European Medicines Agency (EMA) are not just procedural hurdles; they are essential safeguards to ensure the integrity of the clinical trial process, patient safety, and drug efficacy.

The rapid advancements in medical science and data technology result in both opportunities and challenges in maintaining the highest standards of data quality. As the industry navigates the complexities of clinical trials, the importance of accurate, reliable, and robust data cannot be overstated. Data must accurately reflect the clinical trial’s findings, as inaccuracies can lead to incorrect conclusions about a drug’s safety and efficacy, ultimately affecting patient health and public safety. This is why adherence to standards such as those provided by the Clinical Data Interchange Standards Consortium (CDISC) is essential to ensure that data are consistent, interpretable, and can be efficiently reviewed by regulatory bodies such as FDA. Moreover, complete and comprehensive data are crucial; any gaps can lead to delays of the kind outlined in this article, as well as questions from regulatory bodies that add further delay while they are resolved. Equally important is ensuring that the data remain current and relevant to the specific investigational drug and its intended use, as outdated or irrelevant data can skew the assessment of a drug’s profile. These elements collectively uphold the integrity of data, which is fundamental in the decision-making process of drug approval and, ultimately, in safeguarding public health.
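To make the idea of study data conformance concrete, the following Python sketch checks a small, entirely hypothetical demographics dataset against a few illustrative SDTM-style expectations: required variables present and non-blank, and reference start dates in ISO 8601 format. The variable names follow common CDISC conventions, but the rule set here is a simplified example, not an official conformance specification.

```python
import re

# Hypothetical SDTM-style demographics (DM) records; field names follow
# common CDISC conventions, but the values and rules are illustrative only.
records = [
    {"STUDYID": "STUDY01", "USUBJID": "STUDY01-001", "RFSTDTC": "2023-04-01", "SEX": "F"},
    {"STUDYID": "STUDY01", "USUBJID": "STUDY01-002", "RFSTDTC": "2023/04/02", "SEX": "M"},
    {"STUDYID": "STUDY01", "USUBJID": "",            "RFSTDTC": "2023-04-03", "SEX": "U"},
]

REQUIRED = ("STUDYID", "USUBJID", "RFSTDTC", "SEX")
ISO_DATE = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # ISO 8601 calendar date

def conformance_findings(rows):
    """Return a human-readable finding for each rule violation."""
    findings = []
    for i, row in enumerate(rows, start=1):
        for var in REQUIRED:
            if not row.get(var):  # variable missing or blank
                findings.append(f"record {i}: required variable {var} is missing or blank")
        dtc = row.get("RFSTDTC", "")
        if dtc and not ISO_DATE.match(dtc):
            findings.append(f"record {i}: RFSTDTC '{dtc}' is not ISO 8601 (YYYY-MM-DD)")
    return findings

for finding in conformance_findings(records):
    print(finding)
# → record 2: RFSTDTC '2023/04/02' is not ISO 8601 (YYYY-MM-DD)
# → record 3: required variable USUBJID is missing or blank
```

Running checks of this kind early, before a submission package is assembled, surfaces the non-conformance issues that would otherwise stop a submission at the study data screening stage.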

Challenges in ensuring data quality

Maintaining data quality is fraught with challenges. These can range from technological limitations in data capture and storage to human errors in data entry and analysis. The complexity of clinical trials, involving multiple sites and varying patient populations, adds another layer of difficulty. The following highlights the main issues each of these challenges can bring:

  • Complexity of clinical trials. Modern clinical trials are inherently complex, often spanning multiple countries and involving diverse patient populations. This complexity can lead to inconsistencies in data collection methods and difficulties in aggregating and standardizing data across different sites.
  • Technological challenges. While technological advancements have streamlined data collection and analysis, they also present challenges. Ensuring compatibility between different data systems and protecting against data breaches and corruption are significant concerns. Additionally, as technology evolves, so must the methods and protocols for data management to stay current and effective.
  • Human factor. The role of human error should not be underestimated. Misinterpretation of data, incorrect data entry, and failure to follow protocols can all lead to compromised data quality. This underscores the importance of comprehensive training and rigorous oversight throughout the data collection and analysis process.

Each of these challenges requires a tailored approach to mitigate risks and ensure the integrity and quality of data in regulatory submissions.
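One long-standing mitigation for data-entry error is double data entry with reconciliation: two operators key the same case report form independently, and any disagreement is flagged for review rather than silently accepted. The sketch below illustrates the idea with hypothetical field names and values; it is a minimal example, not a production data management tool.

```python
# Two independent transcriptions of the same (hypothetical) case report form.
entry_a = {"SUBJID": "001", "WEIGHT_KG": "72.5", "VISIT": "SCREENING"}
entry_b = {"SUBJID": "001", "WEIGHT_KG": "27.5", "VISIT": "SCREENING"}  # transposed digits

def reconcile(first, second):
    """Return the fields whose independently entered values disagree,
    mapped to the pair of conflicting values."""
    return {
        field: (first.get(field), second.get(field))
        for field in sorted(set(first) | set(second))
        if first.get(field) != second.get(field)
    }

print(reconcile(entry_a, entry_b))
# → {'WEIGHT_KG': ('72.5', '27.5')}
```

The discrepancy report would then be resolved against the source document, catching transcription slips that neither operator would notice alone.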


About the authors

Claude Price is head of Clinical Data Management at Quanticate.



When referring to this article, please cite it as Price, C. The Importance of Quality Data for Regulatory Submissions. Pharmaceutical Technology®/Pharmaceutical Technology Europe® Quality and Regulatory Sourcebook eBook (March 2024).