Quanticate Blog

4 Ways to Improve Clinical Data Quality & Control in Modern Trials

Written by Clinical Data Management Team | Tue, Aug 20, 2024

 

High-quality clinical data is the foundation of reliable clinical trials, enabling drug developers and healthcare professionals to make informed decisions and accelerate the development of investigational products. As clinical trials continue to shift toward digital platforms, ensuring the integrity and accuracy of data collected from various sources becomes increasingly critical.

Effective Clinical Data Management (CDM) ensures that data collection, entry, validation, and reporting are performed effectively, so that a trial yields high-quality, credible, and statistically sound data. Accurate and timely data can help healthcare professionals identify patterns, predict outcomes, and improve patient outcomes. High-quality clinical data not only positively impacts the cost and efficiency of a trial but also enhances the regulatory submission and review experience, resulting in more efficient development of the Investigational Product.

For Quality Control, drug developers traditionally follow guidelines based on Good Clinical Practice (GCP), the international standard of ethical and scientific quality for designing, conducting, recording, and reporting clinical trials with human subjects. Within data management, reviewing and maintaining data accuracy and quality is a dynamic process during the conduct phase of the study. However, from the very initial study set-up phase, clinical data managers work towards producing quality data by implementing the specifications required for standardised database design, producing structured datasets aligned to protocol requirements. Additional study documents such as CRF Completion Guidelines (CCGs) and/or Data Entry Guidelines (DEGs) are created to ensure data are entered consistently by all entry staff.

The Evolution of Clinical Data Management

The shift from paper-based systems to Electronic Data Capture (EDC) changed the way we look at the quality measurements of CDM activities. In the paper world, the quality of the clinical data collected was essentially the quality of the transcription work performed in transferring data from paper to a database, so data quality was primarily evaluated on transcription accuracy. With EDC systems, transcription errors are eliminated, and the focus has shifted to ensuring the quality of the data collection tools and processes. The Quality Control (QC) of paper versus database had a set sampling standard of √N+1 subjects or 20 subjects, whichever was smaller, plus 100% QC of critical variables. The widely agreed acceptable error rates were 0.05% for random-selection QC and 0% for critical data QC. These thresholds were no longer necessary once EDC enabled sites to enter data directly and transcription was no longer needed. However, data management teams remain involved in many efforts to prepare data for appropriate analysis and submission, and the growing complexity of data sources, including ePRO, eCOA, EMR, and wearables, demands a robust and integrated approach to data management.
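To make those legacy thresholds concrete, here is a minimal Python sketch of the paper-era sampling rule and error-rate checks described above; the function names and example figures are illustrative, not part of any formal standard.

```python
import math

def qc_sample_size(n_subjects: int) -> int:
    # Paper-era rule: sample sqrt(N) + 1 subjects, or 20, whichever is smaller.
    return min(math.ceil(math.sqrt(n_subjects)) + 1, 20)

def passes_random_qc(errors_found: int, fields_checked: int) -> bool:
    # Random-selection QC passed if the error rate was at or below 0.05%.
    return errors_found / fields_checked <= 0.0005

def passes_critical_qc(errors_found: int) -> bool:
    # Critical variables were checked 100% with a 0% acceptable error rate.
    return errors_found == 0

print(qc_sample_size(400))        # min(sqrt(400) + 1, 20) = 20 subjects
print(passes_random_qc(3, 4000))  # 0.075% error rate -> False
print(passes_critical_qc(0))      # True
```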

The quality of the efforts involved in developing data collection tools/eCRFs and cleaning the collected data can directly impact the quality of the data collected. Thus, it is important for organisations to manage the quality of the workstreams their teams are involved in, especially as increasing streams of data are being collected from sources such as eSource, ePRO/eCOA, EMR/EHR, wearables, mHealth, and AI-based tools for adherence tracking. The traditional thinking of an error rate is no longer an ideal way to manage quality expectations; rather, quality must be nurtured as a habit or culture within teams handling data. Teams must also take a qualitative approach to measuring quality, rather than relying on a quantitative sample QC of the effort.

Implementing effective data quality management systems helps improve data accuracy and reliability. Regular data trend analysis runs, programs to identify outliers and deviations, and critical data review throughout the study conduct are a few of the practices CDM can follow as part of validation activities to ensure data quality is maintained. Programming listings for data review, running electronic checks to validate the data in the system (in real time in the case of EDC), and reviewing reconciliation listings are the most common tasks CDM performs to monitor and maintain data quality throughout the study. Further, using risk-based monitoring (RBM) approaches to focus resources on the most critical data and identify errors and inconsistencies more effectively will enhance data quality throughout the CDM process.
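As a minimal sketch of the kind of outlier and deviation check mentioned above, the following Python snippet flags implausible vital-sign values for a data review listing; the column names, plausibility range, and data are hypothetical, not a real EDC export.

```python
import pandas as pd

# Hypothetical systolic blood pressure extract from an EDC system.
vitals = pd.DataFrame({
    "SUBJID": ["001", "002", "003", "004"],
    "VISIT":  ["Week 4"] * 4,
    "SYSBP":  [118, 121, 240, 115],   # mmHg
})

# Two common trend/outlier checks: a protocol-style plausibility range,
# and values more than 3 standard deviations from the study mean.
plausible = vitals["SYSBP"].between(60, 200)
zscore = (vitals["SYSBP"] - vitals["SYSBP"].mean()) / vitals["SYSBP"].std()
review_listing = vitals[~plausible | (zscore.abs() > 3)]
print(review_listing)  # subject 003 appears on the data review listing
```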

 

Key Strategies for Improving Clinical Data Quality

 

Below are four key strategies for improving clinical data quality, which should also help to instil a quality culture:

1) Effective Design and Review of Data Collection Tool (DCT) Design Specifications

Clinical trials are, in essence, an expensive method of collecting data. If we do not design the tool to collect the data properly, we create a gap which cannot be filled, piling fixes on top of fixes and forcing teams into additional effort to ensure data quality, which impacts cost and timelines. Specifications are normally reviewed, but how effectively are we looking at the appropriateness of the design from the site’s point of view for EDC and from the patient’s point of view for ePRO? With the advent of policies like the 21st Century Cures Act in the US, patient engagement is highly regarded as it helps data quality. Thus, we should be looking at more patient-centric data collection specifications which can motivate sites and patients to provide accurate answers to the questions asked in the respective Case Report Forms (CRFs)/electronic Case Report Forms (eCRFs). For example, a patient suffering from muscular dystrophy would be more interested in assessing how well they can do their daily chores or play with their grandchildren than in a 6-step walking test to be reported every day.

Using a validated, user-friendly EDC system that is compliant with regulatory requirements and aligned with industry best practices plays a vital role in designing an efficient eCRF and in setting up the integrations clinical data managers require to serve quality requirements and fit-for-purpose data needs. Organisations should prioritise patient-centric design to improve the quality of responses and overall data integrity. For example, integrating patient feedback into the design process can enhance engagement and accuracy. Furthermore, data collection should target only the data points relevant to the study objectives, which aids higher quality data collection.

 

2) Automations and Integrations

Using clinical data management systems that are interoperable with other health systems (e.g., EHRs, laboratory information systems) enables seamless data exchange and reduces manual transcription errors. Using APIs to integrate disparate data sources enables automated data flows, minimising the need for manual data handling, which can introduce errors.
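The Python sketch below illustrates what such an automated, API-driven data flow might look like: lab results are pulled from a source system’s REST API and pushed to an EDC ingestion endpoint without manual transcription. Both URLs, the token, and the payload shape are hypothetical placeholders, not a real vendor API.

```python
import requests

LAB_API = "https://lab.example.com/api/v1/results"   # hypothetical source system
EDC_API = "https://edc.example.com/api/v1/import"    # hypothetical EDC endpoint
HEADERS = {"Authorization": "Bearer <token>"}        # placeholder credential

def transfer_lab_results(study_id: str) -> None:
    # Pull lab records for the study from the source system.
    response = requests.get(LAB_API, params={"study": study_id},
                            headers=HEADERS, timeout=30)
    response.raise_for_status()
    for record in response.json():
        # Push each record system-to-system: no manual transcription step.
        requests.post(EDC_API, json=record,
                      headers=HEADERS, timeout=30).raise_for_status()
```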

Reducing manual intervention in data collection is considered the future, and solutions which enable EHR/EMR integrations play an important role here. Using medical-grade devices to collect data directly from patients via wearables and mHealth tools helps calibrated data flow into integrated EDC databases with minimal or no intervention, and AI-based tools can collect medication adherence data without human involvement. In addition, integrating eCOAs, central lab APIs, medical coding, imaging, and safety data workflows with EDCs supports centralised data collection with minimal manual intervention in data transfer from varied sources, and is currently a preferred set-up for many drug developers. Using EDC solutions with associated tools like eConsent, eCOA/ePRO, imaging, and safety gateways within the same architecture also helps save the time and effort of setting up and monitoring integrations. Overall, ensuring the data flow has minimal manual intervention creates opportunities for better quality data, and this seamless flow across platforms helps ensure that the data is accurate and fit for purpose.

3) Data Standardisation

Implementing data standards early in the project lifecycle, such as CDISC-compliant eCRFs and standard mapping algorithms, can streamline the data management process and improve the consistency of data across multiple studies. Automating the steps that convert collected data to standards enhances quality as well as efficiency. The process runs from developing CDISC-compliant eCRFs to implementing standard mapping algorithms earlier in the project lifecycle than usual, so that SDTM requirements during study conduct are addressed seamlessly and with improved quality. This streamlines downstream statistical programming requirements and makes them more efficient, accurate, and consistent across multiple data releases within the same study or across a program or portfolio of studies.
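As a hedged illustration of one such standard mapping step, the following Python sketch renames raw eCRF fields to SDTM-style Vital Signs (VS) domain variables; the raw field names, study identifier, and data are hypothetical, and a real mapping would cover many more variables and controlled terminology checks.

```python
import pandas as pd

# Illustrative raw eCRF extract; source field names are hypothetical.
raw_vs = pd.DataFrame({
    "subject":    ["001", "001"],
    "visit_name": ["Screening", "Week 4"],
    "weight_kg":  [72.4, 71.8],
})

# Map collected fields to SDTM VS variables and add required identifiers.
vs = raw_vs.rename(columns={"subject": "USUBJID",
                            "visit_name": "VISIT",
                            "weight_kg": "VSORRES"})
vs["STUDYID"] = "STUDY-001"   # hypothetical study identifier
vs["DOMAIN"] = "VS"
vs["VSTESTCD"] = "WEIGHT"
vs["VSORRESU"] = "kg"
print(vs[["STUDYID", "DOMAIN", "USUBJID", "VISIT",
          "VSTESTCD", "VSORRES", "VSORRESU"]])
```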

4) Continuous Training and Knowledge Sharing

We all know that less human intervention brings more quality, as it reduces the chance of errors; however, planning the automation and integration to support the goals set is just as important. System set-up must ensure the people involved have a greater, broader, and deeper understanding of the end-to-end process flow. Generic and study-level training has become just an onboarding routine, yet developing a culture of quality requires ongoing education and training. Developing comprehensive understanding through effective training is key to making teams deliver ‘first time quality’. Training should focus on effective study set-up and conduct, conceptualised from a blend of technical and clinical knowledge, ensuring that teams are well prepared to maintain high data quality standards. Refresher training should be provided whenever there is an amendment to the protocol, eCRF, or any relevant study document impacting CDM.

Data management teams should be encouraged to develop skills in data analytics, enabling them to better identify trends and outliers in the data that could indicate quality issues. An effective strategy for measuring the success of training and on-the-job mentoring could go a long way in ensuring the quality of data collection. Organisations should also encourage knowledge-sharing platforms within their infrastructure, enabling teams to create various communities of learning.

 

Quality Control and Assurance in Clinical Trials

Quality control (QC) ensures that data is accurate and reliable throughout the clinical trial process. QC activities include real-time data checks, reconciliation listings, and regular audits. Meanwhile, quality assurance (QA) provides a framework for compliance with industry standards, such as Good Clinical Practice (GCP), and ensures that data management processes are consistent across studies.

To maintain data quality, organisations need to invest in data governance, data management, and data analysis and reporting tools which help continuously monitor and improve data quality. Efficient management of clinical data quality ensures the integrity, accuracy, and consistency of data from entry into the CRF to the final datasets reported in the final CSR.

From a QA perspective, some of the data management metrics considered when evaluating quality are the percentage of manually raised queries, the percentage of database errors, the time taken from last patient last visit (LPLV) to database lock, and the number of database unlock instances.
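A minimal Python sketch of how these metrics might be computed from study-level counts; all figures, names, and dates below are hypothetical.

```python
from datetime import date

# Hypothetical study counts and milestone dates.
manual_queries, total_queries = 120, 800
database_errors, fields_entered = 4, 250_000
lplv, db_lock = date(2024, 3, 1), date(2024, 4, 12)
unlock_instances = 1

pct_manual_queries = 100 * manual_queries / total_queries     # 15.0%
pct_database_errors = 100 * database_errors / fields_entered  # 0.0016%
days_lplv_to_lock = (db_lock - lplv).days                     # 42 days

print(pct_manual_queries, pct_database_errors,
      days_lplv_to_lock, unlock_instances)
```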

 

Conclusion

In the digital era, maintaining clinical data quality requires a comprehensive approach that combines effective design, automation, standardisation, and continuous education. By fostering a culture of quality and leveraging technological advancements, organisations can ensure that their clinical trial data is both reliable and actionable, ultimately contributing to better patient outcomes and more efficient drug development.


Quanticate’s Clinical Data Management team is dedicated to ensuring high quality clinical data and has a wealth of experience in data capture, processing, and collection tools. Our team offers flexible and customised solutions across various EDC platforms. If you would like more information on how we can assist your clinical trial, submit an RFI.