Realizing the promise of electronic data capture in clinical trials
Praveen Dass discusses why capturing accurate and timely data is one of the critical elements that determine the success of a clinical trial.
Capturing accurate and timely data is one of the most critical elements in determining the success of a clinical trial.
Clinical Trial Management Systems (CTMS) are integral to every clinical trial. Selecting the right CTMS can help to address inefficiencies on the operational side of research, such as clinical trial planning, preparation, performance and reporting. In addition, there is a growing need to address the complex process of electronic data capture (EDC) implementation, as more and more pharma and biopharma sponsors start to recognise the potential opportunities that exist with EDC-CTMS integration.
EDC systems (software applications that allow users to enter information into a database over the internet) are increasingly used to help medical device and pharmaceutical companies achieve maximum efficiency when it comes to entering data, structuring a database and conducting analysis for clinical trials. These systems provide the tools and the process infrastructure necessary to achieve the required data quality, as well as process scalability.
EDC technology has evolved over the last decade, easing earlier concerns about poor data integrity caused by human error and about connectivity problems stemming from less reliable IT systems. Hospitals all over the world are now better equipped, staff are far more comfortable with electronic systems and stakeholders are more aware of the time that can be saved using EDC. However, as the medical industry embraces new technologies and innovations, including electronic record keeping, the volume of information collected before, during and after clinical trials continues to grow. As such, comprehensive data collection and efficient management is now becoming a priority for pharmaceutical companies and CROs.
Addendum E6 revision 2 (R2) & Risk Based Monitoring (RBM)
Since the development of the International Council for Harmonization (ICH) Good Clinical Practice (GCP) guidelines, the scale, complexity and cost of clinical studies have increased. The E6 R2 addendum came into effect in Europe on 14 June 2017 and includes several topical GCP inspection areas.
The guideline was introduced to encourage the implementation of more efficient approaches to clinical trial design, conduct, oversight, recording and reporting while continuing to ensure human subject protection and data integrity. This is likely to lead to greater adoption of EDC technology. As the industry embraces new opportunities to increase efficiency during the management of clinical trial data through remote and risk-based monitoring (RBM), the need for sponsor oversight of both systems and data increases.
RBM was designed to identify incorrect data and to predict and mitigate risks before they materialise. It is now being used more frequently during clinical studies and is set to be a fundamental change for the industry. RBM is made possible by EDC, since most of the data analyzed comes from the EDC system in real time. This addendum to the GCP incorporates the use of RBM approaches to help companies ensure the safety and quality of clinical trials.
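To make this concrete, the short Python sketch below illustrates one way an RBM workflow might consume metrics pulled from an EDC system, flagging sites whose query or missing-data rates exceed a threshold. The field names, thresholds and SiteMetrics structure are illustrative assumptions, not any specific vendor's API.

```python
# Hypothetical illustration of a risk-based monitoring (RBM) check driven by
# metrics pulled from an EDC system. Field names, thresholds and the
# SiteMetrics structure are assumptions for this sketch.

from dataclasses import dataclass

@dataclass
class SiteMetrics:
    site_id: str
    forms_entered: int    # eCRF pages submitted
    open_queries: int     # unresolved data queries
    missing_fields: int   # required fields left blank
    total_fields: int     # required fields expected

def risk_flags(site: SiteMetrics,
               query_rate_limit: float = 0.10,
               missing_rate_limit: float = 0.05) -> list:
    """Return human-readable risk flags for a site, if any."""
    flags = []
    if site.forms_entered and site.open_queries / site.forms_entered > query_rate_limit:
        flags.append(f"{site.site_id}: high query rate")
    if site.total_fields and site.missing_fields / site.total_fields > missing_rate_limit:
        flags.append(f"{site.site_id}: high missing-data rate")
    return flags

# Example: flag sites a monitor might prioritise for review.
sites = [
    SiteMetrics("SITE-001", forms_entered=120, open_queries=20,
                missing_fields=5, total_fields=600),
    SiteMetrics("SITE-002", forms_entered=80, open_queries=2,
                missing_fields=1, total_fields=400),
]
for s in sites:
    for flag in risk_flags(s):
        print(flag)
```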
This move will undoubtedly require a shift in mindset for quality assurance (QA) and monitoring teams. It is evident that this new addendum requires a concerted educational effort by many organizations to allow their workforce to adopt a new way of working. Human error can have many repercussions in clinical studies, which could ultimately delay a product to market, or stop it getting there at all.
Making the transition to EDC
The advent of EDC technologies is shaping the clinical trials data management landscape, offering many benefits for the industry.
Adoption of EDC was initially slow. However, as users began to understand what could be achieved with EDC, and the limits of paper case report forms (CRFs) and standard databases, a stronger case for investment was built. The last few years have seen a marked shift in the uptake of EDC, with paper CRFs now only being considered for specific requirements. The benefits of EDC, shown across many studies, include:
- EDC has been shown to cut pre-study preparation time by an average of 41%. Electronic case report form (eCRF) templates are easily modified to suit each new study, saving the time that would otherwise be spent designing and producing paper CRFs.
- Using EDC, data is collected and entered into the data collection tool only once. With a paper system, data must be entered first into a case report form and then into an electronic system by a data entry group, which not only increases processing time but also risks transcription errors.
- EDC allows data cleaning to take place straight away and doesn’t require intensive hands-on work from a data management group for processing. Unlike paper studies, in which the data management group executes logic checks against data collected weeks or months earlier, an EDC system executes its logic checks as the site enters and submits data, allowing it to be cleaned in real time (a minimal sketch of such a check follows this list). This is one of the major efficiencies of an EDC system.
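To illustrate the kind of logic check described above, the Python sketch below raises queries the moment a record is submitted. The rules, field names and plausibility ranges are hypothetical examples, not any particular EDC system's check library.

```python
# Minimal sketch of an edit check run as soon as a site submits a record.
# Field names and plausibility rules are hypothetical; in practice these
# checks are defined during the study build.

from datetime import date

def check_visit_record(record: dict) -> list:
    """Return a list of query messages for a single eCRF record."""
    queries = []
    sbp = record.get("systolic_bp")
    if sbp is not None and not (60 <= sbp <= 250):
        queries.append("Systolic BP out of plausible range (60-250 mmHg)")
    visit = record.get("visit_date")
    consent = record.get("consent_date")
    if visit and consent and visit < consent:
        queries.append("Visit date is before informed consent date")
    return queries

# A site submits a record; queries are raised immediately rather than weeks
# later during paper CRF transcription and batch cleaning.
entry = {"systolic_bp": 300,
         "consent_date": date(2018, 3, 1),
         "visit_date": date(2018, 2, 20)}
for q in check_visit_record(entry):
    print(q)
```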
Ultimately, following proper system selection and development, as well as good study management, EDC allows users quick access to clean data with low operational costs. It has been reported that switching to EDC saves on average 30% of the time it takes to conduct a clinical trial.
EDC database design
Some up-front investment of time and resources is needed to ensure that each trial database is thorough. If companies do not make this investment, there is a risk that changes will be required later, which will be costly and could have broader operational implications.
Most EDC systems come with a standard suite of reports; most vendors also provide custom reporting. As with database development, the design of reports should be well considered and clearly specified at the beginning of the study.
It is important to consider the format in which data is collected so that it can be statistically interpreted or processed in third-party software once it leaves the EDC system. Implementing a consistent data collection methodology includes standardising the definitions of the data collected across multiple sites.
EDC systems must be able to follow Clinical Data Interchange Standards Consortium (CDISC) standards and produce datasets in line with Clinical Data Acquisition Standards Harmonization (CDASH) guidelines so that data is ready for later statistical analysis. This saves time, as non-CDASH data must be re-worked to meet a CDISC standard for regulatory submission. Using a greater number of standard designs during study builds not only helps with the efficiency and quality of reporting but also significantly reduces the time it takes to build databases. While CDASH standards continue to improve, many organizations still find it a challenge to keep to them because requirements differ between studies, and even between clinical study teams within the same organization.
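As a rough illustration of what standardisation at export time might look like, the Python sketch below re-maps study-specific field names to standardised, CDASH-style vital signs variables. The mapping, field names and variable choices are assumptions for illustration only, not an authoritative CDASH specification.

```python
# Illustrative sketch of re-mapping study-specific field names to standardised
# variable names at export time. The target names are modelled loosely on
# CDASH-style vital signs variables; treat the exact mapping as an assumption.

FIELD_MAP = {
    "sys_bp": ("SYSBP", "mmHg"),   # study field -> (standard test code, unit)
    "dia_bp": ("DIABP", "mmHg"),
    "pulse":  ("PULSE", "beats/min"),
}

def to_standard_rows(subject_id: str, raw: dict) -> list:
    """Flatten a raw eCRF record into one standardised row per measurement."""
    rows = []
    for field, value in raw.items():
        if field in FIELD_MAP:
            testcd, unit = FIELD_MAP[field]
            rows.append({
                "USUBJID": subject_id,
                "VSTESTCD": testcd,   # which vital sign was measured
                "VSORRES": value,     # result as originally collected
                "VSORRESU": unit,     # original unit
            })
    return rows

print(to_standard_rows("STUDY01-0001", {"sys_bp": 128, "dia_bp": 82, "pulse": 71}))
```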
Looking to the future
Cloud-based EDC systems have been shown to significantly reduce development cycles and time to approval compared with traditional paper CRF processing, and the industry continues to move towards wider adoption of EDC methodology. There are a number of EDC options in the marketplace; user experience is key, however, and providers are focusing on making their interfaces as easy to use as possible to streamline staff training.
It is likely we will see integration across various platforms with EDC to give data a single source of truth. We are seeing this today with the use of wearables in clinical studies for data collection, as well as electronic patient-reported outcome (ePRO) solutions, where patients enter data themselves directly into a tablet or mobile device. This reduces late patient data entry and allows near-continuous access to patient data via online tools.
Because of these emerging technologies, clinical data managers need to consider how the EDC platform can integrate data from a mobile device, and what analysis is required to identify which pieces of data can be trusted and used. Statistics and data modelling are often needed to process this raw data, and an understanding of the protocol and likely error sources is required before processing of this nature can be performed, which typically calls for statistical support.
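As a loose sketch of such an integration, the Python example below folds a hypothetical wearable payload into provenance-tagged observations and drops readings the device itself marked as poor quality. The payload layout, quality flag and field names are assumptions, not a real device or EDC API.

```python
# Hedged sketch of normalising ePRO / wearable payloads into one common record
# format so the EDC platform keeps a single source of truth. The payload
# layout, quality flag and field names are assumptions.

import json
from datetime import datetime, timezone

def normalise_device_payload(subject_id: str, payload: str) -> list:
    """Convert a device JSON payload into provenance-tagged observations."""
    data = json.loads(payload)
    observations = []
    for reading in data.get("readings", []):
        # Keep only readings the device itself marked as good quality.
        if reading.get("quality") != "good":
            continue
        observations.append({
            "subject_id": subject_id,
            "source": data.get("device_id", "unknown-device"),
            "timestamp": reading["time"],
            "measure": reading["type"],
            "value": reading["value"],
            "ingested_at": datetime.now(timezone.utc).isoformat(),
        })
    return observations

payload = json.dumps({
    "device_id": "wearable-123",
    "readings": [
        {"time": "2018-05-01T08:00:00Z", "type": "heart_rate", "value": 62, "quality": "good"},
        {"time": "2018-05-01T08:00:01Z", "type": "heart_rate", "value": 0, "quality": "poor"},
    ],
})
print(normalise_device_payload("STUDY01-0001", payload))
```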
As clinical studies incorporate these additional sources, the volume and frequency of data being captured has increased dramatically. For example, a trial in which a patient visited once a week and had a single pulse measurement taken may now instead collect per-second heart rate readings, where the protocol includes a heart monitoring device (such as a commercially available sport tracking device).
Filtering and processing the raw data is required to produce clinically-useful information in reduced volumes. Continuing with the example, per-second values may be unhelpful, but knowing average resting vs active rates, along with any times of very low or very high readings, can inform efficacy and adverse event reporting. To support this potentially unlimited real-time data, technology stacks need to be able to handle storing and analyzing ‘Big Data’ in clinical trials.
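A minimal worked example of that reduction, assuming an illustrative 100 bpm activity cut-off and 40/180 bpm alert bounds (neither taken from any protocol), might look like this in Python:

```python
# Worked sketch of reducing per-second heart-rate readings to a few clinically
# useful summaries: average resting vs active rate, plus any very low or very
# high values. The 100 bpm cut-off and 40/180 bpm alert bounds are assumptions.

from statistics import mean

def summarise_heart_rate(readings: list,
                         active_threshold: int = 100,
                         low_alert: int = 40,
                         high_alert: int = 180) -> dict:
    resting = [r for r in readings if r < active_threshold]
    active = [r for r in readings if r >= active_threshold]
    return {
        "n_readings": len(readings),
        "avg_resting": round(mean(resting), 1) if resting else None,
        "avg_active": round(mean(active), 1) if active else None,
        "alerts": [r for r in readings if r < low_alert or r > high_alert],
    }

# One day of per-second data would hold ~86,400 values; a short sample here.
sample = [58, 61, 60, 115, 122, 118, 62, 185, 59]
print(summarise_heart_rate(sample))
```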
Final thoughts
With the advent of the cloud in clinical trials, as well as the increase in connected devices and technological advancements, the ability to analyze large amounts of data is increasingly important. Technology to address this is continuously evolving as new projects, products and approaches are launched. Some update existing methodology, while others apply the latest research to revisit the core of current ‘Big Data’ offerings and replace them with faster, more flexible and scalable solutions.
Outsourcing the data management and analysis of a study to a CRO with a comprehensive EDC system can mitigate a lot of the challenges for companies looking to adopt these new techniques.