The importance of biometric functions in improving data quality in clinical trials

Reducing the costs associated with clinical trials is one of the pharmaceutical industry’s core challenges. Running a clinical trial is incredibly expensive, and if a compound fails in the late phases, it can be catastrophic for its sponsor company.

One of the areas where efficiencies can be found is in clinical data management systems and processes. Better management leads to higher-quality data, which can reduce timescales and prevent costly rework during the reporting stages.

Data quality can be further improved by using statistical analysis and programming to identify patterns of data issues that standard data processing techniques would not normally detect. Whilst traditional methods may identify a single atypical data point, Data Quality Oversight methods identify atypical centres, countries, regions and patients. Ensuring that high-quality data have been collected substantially reduces the risk of a delay to a regulatory submission, since quality data are required to ensure the integrity of the trial results, a requirement of ICH E6 (R2).
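As a simple illustration of this kind of centralised statistical check, the sketch below flags sites whose mean value for a monitored parameter sits well away from the other sites, using a robust z-score. It is a minimal Python sketch on simulated data; the column names (site_id, sbp) and the threshold of 3 are hypothetical choices, not a description of any particular Data Quality Oversight implementation.

```python
# Minimal sketch: flag sites whose site-level mean is atypical relative to
# the other sites. Column names ("site_id", "sbp") are hypothetical.
import numpy as np
import pandas as pd

def flag_atypical_sites(df, value_col, site_col="site_id", z_threshold=3.0):
    """Per-site summary with a robust z-score and an 'atypical' flag."""
    site_means = df.groupby(site_col)[value_col].mean()
    centre = site_means.median()
    mad = (site_means - centre).abs().median()        # median absolute deviation
    robust_sd = 1.4826 * mad if mad > 0 else site_means.std()
    z = (site_means - centre) / robust_sd
    return pd.DataFrame({"site_mean": site_means,
                         "robust_z": z,
                         "atypical": z.abs() > z_threshold})

# Simulated systolic blood pressure data for 20 sites, with site 7 made aberrant
rng = np.random.default_rng(42)
records = [{"site_id": site, "sbp": rng.normal(130 + (25 if site == 7 else 0), 15)}
           for site in range(1, 21) for _ in range(30)]
print(flag_atypical_sites(pd.DataFrame(records), "sbp"))
```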

Here, Karen Ooms, executive vice president and head of statistics at Quanticate, discusses the main considerations for managing and reporting clinical data, how centralised data systems can reduce complexity, the innovation that is driving a change in clinical data management approaches, and the importance of statistical consultancy.

Statistical consultancy
Statistics is regarded as one of the core disciplines of pharmaceutical research and relies on the collection of objective data to demonstrate clinical effect.
The statistician is responsible for supervising data collection so that it provides meaningful and measurable quantities; for coding and manipulating data for a suitable analysis, selected from a wide array of possible options; and for interpreting data with an appreciation of chance. They bring an objective approach, consistent with their training (for example, they might rigidly insist on predefined ‘hypotheses’) which, coupled with the knowledge and optimism of clinicians and scientists, provides a framework for solid scientific advance.

Importantly, statistical consultancy influences a sponsor’s clinical trial in several ways. The service offers independent, external statistical advice, and can cover all areas from optimising study design, procedures and strategies, to the analysis of study data and informed interpretation of results. It is therefore recommended that statistical consultancy advice is sought from the inception of a clinical trial.

Ultimately, the statistical consultant aims to help resolve any challenges and increase the chances of a successful outcome. This can reduce the timelines to submission and, with approaches such as interim analysis, enable a study to be stopped early if it appears to be failing or if it has already fulfilled the desired outcome.
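One reason interim analyses need to be planned with a statistician from the outset is that repeatedly testing accumulating data inflates the false-positive rate unless the stopping rules are prespecified and adjusted. The simulation below is purely an illustrative sketch of that effect under a true null effect, using invented sample sizes and the unadjusted 1.96 threshold; it is not a group sequential design tool.

```python
# Illustrative sketch: two unadjusted looks at accumulating data (one interim,
# one final) inflate the type I error above the nominal 5%.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_per_arm = 20_000, 200                     # simulated trials, no true effect
treat = rng.normal(size=(n_trials, n_per_arm))
control = rng.normal(size=(n_trials, n_per_arm))

# Interim look after half the patients, final look after all of them
z_interim = (treat[:, :100].mean(axis=1) - control[:, :100].mean(axis=1)) / np.sqrt(2 / 100)
z_final = (treat.mean(axis=1) - control.mean(axis=1)) / np.sqrt(2 / n_per_arm)

final_only = np.mean(np.abs(z_final) > 1.96)
two_looks = np.mean((np.abs(z_interim) > 1.96) | (np.abs(z_final) > 1.96))
print(f"Type I error, final look only:      {final_only:.3f}")  # close to 0.05
print(f"Type I error, two unadjusted looks: {two_looks:.3f}")   # inflated, roughly 0.08
```

Group sequential methods address this by widening the critical values at each look so that the overall error rate stays at the planned level, which is exactly the kind of design decision a consultant statistician helps to make before the study starts.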

Managing complexity with centralised data systems
The challenges around data remain consistent throughout all phases of a clinical trial. No matter the phase, patient data are first captured, either at a clinical site or through a device in the patient’s own home, and then uploaded to a database. The challenge for those running the trial is in managing the data, which tend to increase in complexity and volume as studies progress, and then in reporting them to regulatory bodies.

One of the simplest and most effective ways of managing this complexity is to centralise data systems. Collecting and storing clinical data in one place, or with one company, allows for a more efficient process. Integrating data handling activities across clinical data management, biostatistics and medical writing services allows for effective planning and communications across functions.

If companies use a centralised data system, all the clinical trial data should be collected and stored to the same standard and in the same format, making the compilation process at the end of a study more streamlined. The quality of data often improves because of this standardisation. It can also have a positive impact on the timeframes of a clinical trial, particularly when it comes to reporting to regulators.

Entrusting one team with collecting, monitoring and analysing data and using a centralised data system gives sponsor companies a higher level of data visibility without the need to transfer information between multiple parties. It also makes it easier to spot mistakes and allows for efficient editing from one place, making the whole process less complex and reducing the potential impact on quality.

Regulatory requirements
Ultimately, the aim of a clinical trial is to collect data that can be submitted to a regulatory authority.
The pharmaceutical industry’s ability to generate the revenue it needs to fund future R&D relies on the approval of product candidates, which in turn depends on the ability to prove that they work in clinical trials. This requires high quality data and an understanding of regulator demands.

Clinical trials are subject to stringent regulation from regulatory bodies such as the Medicines and Healthcare products Regulatory Agency (MHRA), the European Medicines Agency (EMA) and the Food and Drug Administration (FDA).

Many of these regulations concern how clinical data are described and reported throughout a trial. Failure to demonstrate in clinical study documentation how the relevant regulatory requirements have been complied with can seriously affect a compound’s progression through clinical trial phases to commercial production. For example, when preparing an integrated summary the medical writer must compile the results of multiple studies and draw conclusions based on the data, while following the regulatory guidelines. Data may be integrated to provide a larger database for regulatory review; considering this in the planning stages of clinical trials makes for a more robust submission.

Another area in which regulations play a key role is the data standards used across trials. The FDA prefers that data are provided according to the Study Data Tabulation Model (SDTM), in which the clinical data collected for analysis are mapped to SDTM domains. This can be especially challenging when mapping from numerous legacy data sources. SDTM is part of the suite of standards maintained by the Clinical Data Interchange Standards Consortium (CDISC), a non-profit standards development organisation.
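As a rough illustration of what such a mapping involves, the sketch below converts a few columns from a hypothetical demographics export into a handful of SDTM DM (Demographics) variables. The source column names and study identifier are invented for the example; a real mapping covers the full domain, follows CDISC controlled terminology and is documented in the submission metadata.

```python
# Hypothetical sketch: map a raw demographics export to a few SDTM DM variables.
import pandas as pd

raw = pd.DataFrame({
    "subject":    ["0001", "0002"],
    "site":       ["001", "001"],
    "birth_date": ["12/03/1964", "27/11/1971"],   # collected as DD/MM/YYYY
    "gender":     ["Male", "Female"],
})

STUDYID = "ABC-123"                                # invented study identifier
dm = pd.DataFrame({
    "STUDYID": STUDYID,
    "DOMAIN":  "DM",
    "USUBJID": STUDYID + "-" + raw["site"] + "-" + raw["subject"],  # unique subject ID
    "SUBJID":  raw["subject"],
    "SITEID":  raw["site"],
    # SDTM dates use the ISO 8601 format (YYYY-MM-DD)
    "BRTHDTC": pd.to_datetime(raw["birth_date"], format="%d/%m/%Y").dt.strftime("%Y-%m-%d"),
    "SEX":     raw["gender"].map({"Male": "M", "Female": "F"}),
})
print(dm)
```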

Understanding these nuances requires detailed market knowledge, as there are significant and sometimes fundamental differences between the requirements of the regulatory bodies.

Innovation in clinical data management

The growing complexity of trials is driving innovation in the development of patient-centric management and data collection technologies.

Technologies such as electronic patient reported outcomes (ePRO) are increasingly being used to allow remote participation in trials. Rare disease trials are good examples of where data from this technology are being used as supportive evidence in regulatory submissions. Ensuring that ePRO devices function in a way that is secure and robust is vital.

Similarly, technologies that integrate third-party data are being used to improve patient compliance with study protocols. An increasing number of regulatory agencies accept data from wearable devices, such as activity trackers, in clinical trial data submissions. While this integrated approach improves the quality of data and aims to reduce the time required for data analysis, from a management perspective it further increases complexity.
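A very small example of what such integration might look like in practice is sketched below: daily step counts from an activity tracker are merged onto dosing records by subject and date, so that a day with a reportedly missed dose and unusually low activity can be queried. The column names, identifiers and values are entirely hypothetical.

```python
# Hypothetical sketch: merge wearable activity data onto dosing records.
import pandas as pd

dosing = pd.DataFrame({
    "usubjid":    ["ABC-001", "ABC-001", "ABC-002"],
    "date":       pd.to_datetime(["2024-03-01", "2024-03-02", "2024-03-01"]),
    "dose_taken": [True, False, True],
})
steps = pd.DataFrame({
    "usubjid":     ["ABC-001", "ABC-001", "ABC-002"],
    "date":        pd.to_datetime(["2024-03-01", "2024-03-02", "2024-03-01"]),
    "daily_steps": [6421, 350, 9802],
})

merged = dosing.merge(steps, on=["usubjid", "date"], how="left")
# Low activity on a day a dose was reportedly missed may prompt a query to the site.
flagged = merged[(~merged["dose_taken"]) & (merged["daily_steps"] < 1000)]
print(flagged)
```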

Data analysis technologies are evolving at the same rate as data capture tools. Along with enhanced data integration, many tools offer analytics and visualisation technologies that aid interpretation. The ability to interpret data visually can give users the status of a study at a glance from a dashboard. Visualisation techniques, used extensively in the newly developing field of Data Quality Oversight, work in combination with traditional data management processes to improve the overall quality of the trial data.

By viewing data in real time before database lock via industry-recognised visualisation tools, a CRO can provide data quality oversight, running statistical algorithms to detect erroneous data and outliers at clinical trial sites. A data-focused CRO can then bring any problematic sites to the attention of the sponsor or partner CRO, who can in turn send monitors to those sites. Sponsors are therefore able to improve site performance and detect data issues early, avoiding unwelcome surprises on submission to regulatory bodies.
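To give a flavour of the kind of review plot involved, the sketch below draws one box per site for a single laboratory parameter on simulated data, so that an aberrant site stands out visually; it complements the site-flagging sketch earlier. The parameter, site names and values are invented, and a real oversight dashboard would offer far richer interactivity.

```python
# Minimal sketch: one box per site for a single lab parameter, simulated data.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "site_id": np.repeat([f"Site {i:02d}" for i in range(1, 11)], 40),
    "alt_u_l": rng.normal(28, 8, 400).clip(min=5),    # simulated ALT (U/L)
})
df.loc[df["site_id"] == "Site 04", "alt_u_l"] *= 1.8  # make one site aberrant

sites = sorted(df["site_id"].unique())
groups = [df.loc[df["site_id"] == s, "alt_u_l"] for s in sites]
plt.boxplot(groups)
plt.xticks(ticks=range(1, len(sites) + 1), labels=sites, rotation=45)
plt.ylabel("ALT (U/L)")
plt.title("ALT by site (simulated data)")
plt.tight_layout()
plt.show()
```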

Final thoughts
Developing approaches for clinical data management that are inherently efficient and mindful of regulatory requirements is key to a successful trial. Integrating data handling activities across services allows for effective planning and communication across functions, which leads to leaner processes and, ultimately, better data for regulatory submission.

With the increase in connected devices and technological advancements, the ability to analyse large amounts of data is increasingly important. Visualising study data during study conduct improves data integrity, increases the chances of a successful clinical trial and removes surprises at submission.

Statistical analysis is at the heart of the processing of all clinical data. Even with improved data innovations, efficiencies, technology and standardisation, we should not forget the power and importance of the role a consultant statistician can play in pharmaceutical research. They can direct and guide the decision-making process to demonstrate an Investigational New Drug (IND)’s hypothesis and prevent costly wrong turns being taken during the research programme. Ultimately, a statistical consultant will help sponsors to make the right decisions, improve timelines and correctly interpret the clinical results.

These core principles, in which a niche, data-focused CRO is well versed, improve data quality and support sponsors with their submissions as they look to develop their new drug or device.

 

Author: Karen Ooms


Karen Ooms is responsible for overseeing the Strategic Delivery Business Unit (SDBU). This includes the management of the clinical data management, biostatistics, programming, medical writing and pharmacovigilance departments. Karen is a Chartered Fellow of the Royal Statistical Society and has a background in biostatistics spanning over 25 years. Prior to joining Quanticate (then Statwood) in 1999, Karen was a senior statistician at Unilever.

Company: Quanticate


Quanticate is a global data focused clinical research organisation (CRO) primarily focused on the management, analysis and reporting of data from clinical trials and post-marketing surveillance. Quanticate provides high quality teams that offer efficient outsourcing solutions for clinical data management, biostatistics, SAS programming, data quality oversight, medical writing and pharmacovigilance. Quanticate can offer study level support, functional service provision (FSP), strategic full data-services solutions or technical consultancy to meet the needs of pharmaceutical, biotechnology and device companies across the globe. Visit www.quanticate.com for further information.
