Doing It in Real-time and Reaping the Rewards!
Posted by: Joe Martinez, RPh, PDE, PPC
Clinical trials are more complex, more involved, and increasingly global. They involve people: patients, stakeholders, researchers, administrators, and regulators, to name just a few. Now add training, monitoring, and oversight requirements to deliver quality data on a short time-frame, and you have a recipe for disaster! What’s a poor boy (or girl) to do?
Proactive, real-time data checks are the answer; keep reading. In today’s environment, staying within budget and defined trial timelines requires investigators to identify and address challenges and issues as proactively as possible.
How? By employing a strategically important system of edits and algorithms that run alongside your data input to provide real-time alerts.
Easier said than done? Maybe. But it’s certainly possible by following some simple guidelines:

1. Allow real-time access to system and systematic queries on data input, rates, and outliers to ensure accuracy.
2. Identify the critical data and the outlier ranges, rates, and trends specific to your therapeutic area or disease model.
3. Implement a process, and the training to support it, to research and validate these alerts as close to the input event as possible.
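To make guidelines 1 and 2 concrete, here is a minimal sketch of what a real-time edit check can look like: each incoming value is tested against a predefined valid range and a rolling statistical outlier test. The field name, range, and threshold below are illustrative assumptions, not taken from any particular EDC system.

```python
# Minimal sketch of a real-time data check: validate each incoming value
# against a predefined range and a rolling z-score outlier test.
from statistics import mean, stdev

# Hypothetical per-field valid ranges (guideline 2: identify critical data)
RANGES = {"systolic_bp": (70, 200)}

def check_value(field, value, history, z_threshold=3.0):
    """Return a list of alert strings for one incoming data point."""
    alerts = []
    lo, hi = RANGES.get(field, (float("-inf"), float("inf")))
    if not lo <= value <= hi:
        alerts.append(f"{field}: {value} outside valid range [{lo}, {hi}]")
    # Rolling outlier check against prior entries for the same field
    if len(history) >= 5:
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(value - mu) / sigma > z_threshold:
            alerts.append(f"{field}: {value} is a statistical outlier")
    return alerts

history = [120, 118, 122, 119, 121]
print(check_value("systolic_bp", 250, history))  # fires both alerts
print(check_value("systolic_bp", 120, history))  # prints []
```

Running checks like this at the moment of entry, rather than at a later monitoring visit, is what lets a site research and validate the alert while the source record is still at hand (guideline 3).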
Clinical trials are becoming more complex, with a much higher risk of human error during data entry than previously experienced. In addition, clinical trials are more competitive now than in days past, and this competition extends to sponsors, CROs, sites, and even potential patients. Current market trends and regulations provide even more motivation for real-time data quality checks.
In short, trials are growing more complex, with higher levels of error and risk. The current regulatory burden, tight timelines, and the cost of bringing drugs to market all make a compelling argument for implementing real-time, systematic data checks now. Many tools for proactive data quality verification are available today, with more in development.
Tools such as risk-based monitoring strategies, input data algorithms, cross-checking, and source data review and validation can provide the much-needed relief and functionality to help sponsor companies and investigators build a streamlined process for high-quality clinical data that comes in under budget and on time.
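The cross-checking mentioned above can be sketched as a consistency test across fields within one subject record. The record layout and field names below are hypothetical examples, assumed for illustration only.

```python
# Sketch of a cross-field consistency check on a single subject record.
# Field names (enrollment_date, visit_date, sex, pregnancy_test) are
# hypothetical, not from any specific case report form.
from datetime import date

def cross_check(record):
    """Flag internal inconsistencies in one subject record."""
    issues = []
    if record["visit_date"] < record["enrollment_date"]:
        issues.append("visit_date precedes enrollment_date")
    if record["sex"] == "M" and record.get("pregnancy_test") is not None:
        issues.append("pregnancy_test recorded for male subject")
    return issues

rec = {"enrollment_date": date(2024, 3, 1),
       "visit_date": date(2024, 2, 15),
       "sex": "M",
       "pregnancy_test": "negative"}
print(cross_check(rec))  # flags both inconsistencies
```

In practice these rules would be defined per protocol and run automatically as records arrive, so a query can be raised while the data is fresh rather than weeks later during source data review.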
Coming up next in our blog series: Now that you have a data cloud, how do you prevent it from becoming a data swamp, with your trial data swallowed by the crocodile of largess…