goBalto Chromosome

trends & perspectives in clinical research

Myopic Focus on Dashboards May Blur Insights

by Craig Morgan
07Aug2017

There is intense pressure to speed clinical trials and restrain costs. Under this dual burden, clinical project managers are expected to make smarter decisions, faster, based on intelligence derived from clinical trial data. Meanwhile, sponsors and contract research organizations (CROs) are looking for ways to incorporate business intelligence (BI) into the eClinical systems they use to empower oversight, turning raw trial data into actionable information.


Of paramount importance to project managers in this endeavor are analytical dashboards. However, like the data they display, dashboards alone do not provide value; it's up to experienced clinicians to distill insights.

Massive volumes of data are generated during clinical trials, yet they do little to help stakeholders spot the risk factors and bottlenecks that can disrupt cycle times and budgets. This is due to the inefficient ways in which operational data are captured and analyzed, often relying on outdated methods such as paper, shared file drives, and Excel, which lack much-needed project- and risk-management functionality.

These shortcomings are particularly acute during study startup (SSU), a phase that is widely regarded as complicated, slow, and in need of better operational tools, yet is pivotal to successful clinical trial operations. The Tufts Center for the Study of Drug Development (CSDD) has reported that SSU is a major cause of long cycle times, which have stagnated for two decades, with eight months being the average timeframe for moving from pre-visit to site initiation. Eager to improve SSU operations, stakeholders are embracing solutions with automated workflows that guide team members through the many steps involved and provide alerts for tasks needing attention. These tools are purpose-built, reflect a deep understanding of SSU, and enable users to track regulatory compliance at the country and site levels. They also allow users to develop performance metrics, which form the basis for predictive analytics and are critical to building a dynamic atmosphere of continuous improvement.

Purpose-built SSU systems that leverage advances in data analytics and visualization have become an integral component of decision-support systems, aiding compliance and giving clinical research teams an opportunity to intervene before the effects of a risk are incurred. Risk mitigation therefore works best with systems that provide timely, preferably real-time, data on trial bottlenecks, flagging deviations to be reviewed and addressed, or at least tracked carefully throughout the trial.

Analytical dashboards are a logical evolution from earlier practices of using control charts, scorecards, metrics-based snapshots, and other measurement techniques to gauge progress. The advantages of dashboards for monitoring the progression of a trial and compliance with budget and timeline goals cannot be overstated; they support important communication and planning activities among all stakeholders. Well-crafted dashboards guide tactical and reactive responses, empowering trial management, but they do not make those decisions.

As suggested by CSDD research, it is critical to track the sub-steps involved in SSU tasks, and who is responsible for each. Without this information, only the overall time for a task is known, which cannot reveal where bottlenecks may be occurring.

Contract execution provides a good example of a process with many sub-steps (Figure 1), each of which must be tracked to avoid bottlenecks. For example, All Contracts Executed, normally available in an electronic trial master file (eTMF) as a summary of artifacts, does not provide any metrics on the sub-steps that precede it. Without those metrics, stakeholders would be unaware of any issues until the planned date for All Contracts Executed was reached.

Figure 1: Contract operational metrics
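As a rough illustration of why sub-step tracking matters, the sketch below computes per-sub-step cycle times from hypothetical contract milestone dates. The sub-step names and dates are illustrative assumptions, not drawn from any particular eTMF or SSU system.

```python
from datetime import date

# Hypothetical timestamps for one site's contracting process; the
# sub-step names are illustrative, not from any specific SSU system.
milestones = {
    "contract_template_sent": date(2017, 1, 9),
    "site_redlines_received": date(2017, 2, 20),
    "legal_review_complete":  date(2017, 3, 6),
    "budget_agreed":          date(2017, 4, 24),
    "contract_executed":      date(2017, 5, 1),
}

# Walk the sub-steps in order and compute each interval in days.
steps = list(milestones.items())
for (prev_name, prev_date), (name, when) in zip(steps, steps[1:]):
    print(f"{prev_name} -> {name}: {(when - prev_date).days} days")

# The overall cycle time alone (112 days here) hides the fact that
# budget negotiation consumed the largest share of the timeline.
total = (steps[-1][1] - steps[0][1]).days
print(f"total: {total} days")
```

Only with the per-interval numbers does it become visible that, in this made-up example, the budget negotiation sub-step, not legal review, is the bottleneck.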

Blurring insights

When utilizing operational metrics, dashboards capture and display metrics associated with an increasingly smaller slice of time or artifact activity, which is critical when reviewing processes for optimization and for giving an organization a competitive edge. The risk, however, is that detailed and elaborate dashboards may lead to suboptimal decisions due to information overload.

The smaller slices of information that populate dashboards can also make them more susceptible to outliers. And while it is critical to investigate the causes of outliers, meaning values outside a confidence range based on an internal or industry benchmark, focusing on spikes can make managers reactive.
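To make the idea of a confidence range concrete, here is a minimal sketch that flags cycle times falling outside a range derived from a benchmark mean and standard deviation. The benchmark figures, site data, and the two-sigma width are illustrative assumptions, not industry standards.

```python
# Illustrative benchmark: mean and standard deviation of a cycle time,
# in days. These numbers and the 2-sigma range are assumptions.
BENCHMARK_MEAN_DAYS = 35.0
BENCHMARK_SD_DAYS = 8.0
SIGMAS = 2.0

lower = BENCHMARK_MEAN_DAYS - SIGMAS * BENCHMARK_SD_DAYS
upper = BENCHMARK_MEAN_DAYS + SIGMAS * BENCHMARK_SD_DAYS

observed = {"Site 101": 28, "Site 102": 62, "Site 103": 40, "Site 104": 9}

for site, days in observed.items():
    if not lower <= days <= upper:
        # An outlier is a prompt to investigate root causes,
        # not a verdict on the site.
        print(f"{site}: {days} days is outside [{lower:.0f}, {upper:.0f}]")
```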

Before hunting for insights, project managers must have a clear idea of what is actionable. Companies that travel down the rabbit hole of data exploration will lose time and waste employee energy unless they can set parameters on what they want to achieve.

Jeff Kasher
President, Patients Can't Wait

"Turning big data into big insights requires analytics that are actionable. Performance metrics must be data-driven, standardized across studies, indications, and therapeutic areas, and timely. They must also, importantly, facilitate a forum for discussion."

Jeff was formerly VP, Clinical Innovation and Implementation at Eli Lilly and Company

At the outset of a trial, it is critical to carefully delineate the role each dashboard should play, and for which groups, to avoid the "more is better" trap. Senior management dashboards will look quite different from operational dashboards. Likewise, the metrics to be used in status reports should be distinguished from those used merely for monitoring purposes. A useful dashboard can help set goals, monitor performance, provide implementation metrics, and offer strategic insights. Project managers must also decide on the frequency and granularity of the data used in dashboards.

Dashboard metrics need to be comparable to an internal benchmark or, preferably, an industry benchmark; otherwise they can mask strategic trends. Without a control group, no definitive conclusions can be drawn.

Dashboards are a useful tool for shedding light on process bottlenecks and offering insights into complex, multi-site, global trials, but the tool shouldn't be the primary focus. In the rush to seek out quantifiable information, dashboards can push clinical teams to focus on analytics rather than on the quality of the data, or the quality of care. What degree of confidence do you have that the data have been entered correctly? What auditing is in place, not just for regulatory data that is subject to audits, but also for performance data? After all, if a competitive advantage is built on poor-quality data, it may be more perception than reality. Is a metric increasing or decreasing, and by how much?

The "what" pertains to analytics, while the "why" and "how" provide insight.

A clear understanding of the "why" helps clinical teams focus on the processes that need optimization and on timely, ongoing monitoring to improve future outcomes. A focus on the "how," by design, is forward-looking and requires attention to detail to understand the drivers of the desired outcome, along with an understanding of the process. Organizations that can supplement their dashboards with a process for answering the "how" create an enduring and unique capability. Dashboards should be leveraged as an engagement mechanism within clinical teams and across clinical functions, not merely as a reporting and monitoring tool. For example, negative cycle times, an obvious error, should lead to process discovery and improvement discussions, not simply be dismissed from reported results.
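As a minimal illustration of that last point, the sketch below routes impossible values, such as negative cycle times, into a review queue for discussion rather than silently filtering them out of reports. The record fields are hypothetical.

```python
# Sketch: separate impossible values (negative cycle times) into a
# review queue instead of silently excluding them from dashboards.
# The record fields below are hypothetical.
records = [
    {"site": "Site 201", "task": "IRB approval", "cycle_days": 34},
    {"site": "Site 202", "task": "IRB approval", "cycle_days": -5},
]

reportable, review_queue = [], []
for rec in records:
    (reportable if rec["cycle_days"] >= 0 else review_queue).append(rec)

for rec in review_queue:
    # A negative cycle time usually means dates were entered out of
    # order; it should trigger a process discussion, not deletion.
    print(f"Review: {rec['site']} {rec['task']} = {rec['cycle_days']} days")
```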

W. Edwards Deming, Joseph M. Juran, and Walter A. Shewhart introduced dashboards, in a rudimentary form, within the context of total quality management (TQM). The genius of TQM was not in visual analytics, but rather using visual analytics to generate insights. TQM did that by embedding the analytics in discussion forums such as quality circles.

Discussion forums encouraged employees at all levels to review and elaborate upon the information presented by dashboards. The insights generated helped address decision making, enhance manufacturing processes, frame the goal-setting process and develop incentives.

Frequently, it was not the metrics, but the discussions facilitated by the metrics, that led to quality improvement suggestions by employees. Dashboard analytics are important, but insights based on a discussion of the analytics are paramount.

Dashboards are not an end

Dashboards must present metrics that are timely, tailored to specific stakeholders, relevant to the task at hand (e.g., optimization, quality, compliance, risk management), and comparable to benchmarks, so that insights are not obscured by data overload or omission.

Dashboards are not an end, but a vital means of facilitating discussion. Dashboards replete with numbers will not, by themselves, provide insights. They are a critical tool that clinical project managers can use to facilitate discussions, generate ideas, and motivate employees. Used correctly, they bridge the journey from analytics (the "what") to insights (the "why" and "how").

Big Data and automated processes are giving hope that actionable insights will automatically appear, as if by magic. However, finding actionable insights is driven as much by smart humans as it is by actual data.

Article published in Bioscience Technology, March 2017
