“Clickbusters” can bust more than clicks

Craig Joseph

I was fascinated by a recent article in the Journal of the American Medical Informatics Association (JAMIA) titled “Clinician collaboration to improve clinical decision support: the Clickbusters initiative.” The paper describes how a group of clinicians, informaticians, and researchers from Vanderbilt University Medical Center (VUMC) designed and executed a program to “improve safety and quality and reduce burnout through the optimization of clinical decision support (CDS) alerts.” I’ll cut to the chase: they reduced alerts that weren’t effective, which in turn reduced wasted clicks. But equally exciting (at least to me!) was that in the process of improving their CDS, they created a cadre of clinicians who were invested in improving how their colleagues use the electronic health record (EHR). This “side effect,” a team of doctors and nurses who understand CDS and the EHR, will be an essential asset for clinical practice efficiency and improvement for a long time.

When EHRs were initially being developed and implemented, CDS was a key part of the “sales process” to physicians. One can imagine the argument (or rather, the sales strategy): the EHR will help doctors and nurses do the right thing at the right time by reminding them of things they might not remember to do. The corollary was also true: CDS should be able to help clinicians stop inadvertently doing something that might harm the patient. Fast forward to the present, and we find that CDS isn’t as miraculous or helpful as we had hoped. As the authors point out, “[a]lert nonadherence, or clinician overrides, is one key barrier that occurs for 49–96% of alerts. Overrides may result from an excess of alerts that are repeated or deemed irrelevant (i.e., a high alert burden), causing alert fatigue and reducing the efficacy of CDS.”

Clinical decision support is not synonymous with alerts; alerts are just one type of CDS. A good example of an alert is a pop-up box informing a physician that they are about to order a medication to which the patient is allergic. But clinical decision support can be so much more. Consider functionality that replaces the default antibiotic in an order set with an alternative based on the patient’s allergies (sketched below). Or CDS that automatically notifies the social work team that a patient might benefit from enrolling in a benefits program, based on questions the nurse is answering. No alerts here, but clearly, this is clinical decision support.
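To make that first example concrete, here is a minimal, hypothetical sketch of non-interruptive CDS in Python. Everything in it, the data structures, the function, and the allergy-to-drug substitutions, is my own illustration under assumed mappings, not any vendor’s actual build and certainly not clinical guidance.

```python
from dataclasses import dataclass, field

@dataclass
class Patient:
    allergies: set[str] = field(default_factory=set)

# Hypothetical order-set default, plus assumed substitutions by allergy class.
ORDER_SET_DEFAULT = "cefazolin"
ALTERNATIVES = {"cephalosporins": "vancomycin", "penicillins": "clindamycin"}
CROSS_REACTIVE = {"cefazolin": {"cephalosporins", "penicillins"}}

def default_antibiotic(patient: Patient) -> str:
    """Return the order-set default unless the patient's allergy list
    conflicts with it; if so, silently substitute an alternative."""
    conflicts = CROSS_REACTIVE.get(ORDER_SET_DEFAULT, set()) & patient.allergies
    if not conflicts:
        return ORDER_SET_DEFAULT
    # Use the alternative mapped to the first conflicting allergy class.
    return ALTERNATIVES[sorted(conflicts)[0]]

print(default_antibiotic(Patient(allergies={"penicillins"})))  # -> clindamycin
```

The point is the design choice: the clinician never sees a pop-up; the “decision support” happens before the order is even presented.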

Now, what can be done about those pesky alerts that are overridden most of the time? The Vanderbilt team leveraged the Plan-Do-Study-Act (PDSA) quality improvement model for their Clickbusters program. They created a ten-step process that began with reviewing each alert’s logic, triggering frequency, and acceptance data. They then tried to look at the alert through the eyes of a typical user, reviewed the clinical evidence supporting the need for the alert, and looked for ways to improve it. If it could be improved, they discussed possible changes with key stakeholders, tested those changes in a build environment, moved them into production, and finally evaluated and shared the outcomes.
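That first step, reviewing firing frequency and acceptance data, is the kind of triage that can be roughed out with a simple pass over an alert log. The sketch below is purely illustrative; the log format, field names, and sort order are my assumptions, not the paper’s actual methodology.

```python
from collections import defaultdict

# Each record: (alert_id, action), where action is "accepted" or "overridden".
alert_log = [
    ("drug-allergy-001", "overridden"),
    ("drug-allergy-001", "overridden"),
    ("drug-allergy-001", "accepted"),
    ("renal-dosing-017", "overridden"),
]

counts = defaultdict(lambda: {"fired": 0, "overridden": 0})
for alert_id, action in alert_log:
    counts[alert_id]["fired"] += 1
    counts[alert_id]["overridden"] += action == "overridden"

# List the highest-burden candidates first: mostly overridden, frequently fired.
for alert_id, c in sorted(
    counts.items(),
    key=lambda kv: (kv[1]["overridden"] / kv[1]["fired"], kv[1]["fired"]),
    reverse=True,
):
    rate = c["overridden"] / c["fired"]
    print(f"{alert_id}: fired {c['fired']} times, {rate:.0%} overridden")
```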

So, what happened? “After [two] rounds, the Clickbusters program resulted in detailed, comprehensive reviews of 84 CDS alerts, which comprised 20% of the rule-based alerts implemented at VUMC, and reduced the number of weekly clicks by more than 70,000 (15.43%). While modest, these results occurred in a short period and involved motivated users that had not previously participated in CDS review.” I’m not a big believer that mouse clicks in and of themselves are bad; I believe that all unnecessary mouse clicks are bad! Still, reducing clicks by 15% is most definitely a good thing. 
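As a back-of-the-envelope check (my arithmetic, not the paper’s), those two figures together imply a baseline of roughly 454,000 alert-related clicks per week:

```python
# Implied weekly baseline if >70,000 removed clicks represent a 15.43% reduction.
clicks_removed = 70_000
reduction = 0.1543
print(round(clicks_removed / reduction))  # about 454,000 clicks per week
```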

But wait, there’s more. The researchers noted that “Clickbusters allowed [them] to bring in new participants who were highly invested to help with the review process both as end users receiving the alerts and Clickbusters participants motivated through the gamification in the program.” Did someone say “gamification”? Indeed they did! These folks incentivized their clinical and operational volunteers by awarding them points as they worked through the decluttering process.

Here’s how it worked: “Participants received up to 15 points for each alert improved through Clickbusting: 4 points for analysis, 2 points for design, 4 points for build, and 5 points for evaluation. This score was then multiplied by the sum of the burden and complexity [scores] determined during the alert identification process to determine the final number of points.” The more thorough the work a participant did, the more points they earned. And if they attacked a particularly knotty problem, as judged by how complicated the alert build was, they earned even more points. The top three point earners got gift cards, but much more important to me (and perhaps to them), they got a custom-designed “golden mouse” trophy. To be honest, I’d say keep the gift cards and give me more custom-designed trophies, thank you very much!
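Because the formula is spelled out, it is easy to sketch. The stage values and the burden-plus-complexity multiplier come from the quote above, but the function name, the bookkeeping, and the example ratings are mine.

```python
STAGE_POINTS = {"analysis": 4, "design": 2, "build": 4, "evaluation": 5}

def clickbuster_points(stages_completed: set, burden: int, complexity: int) -> int:
    """Base points for completed stages (max 15), multiplied by the alert's
    burden-plus-complexity weighting from the identification step."""
    base = sum(STAGE_POINTS[stage] for stage in stages_completed)
    return base * (burden + complexity)

# Hypothetical example: all four stages on an alert rated burden 3, complexity 2.
print(clickbuster_points({"analysis", "design", "build", "evaluation"}, 3, 2))  # 75
```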

While I remain impressed by the improved clinical decision support tools that VUMC gained from this intervention, I am perhaps even more impressed by the group of involved, informed, and dedicated clinicians that didn’t exist before. Many of these folks got their first taste of clinical informatics and behind-the-scenes EHR build and configuration via the Clickbusters program. I would posit that they will all continue to contribute to the cause of better technology for many years to come, and that might be the bigger win.

