Nothing is more important than placing the people who will use a technology at the center of its design. While new advancements in AI and predictive modeling are exciting, they won’t truly work unless the people who interact with them are involved in the building process. Incorporating physician feedback into the tools they use every day is critical to continued success. Customization of an EHR can help improve not only the physician experience, but the entire healthcare ecosystem, and most importantly, patient outcomes.
In this episode of the In Network podcast feature Designing for Health, Dr. Allison McCoy, Assistant Professor of Biomedical Informatics at the Vanderbilt University School of Medicine, sits down with Nordic Head of Thought Leadership Dr. Jerome Pagani and Chief Medical Officer Dr. Craig Joseph to discuss her research and her work as director of the Vanderbilt Clinical Informatics Core. They also talk about creating customized workflows for physicians, the value of real-world observations, and what it means to be a “clickbuster.”
In Network's Designing for Health podcast feature is available on all major podcasting platforms, including Apple Podcasts, Amazon Music, Google, iHeart, Pandora, Spotify, Stitcher, and more. Search for 'In Network' and subscribe for updates on future episodes. Like what you hear? Make sure to leave a 5-star rating and write a review to help others find the podcast.
[01:19] Dr. Allison McCoy’s background
[03:26] The Clickbusters initiative
[08:24] Gamifying clinical decision support
[10:27] Customizing the EHR
[11:15] Gauging the usability of workflow designs without being an expert
[12:33] Directing the Clinical Informatics Core
[14:09] Taking clinical informatics students on field trips
[15:59] Translating research into patient and clinician outcomes
[17:28] Utilizing physician feedback
[19:44] How the larger hospital ecosystem affects product success
[21:53] Dr. McCoy’s recent research
[22:30] Generative AI in clinical decision support
[25:27] Generative AI in clinical diagnosis
[26:55] Things so well-designed, they bring Dr. McCoy joy
Dr. Craig Joseph: Dr. Allison McCoy, welcome to the podcast. How are you?
Dr. Allison McCoy: I am great, thank you for having me.
Dr. Jerome Pagani: It's so nice to have another real™ doctor on the podcast.
Dr. Allison McCoy: The PhDs are the real ones.
Dr. Craig Joseph: And as we continue on with the actual part of the podcast where people are interested in learning things, I am interested in learning about you. So I understand that you went to college with the idea of becoming what I'm going to term a real doctor, a physician, a medical doctor, and then you saw the error of your ways. How did that happen?
Dr. Allison McCoy: This is true. Yeah. So when I went to college, I thought I might grow up to be a physician. My dad is a physician. My mom was a nurse. And then I took science classes and really did not enjoy it. So on a whim, I took some computer science classes and decided that was so much more fun and something I was really good at. So I switched my major to computer science, but still wanted to find a way to stay in healthcare because that's sort of how I had planned my whole life and sort of found out about informatics through a summer internship that I did. And that was it.
Dr. Craig Joseph: Excellent. And then so you, you were an undergrad. You were on football scholarship. Do I have that correct?
Dr. Allison McCoy: No. Academic scholarship. I went to, I went to Baylor and then my undergrad advisor sort of asked me if I was going to go to grad school after that and study Informatics. And I didn't really know that was a thing, but it sounded better than getting a real job at the time. So I applied for a masters and then was informed that if I got a PhD, it would be funded. So that also seemed like a good idea. So I went to Vanderbilt and got my PhD in biomedical informatics.
Dr. Craig Joseph: Awesome. And you remain at that venerable institution in Nashville. Is that true?
Dr. Allison McCoy: I do. I went to a couple of other places after I graduated. My husband didn't see the light and he is a physician. So I followed him around for a little bit. But then once we got to choose where we wanted to live and he was done with all of his medical training, we came back to Vanderbilt because I think it's the best place to do clinical informatics maybe in the world.
Dr. Craig Joseph: Wow. And that's a good lead into how I discovered your work. You were the lead author of the most famous article ever in JAMIA, and I define most famous as I just liked it, and it was called Clickbusters. And I just wanted to know, like, why do you want to bust clicks so badly? Like, what have clicks done to you that you want to bust them with such vigor?
Dr. Allison McCoy: So I have not been a victim of too many clicks personally, but I sort of started my career building clinical decision support alerts to help clinicians make better decisions when they're providing care. And it turns out that while a lot of these alerts have good intentions, some of them are not as well designed as they could be, and through some improvements we could fire them less and waste less of clinicians' time. But finding those alerts that could be improved and finding ways to improve them takes a lot of effort. And so we came up with the idea for the Clickbusters program, where clinicians and others at Vanderbilt could sort of adopt an alert, sort of like you adopt a highway. And we came up with a process for reviewing the alerts and coming up with improvements and then putting those improvements into our production system. And then we evaluated it, and we had really good success in improving several of our alerts and reducing the clicks. And then, as we've talked about before, we actually have a set of users now who are really excited and partnering with us and helping us improve our EHR in general.
Dr. Craig Joseph: That's great. I love the idea of adopting an alert like you adopt a mile on the highway.
Dr. Allison McCoy: You do, yes.
Dr. Jerome Pagani: Do you get a little sign in the hospital with your name and the alert on it?
Dr. Allison McCoy: You should. You know, we did give framed certificates to all of our participants. And for the individuals who busted the most clicks, we got golden mouse trophies. So it's like a real physical trophy that looks like an old-school computer mouse. It's pretty nice.
Dr. Craig Joseph: I love that. And I think it might be an interesting idea to put a name in the EHR to an alert somehow so that, so that people could see like this isn't just, this wasn't generated by ChatGPT, like this was, this alert was created by a human, most likely a clinician had some input or an informaticist or both. And that's a, that's an interesting concept.
Dr. Allison McCoy: So we found throughout the process that a lot of our users don't realize that most of our alerts were developed, or at least improved a little bit, here at Vanderbilt after we implemented our EHR. You know, most people think that Epic sent us these alerts and that we have no control over them. But it turns out we really can make a lot of changes. So that was something that a lot of our users learned throughout the process as well.
Dr. Jerome Pagani: This actually seems to be a great idea, both on the transparency side and in terms of harnessing the competitive nature, let's say, of most physicians.
Dr. Craig Joseph: Oh, it's, it's true. And I think it's, it's, you know, we talk about user-centered design, usability often. I could tell you that if I can associate an alert with a human, especially a human that I know, I'm going to be much more inclined to give this alert a look-see and not eliminate it immediately. Much like when I was a practicing physician, I would get calls at night and if I knew that patient or that parent, I'm a pediatrician. So I deal with parents mostly calling me. They had a different experience. I kind of had a different idea about them because I, they weren't this nameless, faceless being who was interrupting my sleep at 2:30 in the morning. And so I really do think that that's kind of one way of making things a little bit better.
Dr. Jerome Pagani: I was thinking of the highway sign idea more for the clickbusters for those who had, who had busted the most clicks. But I like your idea of personalizing the alert itself and giving it a little bit of personality there.
Dr. Allison McCoy: Yeah. Another thing that we do often with our alerts is we have feedback links where individuals who receive the alerts can click thumbs up or thumbs down if they liked it. And a couple of things with that, we found that, we actually respond to all of those feedback messages that we get, and some of the users are really surprised that a real person reads those and responds. But we've also talked about when those feedback messages come in that that feedback should go to the individual who requested the alert. And if they're not willing to get that feedback, then maybe they should reconsider the alert itself.
Dr. Craig Joseph: That's great. Kind of again, giving a personal responsibility to this, because as you mentioned, you're changing my mind. I didn't realize that clicks were, were so bad, but it sounds like unnecessary clicks are really bad.
Dr. Allison McCoy: Yes.
Dr. Craig Joseph: So that, that, that makes sense. Talk, can you talk a little bit about how you gamified the process? So certainly we just learned that you got a plaque or a framed certificate if you busted a lot of clicks. But I think you did something else. And it seems like right from the beginning. So I'm interested in hearing, like, was this a, did you, before you started on this process of click-busting, did you know that you were kind of going to gamify the situation or did this develop in the middle of it?
Dr. Allison McCoy: We did. So we took all of the alerts that we have in our system and assigned a score to each of them, and that was based on things like how often it fires to clinicians. How often were those firings interruptive? How many of those firings were overridden? Because the idea is that interruptive alerts that are overridden more often are more annoying and should be busted more. We also looked at how complex the alerts were, so if they required a lot of different logic statements before they fired, that would take more effort to review than something that just had maybe one or two rules behind it. And so we took all of those and assigned a score. And then we also assigned scores for how far in the process the participants got. So did they just evaluate the alert? Did they come up with an approach to improving the alert? Did they actually implement those improvements into production? And then did they go in and evaluate how much those changes affected the number of clicks? Each of those had a score. So we sort of multiplied the number of steps times the score of the alert, and that was how we got their final clickbuster score. Very, very complicated Excel formulas.
Dr. Jerome Pagani: Allison, you talked a little before about how surprised folks were that you have a job where people can go in and modify the EHR. And you actually spend most of your time in the EHR trying to figure out ways to provide better care for the patient and make life easier for the clinician. So isn't that what the EHR vendor is supposed to do?
Dr. Allison McCoy: The vendor does a lot of that, but we have found, as many others in our field have found, every clinician does things differently and every hospital does things differently. And so it's important for someone at the institution to understand those individual workflows or idiosyncrasies or, you know, standard practices to customize it to each individual or setting as necessary.
Dr. Jerome Pagani: So I think we've heard you say that you don't consider yourself an expert in human-computer interaction. So how do you gauge the usability of a design of workflows and various options within the EHR or other IT?
Dr. Allison McCoy: So we do a few things when we get requests to build new things or evaluate interventions in the EHR. And one of the most important things we have found is to go watch and see how clinicians are actually using the EHR, especially when they're in their own settings. If they're in the ICU, it's busy. There are patients coding, and their priority is the patient, not necessarily using the EHR. And it's important to consider those things. We talk to people who are not just requesting something be built in the EHR but the individual who's going to actually be using it. We have a lot of cases where someone comes to us requesting an alert for someone else, but we have to make sure that that someone else actually needs and would benefit from that alert, not just the individual who's requesting it. And then we do have individuals at our institution who are experts in user-centered design and human factors. And so we partner with them a lot as well when necessary.
Dr. Craig Joseph: Dr. McCoy, you have something called the Clinical Informatics Core, and I'm interested to hear what that is. Is it similar to the Marine Corps, and if it's not, how so?
Dr. Allison McCoy: I don't think it's anything close to the Marine Corps, unfortunately. But we have found that there are a lot of requests that researchers at our institution have, either to get data out of our EHR or to build interventions in the EHR. And our health IT operations team is really good at these, but they also have a lot of day-to-day operational requests that they need to fulfill. And so we have a team of analysts who work with our researchers to fulfill those requests and meet their research needs. So any request that needs data out of the EHR or a build into the EHR, our team can help them with that.
Dr. Craig Joseph: So how do you identify, you know, how is that different, I guess, from the way it was before?
Dr. Allison McCoy: Before, the researchers would have to put in a request to the operations team, and those often, sort of, got put on the back burner. As you know, if we were upgrading our EHR, there was a lot of work that needed to be done to build or fix things for that, and it wasn't a high priority to build this highly innovative intervention. And so we prioritize those. If it's a research request, that's what we do. We don't have to get focused on the operational needs.
Dr. Jerome Pagani: You sometimes take clinical informatics students out on field trips. You go to the local zoo, the fire station, or like your local data processing center? What, what are you doing there?
Dr. Allison McCoy: We go to a few different places. So Adam Wright and I teach a clinical informatics course at Vanderbilt for graduate students primarily. And we go visit settings where people are using the EHR. So we go to an inpatient unit, we go to an outpatient clinic, we go to the pharmacy or the lab, and we even go to the data center to see where all of our data are housed.
Dr. Craig Joseph: Why?
Dr. Allison McCoy: Well, like we talked about earlier, we don't understand how effective an intervention can be if we don't know how clinicians are using it in the real world or if we're using data that came out of the EHR. It's important that we and all of the students and researchers understand how that data got into the EHR.
Dr. Craig Joseph: Are you saying that there are different ways to do the same thing and sometimes that changes the way the analytics or reports are run or the information that they convey?
Dr. Allison McCoy: Yes, I am. It turns out that, you know, we have notes, for example, and we have a lot of students and faculty who do research on natural language processing on these notes. Some of those notes get typed in by hand using the keyboard. Some of them get dictated, some of them are generated using macros or stock phrases. And all of that affects what the text looks like, what the text is composed of. And I don't think we can do effective research using that information if we don't understand how it was created.
Dr. Jerome Pagani: So Allison, you do what I kind of consider to be sort of the holy grail of research. So as a basic research scientist, you know, everything that I did had to be caveated with “in rats” or “in mice.” And actually, I think a while back there was a Twitter account that would like find journal articles on PubMed and like if the titles were really, you know, like, oh, this finding and, you know, we've cured cancer. So it would retweet it with like “in mice” or “in rats.” But your research actually translates into actual outcomes for both clinicians and patients. Can you tell us a little bit about that?
Dr. Allison McCoy: Yeah, we try to as much as we can. That's one of the most exciting things to me about Applied Clinical Informatics is even though I'm doing my work on computers, we can see actual outcomes with our clinicians and patients. So, you know, with patients, for example, we put in an alert about prescribing naloxone with opioids and all of a sudden overnight we could see the rate of naloxone prescribing just skyrocketed. So not only did we go in and make that change for the alerts, we were able to go in and pull that data to evaluate it as well, which was really exciting. But we also can see the effect on clinicians. So, you know, when we made the change with clickbusters, we could see how much less often they were having to click on these alerts. So we were saving them time as well. And seeing that impact is, is my favorite part of my job.
Dr. Craig Joseph: And so your husband is a physician. You mentioned that. What is his opinion of your work?
Dr. Allison McCoy: I think he appreciates what I do. Every once in a while, he'll send me an alert that he gets. It's funny, like the naloxone alert. We were so excited when we put that in. And then a couple of days later, he sent me a message and he said, I've been getting this alert. And if I could turn it off forever, I would. I was like, no, that's the best alert we've ever built. But he had a really great point, because he's a urologist and he would prescribe, maybe for a vasectomy, like one benzo for before the procedure and a couple of opioids for afterward. And that's not really enough to need naloxone or to, you know, have a high likelihood of an overdose for most patients. And so we actually went in and changed the alert and turned it off for situations where only one med, like one pill, was being prescribed. So he sort of had an inside source to improve the alert. And I think it made him happier.
Dr. Craig Joseph: That's, that's quite helpful to have someone on the inside. But I feel like that's a good story, kind of a good example of why you need to actually be out in the real world doing your field trips. Because, you know, sometimes when you think about how someone's going to use it, you're thinking about this generic physician, or you're focused on a hospital, not realizing that, hey, there might be non-hospitalists actually taking care of patients whose needs, and whose particular patients' needs, might be slightly different. And I think that's, it's kind of helpful. You have a, I don't want to call your husband a guinea pig, but let's just call him a guinea surgeon or something like that. And yeah, I, I think that's a great thing.
Dr. Craig Joseph: And if you, I'm not sure, I'm not saying that you married him because you were looking for an easy person to give you feedback. But if so, my compliments.
Dr. Allison McCoy: Thank you.
Dr. Jerome Pagani: See, as a true doctor, she understands the value of both basic research and applied. See, that's really what it was.
Dr. Allison McCoy: It’s true.
Dr. Jerome Pagani: Allison, sometimes you develop a product for one group of stakeholders within the hospital, but then it turns out it doesn't work because there's sort of a larger ecosystem that surrounds that particular group. Can you give us an example of where you tried to really move the needle for somebody, but it sort of ended up falling short because of that kind of a situation?
Dr. Allison McCoy: Yeah. So actually, as my dissertation project, I developed a dashboard that a clinical pharmacist used to monitor patients who had acute kidney injury, who also had an order for a nephrotoxic or renally cleared medication. And this was extending some work that I did for my master's project, where I built a clinical decision support alert to remind clinicians about those patients. And we did a randomized trial. We were really excited about it. And then at the end of the study, we found absolutely no effect on the outcomes. We had no improvement of adverse drug events in those patients. And it turned out that while we were building that dashboard, we actually did a lot of work to improve our clinical decision support alert. So it was working so well that there wasn't any extra room for the pharmacists to improve. And so the good news is there were still some lessons learned, and I still got my PhD, but it sort of changed the trajectory of my career into focusing on how we deliver better clinical decision support so that we're not trying to come up with these other use cases or wasting other people's time.
Dr. Jerome Pagani: So would you rank clinical decision support tools ahead of some of that dashboarding then?
Dr. Allison McCoy: I think it just depends. So for some scenarios, a dashboard is more appropriate, if you need to see a large group of patients at a time. For example, for surveillance of patients with risk scores who are likely to have certain high-risk events, a dashboard might be more appropriate. But in the case of the decision support alert, it was more appropriate for the clinicians to see it. The reason they were overriding it so often is because there were cases where it was inappropriate, and that's where we were able to go back in and fix those.
Dr. Jerome Pagani: What is your research focused on recently that you could tell us about?
Dr. Allison McCoy: Most of the research I have done lately has been either through the Core or the clickbusters work, so I'm still really interested in finding better ways to improve the decisions of what we're providing, how we better evaluate it to make sure that we're all evaluating it the same way and consistently, and then partnering with other clinicians in the Core to figure out how we take all of the exciting clinical research and put it into the EHR.
Dr. Jerome Pagani: There's been a lot of buzz lately about generative AI, which seems to be sort of coming into maturity in ChatGPT specifically, what can you tell us about what role that might play, and in what kind of time frame, in clinical decision support?
Dr. Allison McCoy: So we actually have a paper out that's in preprint that a postdoc on our team, Siru Liu, worked on, and she used ChatGPT to generate suggestions to improve clinical decision support. And we actually compared the responses that ChatGPT generated to the ones that our participants generated through Clickbusters, and ChatGPT actually did a really good job. So I think it's one more approach that we could use to find ways to improve our decision support.
Dr. Jerome Pagani: Ready for prime time or are we still sort of in the feeling things out phase?
Dr. Allison McCoy: I think we're still feeling things out. I don't think it will ever totally take over our jobs. All of the suggestions still needed some manual review. It made up some medications, for example, but there were definitely some ideas that it came up with that our participants didn't. So I think it's a good combination.
Dr. Jerome Pagani: Will it change the nature of what you do or how you approach your job? So you spend more time, let's say, in review, and less time in sort of the generating or suggesting?
Dr. Allison McCoy: That's a good question. I don't know. It's possible.
Dr. Craig Joseph: It's interesting. It's an interesting kind of trend. One could imagine a generative AI who, which has acc-, I almost said who, which has access to patient-specific information and clinician-specific information. One could imagine alerts coming to, again, both patients and clinicians, with slightly different, you know, the length of the alert, the way it comes towards you, the point in the workflow that it comes at you, because it sees, oh, some people are much more inclined to make a change if it's presented in this certain way or at a certain part of the workflow. And it's kind of scary to think about that, but, you know, no one would see the exact same alert. However, clearly, as you mentioned, we'd have to make sure that the AI is sticking to these very specific railroad tracks and not getting off on either side to make up stuff. That would be awkward.
Dr. Allison McCoy: Yeah, I think that's a great use. I am hopeful that eventually we will have AI helping us deliver better clinical decision support because, again, you know, right now we mostly use simple rule-based decision support and it's hard to come up with the rules that are relevant for all patients, all clinicians in every scenario. But AI is really good at that, and I think if we can get it to a better place, applying it to deliver more specific decision support would be a really great use.
Dr. Jerome Pagani: So Allison, how close are we to having AI be involved in actual clinical diagnosis or other clinical tasks beyond just decision support?
Dr. Allison McCoy: I think we could be really close to having an AI help generate clinical hypotheses that then physicians or the appropriate clinician would go in and review. I, I think that people today are getting more comfortable with AI. A lot of us are using ChatGPT. But most of us don't even realize the AI that we use, like when we depend on Google Maps to get home and find us the fastest route. I don't think we're in a place that I will say yes, this is for sure the diagnosis, but I, I'm hoping we can get pretty close to using AI to make some things easier.
Dr. Jerome Pagani: Beyond the regulatory end, how, what are the sort of barriers or hurdles we're looking at to actually get there?
Dr. Allison McCoy: It can be hard to implement a lot of these applications into systems. So like I have a lot of research colleagues who are developing really great predictive models for various scenarios, but not all of them get integrated into the EHR. So one of the things I've been working on with the Core is getting predictive models into our EHR and in front of clinicians so that we can actually use them and measure real-world outcomes with them.
Dr. Jerome Pagani: Allison, we like to ask this question of everybody who comes on the podcast and just kind of get a sense of their experiences with design in sort of the quote-unquote real world. So what are three things that are so well designed that they just bring you joy to interact with, and they can be outside of the healthcare system?
Dr. Allison McCoy: So one of my favorite apps that I use all the time is the Publix app on my phone. So I use it to create my shopping lists. And it doesn't matter which Publix I go to, it's sorted by where the items are in the store, like which aisle I need to go to. And it makes my life so much easier. And then I just tap an item and it removes it. And if there are things on sale that I buy frequently, it reminds me about those. So that's one of my favorites. My other favorite, and you're going to laugh, but I really like Epic because they do such a good job training individuals how to build interventions into their system and how to get data out of their system. They have great tools for finding what you need, either through their UserWeb or the individuals who work there. And the training is really fun. I enjoy going up there every year and eating their delicious food. I don't know if I have a third, but those are my top two favorite things.
Dr. Jerome Pagani: This was so great. Thanks so much, Allison.
Dr. Craig Joseph: Yeah, thank you.
Dr. Allison McCoy: Thanks for having me.