Headlines screaming “Technology is ruining doctors’ lives” are hard to miss. Yet the use of digital tools in healthcare continues to expand rapidly. Gone are the days when the average physician could leave the design and implementation of health technology in the hands of a few tech-savvy colleagues. Just as physicians need a basic understanding of radiology, they also need a basic understanding of clinical informatics. They must be able to navigate the vast quantities of data now inherent to medical practice AND participate in the co-design of technology in medicine. The first is essential to delivering excellent patient care; the second, to countering physician burnout.
An exact definition of clinical informatics can be difficult to pin down. The American Medical Informatics Association (AMIA) suggests that the role is to “transform health care by analyzing, designing, implementing, and evaluating information and communication systems that enhance individual and population health outcomes, improve patient care, and strengthen the clinician-patient relationship.” I’m going to suggest that, for the average physician, there is a simpler, more relevant way to frame the discipline of clinical informatics: its job is to “best position a healthy physician–technology relationship.”
For the holdouts who do not have (or want) a relationship with technology, the truth is that this is no longer a viable option. In medicine, there is already a real and very necessary relationship with technology. The real question is, who gets to define what that relationship looks like? I would posit that if you aren’t jumping up and down volunteering for this role, you are giving up your right to complain when negative consequences arise. It’s hard to have influence without a seat at the table. And that is part of the problem: Clinical informatics has been absent from many important strategic decisions.
Clinicians who have a background in informatics and are actively involved in the design and implementation of health technology can ensure that the technologies implemented align with clinical needs and keep the focus on patient care. Think of how much more effectively you can advocate for change to an unhelpful interruptive alert in your electronic health record if you have a basic understanding of the principles of good clinical decision support. This transforms the unhelpful helpdesk ticket of “turn off this stupid alert” into the actionable “this alert fires too soon in my workflow. Can the criteria be changed so it fires when I am signing orders but have not included the order for X? I routinely order X, so it is not helpful to fire when I enter the chart, but it would be helpful on the rare occasions when I forget to order X.”
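The improved alert request above can be sketched as a simple rule. This is a hedged illustration only: the function name, event names, and order-list structure are assumptions for the sketch, not any EHR vendor’s actual decision-support API.

```python
def should_fire_missing_order_alert(event, active_orders, required_order="X"):
    """Hypothetical decision-support rule for the alert described above.

    Fires only at order signing (not on chart open), and only when the
    expected order is absent -- i.e., on the rare occasions the clinician
    forgot it, rather than every time they enter the chart.
    """
    if event != "sign_orders":  # too early in the workflow: stay silent
        return False
    return required_order not in active_orders  # order forgotten -> alert


# Usage: the alert stays quiet on chart open and when X is already ordered.
print(should_fire_missing_order_alert("open_chart", []))        # False
print(should_fire_missing_order_alert("sign_orders", ["X"]))    # False
print(should_fire_missing_order_alert("sign_orders", ["CBC"]))  # True
```

The design point is the one in the ticket: moving the trigger from “chart opened” to “orders signed without X” is what turns a nuisance interruption into a useful safety net.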
A solid background in clinical informatics equips clinicians with the knowledge and skills to effectively analyze and interpret patient data. This enables them to make better-informed decisions, leading to improved patient outcomes. By understanding how to leverage technology for data-driven insights, clinicians can provide more accurate diagnoses, personalized treatment plans, and proactive preventive care. Much as radiology has applied technology to improve diagnosis and enhance care, a basic understanding of clinical informatics is the necessary foundation for the average physician to effectively use health technologies. For example, if the hospitalist is unaware of their health system’s sepsis screening algorithm, they lose the opportunity to use it for rapid response or preventive measures. Likewise, if they don’t understand the algorithm’s limitations, they may accept the sepsis alert in error.
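To make the hospitalist example concrete, here is a deliberately naive screening rule in the spirit of the classic SIRS criteria. The thresholds are the textbook SIRS cutoffs; the function and inputs are assumptions for illustration, not any health system’s actual algorithm, and real sepsis screens are considerably more sophisticated.

```python
def naive_sepsis_screen(temp_c, heart_rate, resp_rate, wbc_k):
    """Hypothetical SIRS-style screen: flag if two or more criteria are met.

    Thresholds are the classic SIRS cutoffs (illustration only).
    """
    criteria = [
        temp_c > 38.0 or temp_c < 36.0,  # abnormal temperature
        heart_rate > 90,                 # tachycardia
        resp_rate > 20,                  # tachypnea
        wbc_k > 12.0 or wbc_k < 4.0,     # abnormal white cell count
    ]
    return sum(criteria) >= 2


# Limitation, visible in the code: a post-operative patient in pain can
# meet two criteria (HR 105, RR 24) without any infection -- exactly the
# kind of false positive a clinician should recognize rather than
# accepting the alert in error.
print(naive_sepsis_screen(37.0, 105, 24, 8.0))  # True, but not sepsis
```

Knowing that an algorithm reduces to rules like these is what lets a hospitalist use its alerts for rapid response while still questioning them when the clinical picture does not fit.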
As my colleagues Thomas O’Shaughnessy (President, Healthtech Consultants) and Laura Copeland (CMIO, Healthtech Consultants) pointed out in an article about the Canadian primary care system, Canada has an urgent need for more investments in technology and a larger workforce. Together those factors create a burning platform for more physicians with a background in clinical informatics. And not just for today. Healthcare technology is constantly evolving, with new innovations emerging regularly. Clinicians with a background in clinical informatics are better equipped to adapt to these advancements and harness their potential benefits. They can stay informed about emerging technologies, such as artificial intelligence, machine learning, telemedicine, and wearables, and assess their relevance and impact on both patient care and physician workload.
So where to start? In the U.S., clinical informatics has been a board-certified medical subspecialty since 2011. In Canada, it remains a challenge for the average physician to access educational materials in clinical informatics, but there are a number of ways to get started. Seek out and get to know any clinical informatics physician champions at your organization; they can be a good resource for questions. Review Harold Lehmann’s article, “The Informatics Stack: A Heuristic Tool for Informatics Teaching.” He lays out a clear graphical depiction of the factors that affect the usefulness of a piece of technology, with a dividing line between those that can be corrected with technical fixes and those that need operational, workflow, or other non-technical fixes. Audit a course on Coursera for free (paid options are available if you want a certificate). Or take a course through the American Medical Informatics Association (AMIA) (roughly US$200–400); the AMIA 10 x 10 courses are a good starting place. As Harold Lehmann once said, “Health IT is what you can do. Informatics is what you should do.”