Hospital Sharply Reduces CDS Alerts to Address Clinician Concerns
Like many hospitals and health systems, Indiana University (IU) Health in Indianapolis had put a robust clinical decision support (CDS) system in place at its 16 hospitals across Indiana, with expectations that it would improve quality of care and ease the burden on busy clinicians.
But IU Health leaders came to realize the CDS system was producing the opposite effect. Too many alerts were interfering with the proper administration of care, leading to extreme frustration among clinicians.
The problem crept up over several years as IU Health strived to make its electronic health record (EHR) more useful in improving patient safety, says Jason T. Schaffer, MD, MBI, associate chief medical information officer at IU Health. Like many systems, IU Health responded to adverse events by identifying root causes and the ways they could be prevented.
“We would look at ways to keep that from happening again. In many cases, that resulted in a clinical decision support tool or an alert. We do that again and again, not understanding what happens across the system when we add these alerts that get in the way or frustrate our doctors,” Schaffer says.
The problem can be measured in different ways. One measure is the number of alerts in the CDS system. Another is the number of times a rule fires an alert. Analysts also can measure the response rate, which is how many times an alert changes a clinician’s action.
Another is the “blow-by” rate, which indicates how often a clinician ignores or refuses an alert. Measuring how quickly users blow by an alert also is revealing.
“It’s almost like muscle memory on some of these alerts,” Schaffer says. “An alert pops up, and the user dismisses it immediately to get on to the next step.”
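These measures are simple arithmetic over an EHR’s alert audit log. The sketch below, in Python, illustrates the idea; the log format and field names are hypothetical, not IU Health’s actual schema. The response rate is the share of firings in which the clinician took the suggested action, the blow-by rate is its complement, and a very short average time to dismiss combined with a high blow-by rate is the signature of the reflexive click-through Schaffer describes.

```python
from collections import defaultdict

# Hypothetical alert-log records: one dict per alert firing.
# Field names are illustrative, not an actual EHR schema.
alert_log = [
    {"rule_id": "72H_NUTRITION", "action_taken": False, "seconds_to_dismiss": 1.2},
    {"rule_id": "72H_NUTRITION", "action_taken": True, "seconds_to_dismiss": 14.0},
    {"rule_id": "CARE_CONTRACT", "action_taken": False, "seconds_to_dismiss": 0.8},
]

stats = defaultdict(lambda: {"fires": 0, "accepted": 0, "dismiss_times": []})

for event in alert_log:
    s = stats[event["rule_id"]]
    s["fires"] += 1                      # times this rule fired an alert
    if event["action_taken"]:
        s["accepted"] += 1               # alert changed the clinician's action
    s["dismiss_times"].append(event["seconds_to_dismiss"])

for rule_id, s in stats.items():
    response_rate = s["accepted"] / s["fires"]
    blow_by_rate = 1 - response_rate     # share of alerts ignored or refused
    avg_dismiss = sum(s["dismiss_times"]) / len(s["dismiss_times"])
    print(f"{rule_id}: fired {s['fires']}x, blow-by {blow_by_rate:.0%}, "
          f"avg time to dismiss {avg_dismiss:.1f}s")
```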
After measuring those factors, IU Health determined that in May 2017 the CDS system fired 2.5 million alerts in a single month, an all-time high. “It was recognized that we had gone way overboard. Our blow-by rate ... was above 90%, which means we were frustrating our physicians quite frequently with alerts that didn’t mean enough to them to actually follow what the alert was telling them to do,” Schaffer says. “We knew we had to do something about it.”
First, IU Health considered conducting a literature search on how to streamline CDS alerts or using an outside vendor who could bring in software to analyze the alerts and the rules that generated them. But Schaffer says IU leaders thought the situation was dire and demanded immediate action.
“We started pulling out alerts at a relatively fast rate. We looked at our own numbers, but didn’t do a lot of analysis,” he says.
100 Rules Deleted
Over eight months, IU Health removed more than 100 of the rules that drove alerts, which quickly dropped the number of alerts to fewer than 800,000 per month. That still is a high number, but a reasonable one considering the number of patient encounters throughout the health system.
“We followed our quality and safety metrics along the way and did not note any significant increase in the number of safety events, either events related to a specific alert that we pulled out or safety events in general throughout our system,” Schaffer reports. “As you might imagine, our quality and safety teams were nervous through this time.”
Because the health system was removing rules so quickly, it was unable to measure the effect of each one. The quality and patient safety teams watched a few rule deletions more carefully because of the potential for a severe increase in adverse events. There was no spike after any rule deletion.
There was some skepticism about removing the rules. Schaffer and his colleagues met with quality and safety leaders at IU Health to discuss specific rules and the reasoning for deleting them. Long conversations were necessary, both early on and along the way, to assure those leaders that the changes could be made without threatening patient safety.
They started with a group of about 10 rules and watched the effects for a month, reviewing the data with the quality and safety teams. No negative effects were seen.
“With that confidence, we started doing more groups and larger groups of alerts. The most we removed at one time was about 30,” Schaffer recalls.
An example of a retired alert is “72 Hour Enteral/Parenteral Nutrition,” indicating a patient had not received a diet order within the past 72 hours. IU Health retired this alert at the request of the risk management department because those staffers said the alert was not working as designed.
IU Health kept some alerts but turned them into passive messages that do not interrupt workflow. Instead, those now appear as sidebar messages when the clinician opens the chart.
One of those is “Department of Child Services (IN 310 — State Form) Alert.” This instructs the clinician not to discharge the patient until such action is indicated to be safe. Another is “Care Contract Alert,” which prompts the clinician to review the patient’s existing care contract in the EHR.
As the number of alerts decreased, so did the blow-by rate. Clinicians had been ignoring 90% of alerts at the start; that figure fell to about 75% with the first culling of unnecessary notices. The lower figure is better, but still not great, Schaffer says. A figure of about 60% would be closer to ideal, he adds.
“If the alerts were perfect, you wouldn’t need doctors and nurses,” Schaffer says. “Alerts are there to drive some actions, but there has to be an interaction with a clinician who is taking care of the patient. We still have work to do, but our data are much better than they used to be. It speaks to frustrating doctors less.”
Schaffer, who also is an emergency physician, currently is researching an alert his colleagues brought to his attention: It carries a blow-by rate of 95% and is causing frustration in the ED. Schaffer is assessing the validity of the underlying rule.
The fact that his colleagues brought it to his attention is a good development, Schaffer says. Previously, clinicians might have been annoyed by an alert but never raised it with leadership because they did not think anything would happen. Or, if they did complain, they complained about the volume of alerts rather than isolating one in particular.
“Now, they focus on this one alert because it stands out to them. The electronic record feels different than it did before we started this improvement,” Schaffer says. “Instead of being overwhelmed by so many unnecessary alerts, they can recognize that this one alert keeps coming up. It used to be that there were so many alerts you would just click past them all the time. Now, they’re rare enough that our clinicians pay attention to them and either find them useful, or in this case, they can recognize that this one needs to be reassessed.”
Clinicians also spend more time reading alerts. Because the notices are more meaningful, users stop to consider them instead of automatically dismissing them.
Still Some New Rules
The improvements did not mean IU Health could stop adding alerts to its CDS system. As new issues develop and best practices change, there will be a need to introduce alerts. However, IU Health takes a much more considered approach today, evaluating a proposed new alert with the help of a vendor to ensure it is introduced in the most helpful way, Schaffer says.
Much of that work is based on Choosing Wisely, a campaign created by the American Board of Internal Medicine (ABIM) Foundation that seeks to avoid unnecessary medical tests, treatments, and procedures.1 For example, an alert might discourage ordering plain film X-rays for back pain or head CT scans for syncope, Schaffer explains.
IU Health entered 25 new rules based on the campaign’s guidance to avoid routine orders that might be outdated or unnecessary. Those rules increased the total number of alerts in the system by only about 5,000 per month.
“We entered those at the same time that we removed some other alerts,” Schaffer reports. “We found that you can actually do alerting in a smart way and not create the frustration that we had in the past by basing it on a single event.”
Rules Are Not a Panacea
Schaffer says an important lesson from this experience is to avoid basing rules on single events. IU Health, like many other systems, had gotten into the habit of conducting a root cause analysis after an adverse event and introducing a new CDS rule to keep that event from happening again.
“We rely sometimes too much on the technology to put a rule in place so that this never happens again. We think that is finishing the job when actually we didn’t follow up with any education, and just created another alert that clinicians blew by,” Schaffer says.
The same applies to documentation requirements in the EHR. Administrators might identify a need for documentation and add it as a mandatory field in the EHR, thinking the clinician must fill in those data before proceeding.

“You’d be amazed at the number of ways clinicians find workarounds to mandated things. With alerts, we just blow by them. With mandated documentation, we just put a dot in the field, and the field is then satisfied so we can move on to the next thing,” Schaffer says. “While the EHR is a great tool to help with quality, safety, and documentation, it’s not the complete answer. Education is still necessary so that clinicians understand why the alert is there or what you’re trying to achieve with that documentation.”
REFERENCE
1. ABIM Foundation. Choosing Wisely.
SOURCE
- Jason T. Schaffer, MD, MBI, Associate Chief Medical Information Officer, Indiana University Health, Indianapolis. Phone: (800) 248-1199.