Triage: When Relying on Historical Data, Do Not Apply Bias from Past Decisions
By Dorothy Brooks
There is no denying that in a system that relies heavily on clinician judgment regarding acuity designations, bias can influence triage decisions. Indeed, among the disparities identified in the study of Emergency Severity Index (ESI) triage accuracy conducted by Dana R. Sax, MD, MPH, and colleagues was that Black patients had a 4.6% greater relative risk of overtriage and an 18.5% greater relative risk of undertriage when compared with white patients.1
While there is hope that clinical decision support tools that use artificial intelligence (AI), machine learning, and other technologies to make triage recommendations will prove the silver bullet that stamps out bias, experts tell ED Management that these newer technologies can help mitigate bias but offer no easy solutions.
“There is a lot of concern that machine learning can propagate existing bias within data sets. If we’ve been making decisions that benefit one racialized group over another for decades or the past year, and we use those data to train an algorithm, we can expect that the algorithm will make similarly biased decisions,” explains Jeremiah S. Hinson, MD, PhD, an associate professor of emergency medicine at the Johns Hopkins University School of Medicine and co-director of the Center for Data Science in Emergency Medicine. “The algorithms have no conscience. They are just going to be trained on what we’re doing.”
That is why the algorithm behind the machine learning-driven clinical support tool that Johns Hopkins researchers developed to optimize triage decisions in the emergency department does not include any indicators for race or ethnicity. “We do not want to pick up on any of those [biased] patterns and then apply them to future groups,” Hinson says.
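To make the idea concrete, excluding sensitive attributes of this kind is typically a deliberate step in the training pipeline. The sketch below is a minimal illustration in Python, with hypothetical file and column names and an arbitrary model choice; it is not the Johns Hopkins team's actual code.

```python
# Minimal sketch of the approach Hinson describes: train a triage model on a
# feature set that deliberately omits race and ethnicity. All names here are
# hypothetical assumptions, not the Johns Hopkins tool's real schema.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

visits = pd.read_csv("ed_visits.csv")  # hypothetical training extract

SENSITIVE = ["race", "ethnicity"]            # never used as predictors
FEATURES = ["age", "heart_rate", "resp_rate",
            "systolic_bp", "temp_c", "chief_complaint_code"]
TARGET = "critical_outcome"                  # e.g., ICU admission

X = visits[FEATURES]   # sensitive columns are excluded by construction
y = visits[TARGET]

model = GradientBoostingClassifier().fit(X, y)
```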
However, even excluding variables such as race and ethnicity may not completely root out bias, Hinson explains. For example, if patients in one ethnic group have more data missing from their problem lists than white patients do, the model may produce more accurate predictions for white patients than for patients in that group.
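That kind of differential missingness can be measured directly before training. A minimal sketch, again with hypothetical column names; here race and ethnicity fields are retained for auditing the data even though they are never used as predictors.

```python
# Audit sketch: how often is the problem list empty, broken out by ethnicity?
# Hypothetical column names; ethnicity is used only to evaluate data quality,
# never as a model input.
import pandas as pd

visits = pd.read_csv("ed_visits.csv")  # hypothetical extract

missing_by_group = (
    visits["problem_list"].isna()          # True where the problem list is empty
    .groupby(visits["ethnicity"])          # group the flag by ethnicity
    .mean()                                # fraction missing per group
    .sort_values(ascending=False)
)
print(missing_by_group)  # a large gap here foreshadows unequal model accuracy
```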
“What we have done recently is developed a method to interrogate our models and determine whether we are making predictions that are as accurate for one group as for another,” Hinson says. “What we have found is that in our hospital [system], our predictions are equally accurate.”
Nonetheless, it is a concern that the developers and users of triage algorithms — or other decision support tools that rely on historical data — need to consider. “That is part of our model development evaluation process every time we deploy,” Hinson says. “It is something we do as a standard to make sure we are predicting as well with one cohort as for another.”
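The article does not publish the Hopkins team's evaluation code, but a subgroup check of the kind Hinson describes can be as simple as comparing a discrimination metric, such as the area under the ROC curve, across patient groups on held-out data. A hedged sketch, assuming a scored test set with illustrative column names:

```python
# Sketch of a per-group performance audit: compare AUROC across patient
# groups on a held-out test set. Assumes a file with the model's risk score
# already attached; all names are illustrative assumptions.
import pandas as pd
from sklearn.metrics import roc_auc_score

test = pd.read_csv("ed_visits_test_scored.csv")  # hypothetical scored test set

for group, rows in test.groupby("race"):
    if rows["critical_outcome"].nunique() < 2:
        continue  # AUROC is undefined when a group has only one outcome class
    auc = roc_auc_score(rows["critical_outcome"], rows["score"])
    print(f"{group} (n={len(rows)}): AUROC = {auc:.3f}")
# Materially different AUROCs across groups would send the model back for
# rework before deployment.
```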
REFERENCE
- Sax DR, Warton EM, Mark DG, et al. Evaluation of the Emergency Severity Index in US emergency departments for the rate of mistriage. JAMA Netw Open 2023;6:e233404.