Attributing Liability to Vendor ‘Extremely Difficult’ if AI Tool Used in ED
EDs are using many new tools to support clinical decision-making, including artificial intelligence (AI). “AI has the chance to revolutionize ED practice, which can be a chaotic atmosphere,” says Samuel D. Hodge, Jr., JD, professor of legal studies at Temple University.
Multiple recent studies have demonstrated the benefits of AI tools in the ED setting, particularly for radiology and clinical decision-making.1-5 “However, numerous issues need to be considered,” Hodge cautions.
In terms of malpractice liability for providers, hospitals, or vendors, some important questions include: What happens if the AI tool is incorrect and the physician relies on the results? Can the emergency physician (EP) escape liability if the AI tool is faulty? Will malpractice insurance policies cover litigation involving AI tools used in the ED? The answers to these questions remain mostly unclear. “The use of technology in the ED is still in its infancy. The issues have not been fully litigated,” Hodge explains. “If I were going to implement AI technology in an ED, I would want an indemnification agreement from the software company.”
If something goes wrong and the EP relied on an AI tool for decision-making, Hodge says, “there is no question that the physician, hospital, and technology company will be sued. Each will file cross-claims against the other.”
An indemnification agreement could shift responsibility to the tech company if there was an error in the software. “Enforcement of indemnification agreements is a matter for a court to determine,” says Kenneth N. Rashbaum, JD, a partner at New York City-based Barton LLP.
Indemnification clauses are standard provisions in AI license agreements, but enforceability criteria vary by state. “Indemnification agreements may be narrow or otherwise restricted,” says Rashbaum, who has litigated the enforceability of indemnification provisions in multiple cases.
Indemnification clauses can be restricted to certain claims (e.g., intellectual property) and would not apply to other claims like malpractice. The clauses also can be restricted in many other ways (e.g., dollar amounts, limits on available and applicable insurance, and time frames). “Enforcement of contracts depends upon the law of the particular state and the criteria of a particular judge,” Rashbaum notes. “Liability limitations in litigation are dependent upon many factors.”
In any case, the EP must use a reasonable standard of care in applying the results the AI tool provides. Juries probably will not accept that an EP blindly relied on an AI tool; they will expect the EP to exercise independent clinical judgment. “This is going to be the sticking point that will be litigated. It could be that they all end up being joint tortfeasors and a jury will have to assign a percentage of liability to each,” says Hodge, adding the litigation could turn into a “blame game.”
In a case like that, the EP might argue the AI tool pointed in the wrong direction (e.g., to a cardiac problem). The plaintiff could counter that the EP should have considered other factors that pointed away from that (e.g., history, risk factors, or physical exam). “Plaintiff’s counsel rarely limit their cases to one argument,” Rashbaum says.
For the EP defendant, it is tempting to argue the AI tool caused a misdiagnosis. “But attributing liability to an AI provider would be difficult and could carry significant risks for the defense,” Rashbaum warns. The main reason is an AI tool is meant to assist the EP, not provide a substitute for reasoned clinical judgment. “Blaming the AI tool is somewhat analogous to blaming a textbook that the clinician consulted during treatment. A jury would most probably be unimpressed with such a defense and may be hostile to it,” Rashbaum explains.
The licensing agreement with the AI provider probably would include strong disclaimers of liability. “This would make it extremely difficult to attribute fault to the AI provider in a treatment setting, especially in an ED,” Rashbaum says.
Disclaimers often become the subject of contentious argument during litigation. Generally, courts will enforce the language of the contract if the parties are of equal commercial bargaining strength, if the provision does not violate public policy (which varies by state) or existing law, and if the provision is written clearly enough to indicate the intent of the parties, according to Rashbaum.
A potential exception: If the defense team can prove the AI tool was faulty because of the data used to create the algorithm. For example, defense lawyers might hire expert witnesses from the IT field to testify that the training data omitted representative populations by age, gender, or race. Even so, it would be an uphill battle for the defendant to deflect liability in this manner. “The clinician and his or her defense team should weigh the advantages and disadvantages of bringing such a technically dense defense before a state court jury that may view the defense, to the extent they can understand it, as blame-shifting,” Rashbaum says.
EPs may wonder if they should document the use of an AI tool in the medical record. “There is rarely any advantage in documenting a reference to a textbook or research paper, and reference to use of an AI tool is no different,” Rashbaum reports.
In fact, documenting that an AI tool was used could open new areas during the EP’s cross-examination. “These would not play to the strengths of the clinician defendant,” Rashbaum cautions.
For example, plaintiff attorneys might ask the EP: What other factors did you consider in reaching a diagnosis, ordering tests, or providing treatment? Did you over-rely on AI to the exclusion of other necessary elements, such as medical history, history of present illness, or presenting symptoms? “AI provides probabilities, not diagnoses,” Rashbaum says.
Skillful cross-examination could convince a jury the EP made a mistake and is blaming the AI tool for it. “Negative consequences in the trial could result, including potential increase in the amount of damages awarded because the jury disliked the scapegoat strategy,” Rashbaum explains.
REFERENCES
1. Jalal S, Parker W, Ferguson D, Nicolaou S. Exploring the role of artificial intelligence in an emergency and trauma radiology department. Can Assoc Radiol J 2021;72:167-174.
2. Gorelik N, Gyftopoulos S. Applications of artificial intelligence in musculoskeletal imaging: From the request to the report. Can Assoc Radiol J 2021;72:45-59.
3. De Hond A, Raven W, Schinkelshoek L, et al. Machine learning for developing a prediction model of hospital admission of emergency department patients: Hype or hope? Int J Med Inform 2021;152:104496.
4. Sills MR, Ozkaynak M, Jang H. Predicting hospitalization of pediatric asthma patients in emergency departments using machine learning. Int J Med Inform 2021;151:104468.
5. Tang KJW, Ang CKE, Constantinides T, et al. Artificial intelligence and machine learning in emergency medicine. Biocybern Biomed Eng 2021;41:156-172.