Medical Ethics Advisor – April 1, 2023
-
Scientific Journals Confront Ethical Controversy Over ChatGPT
The new artificial intelligence (AI) tool ChatGPT is roiling the scientific community. Its use raises a range of ethical concerns for journals, including the risk of bias, inaccuracy, and authorship disputes.
-
Remain Cautious When Using Chatbots to Provide Mental Healthcare
Consider these common, innocuous questions: Are you taking your medications as directed? You look a little upset — is everything OK? Do you need some urgent help? Now consider what happens if an AI tool, not a human therapist, asks a patient these questions, raising issues of trust, privacy, and bias. Can humans and machines establish a bond?
-
For Sale: Private Mental Health Data and Consumer Trust
Once people learn mental health data can be sold or misused, trust erodes. These issues might dissuade people from seeking care online or via an app. For many, that may be their only option.
-
Direct-to-Consumer TV Ads Push Drugs of Scant Therapeutic Value
Many people assume TV ads for prescription drugs are for new, cutting-edge medications that represent groundbreaking advances. However, there is growing evidence suggesting otherwise.
-
Aggressive End-of-Life Care Remains Common, Especially in Nursing Homes
Recent research findings raise ethical questions about how patient or family preferences are communicated to care providers, the timing of those discussions, and what policies are in place at the nursing home to honor patients’ goals of care.
-
Spiritual Support Alleviates Anxiety of Surrogate Decision-Makers
Surrogates enrolled in an enhanced spiritual care model reported less anxiety, more spiritual well-being, and greater satisfaction with spiritual care compared to surrogates who received usual care. These results suggest expanded chaplain involvement is beneficial.
-
Ethics Skills Align with Trauma-Informed Care Principles
Ethics consults often center on traumatic situations — for patients, families, and even the clinicians providing treatment. Trauma-informed care reframes the question of what is wrong with someone by adding context, such as discovering what happened to the patient.
-
Tool Identifies Patients in Need of Serious Illness Conversations
Text messages generated by a machine-learning tool resulted in clinicians engaging in more serious illness conversations with high-risk patients.
-
Ethical Approaches Needed for Social Needs Screening
Healthcare providers are screening patients and families for social needs, including housing instability, food insecurity, and difficulty paying for utilities. The idea is clinicians then can connect patients in need with appropriate community resources. However, there could be some unintended consequences.
-
Bias and Stigma Hinder Effective Obesity Treatment
The field is moving away from a hierarchical model of care, in which a primary provider tells the patient what to do, and toward shared decision-making.
-
Researchers Offer Tips to Improve Shared Decision-Making in Pediatrics
Sometimes, all that is communicated to parents is the physician’s recommendation, not that there were several options to choose from or why the clinician preferred one in particular. This suggests physicians could benefit from additional guidance on the appropriate use of shared decision-making.
-
Research Ethics Consultation Service Is a Growing and Evolving Program
After a decade in existence, the most frequent reason for requesting these services is questions about study design, followed by informed consent.