Tuesday, January 21, 2025

Missing Essay on "What to say" in the most difficult situations

Sometime in 2024, I think in mid-2024, I saw an essay by a physician in one of the major journals (JAMA, NEJM, or Ann Intern Med, I think).

The essay was about what to say in difficult situations. It might have been triggered by a study that found that AI dialog in difficult situations (e.g., terminal diagnosis) was as empathic as spontaneous human remarks and statements.

As I recall, the author made the point that he had learned in his training that, in the most difficult situations, there was actually a fixed library of statements that were generally "better" to use than simply making comments off the top of his head and hoping for the best via spontaneity.

I resonated with the physician's essay, but have not been able to find the citation again.

###

I have not reviewed the material below, but I fed the above paragraphs straight into ChatGPT 4, clicked the "web" search button, and got this result.

###

ChatGPT 4 (Web search)

You might be referring to the study "Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum," published in JAMA Internal Medicine in April 2023.

This research found that chatbot-generated responses were preferred over physician responses and rated significantly higher for both quality and empathy. The findings suggest that AI assistants could aid in drafting responses to patient questions, potentially alleviating some of the communication burdens faced by physicians.

Following this study, several commentaries and responses were published discussing the implications of AI-generated empathy in medical communication. For instance, the article "Machine-Made Empathy? Why Medicine Still Needs Humans" and its subsequent reply delve into the nuances of AI-generated empathetic language and its role in healthcare. These discussions emphasize the importance of maintaining human oversight and the irreplaceable value of genuine human empathy in patient care.

Additionally, the viewpoint "Compassionate Machines: The Ethics of 'Artificial Empathy' in Cancer Care," published in JAMA Oncology in May 2024, explores the ethical considerations of integrating AI-generated empathy into oncology practice. This piece highlights the potential benefits and challenges of using AI to simulate compassionate communication in sensitive medical contexts.

These publications collectively address the evolving role of AI in medical communication, particularly concerning the use of standardized empathetic responses in difficult situations.


###

None of the above was exactly what I was looking for. Adjacent to the above, JAMA suggested "physicians and AI chatbot responses..." (Chen et al., JAMA Oncol 2024).

https://jamanetwork.com/journals/jamaoncology/fullarticle/2818765

###

Notes - resources found; none is the first-person essay arguing that, whether AI-generated or memorized, there are certain time-tested statements to say in a difficult situation.

###

Comparing physician and artificial intelligence chatbot responses to patient questions posted to a public social media forum.
Ayers. JAMA Intern Med 183:589.

Machine-made empathy? Why medicine still needs humans.
Cadiente. JAMA Intern Med 183:1278.

Machine-made empathy? Reply.
Ayers. JAMA Intern Med 183:1279.

Compassionate machines: the ethics of "artificial empathy" in cancer care.
Koranteng. JAMA Oncol 10:847.

Physicians and artificial intelligence responses to cancer questions from social media.
Chen. JAMA Oncol 10:956.




