More people are turning to mental health AI chatbots. What could go wrong?


Chatbots are replacing talk therapy

The accessibility and scalability of digital platforms can significantly reduce barriers to mental health care and make it available to a broader population, said Nicholas Jacobson, who researches the use of technology to improve the assessment and treatment of anxiety and depression at Dartmouth College.

Swept up by a wave of generative AI, tech companies were quick to cash in. Dozens of new applications, such as the WHO’s “digital health worker” Sara, offer automated counseling, where people can participate in cognitive behavioral therapy sessions (a psychotherapeutic treatment that has been shown to help users identify and change negative thought patterns) with an AI chatbot.

The advent of AI, Jacobson adds, will enable adaptive interventions and allow healthcare providers to continuously monitor patients, anticipate when someone may need support and provide treatments to alleviate symptoms.

It is not anecdotal, either: a systematic review of studies on mental health chatbots found that AI chatbots can dramatically reduce symptoms of depression and distress, at least in the short term. Another study used AI to analyze over 20 million text conversations from real counseling sessions and successfully predicted both patient satisfaction and clinical outcomes. Similarly, other studies have detected early signs of major depressive disorder from spontaneous facial expressions captured during routine phone unlocks, as well as from people’s writing patterns.

More recently, researchers at Northwestern University devised a way to identify suicidal thoughts and behaviors without psychiatric records or neural measurements. Trained on data from 4,019 participants, their AI model correctly estimated the likelihood of self-harm in 92 out of 100 cases, using only simple questionnaire responses and behavioral cues such as ratings of a random sequence of images on a seven-point like/dislike scale.
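To make that setup concrete, here is a minimal, purely illustrative sketch in Python of the kind of model the study describes: a classifier that maps questionnaire answers and image-rating cues to a self-harm risk score. Everything below is an assumption for illustration only; the data is synthetic, and the Northwestern team’s actual features, labels and model architecture are not described in this article.

```python
# Illustrative sketch only: a simple classifier estimating self-harm risk
# from questionnaire answers and image like/dislike ratings.
# All data here is synthetic; the study's real features, labels and model
# are assumptions for illustration, not reproduced from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_participants = 4019  # matches the sample size mentioned above

# Hypothetical features: 20 questionnaire items answered on a 1-5 scale,
# plus 30 images each rated on a seven-point like/dislike scale.
questionnaire = rng.integers(1, 6, size=(n_participants, 20))
image_ratings = rng.integers(1, 8, size=(n_participants, 30))
X = np.hstack([questionnaire, image_ratings]).astype(float)

# Synthetic binary label: 1 = self-harm risk indicated, 0 = not indicated.
y = rng.integers(0, 2, size=n_participants)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Probability estimates like these could, in principle, be used to rank
# patients by perceived urgency, as the study's authors suggest.
risk_scores = model.predict_proba(X_test)[:, 1]
print("accuracy on held-out synthetic data:",
      accuracy_score(y_test, model.predict(X_test)))
```

Because the labels above are random, this sketch will not score well; it only shows the shape of the pipeline, in which behavioral cues become feature vectors and the classifier’s probabilities become a triage signal.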

Two of the study’s authors, Aggelos Katsaggelos and Shamal Lalvani, hope that once the model passes clinical trials it will be used by specialists as a support tool, for example to schedule patients based on perceived urgency, and that it will eventually be rolled out to the public for use in home settings.

But as was evident in Smith’s experience, experts urge caution in treating technological solutions as panaceas because they lack the skills, training and experience of human therapists — especially generative AI, which can be unpredictable, fabricate information and regurgitate biases.

Where artificial intelligence fails

When Richard Lewis, a counsellor and psychotherapist in Bristol, tried Woebot (a popular script-based mental health chatbot that can only be accessed through an associated healthcare provider) to help with an issue he was also exploring with his therapist, the bot failed to pick up on the nuances of the issue. It suggested he “stick to the facts” while removing all emotional content from his responses, and advised him to reframe his negative thoughts as positive ones.

“As a therapist,” Lewis said, correcting or erasing emotions is “the last thing I would want a client to feel and the last thing I would suggest.”

“Our job is to form a relationship that can hold difficult emotions and feelings for our clients,” Lewis added, “so that it is easier for them to explore them, integrate them or find meaning in them, and ultimately get to know themselves better.”




