Therapy chatbots might be worsening your mental health

Mental health cases have risen sharply in the wake of the Covid-19 pandemic. According to a study published in The Lancet last year, the pandemic triggered an estimated 76 million additional cases of anxiety and 53 million additional cases of depression worldwide.

Therapy chatbots are filling the gaps

In a world where mental health resources are limited, therapy chatbots are increasingly addressing the shortfall. Mental health chatbot app Wysa, launched in 2016, has been described as an 'emotionally intelligent' artificial intelligence (AI) therapy chatbot and currently has three million users. It is being used in some London schools, while the UK's National Health Service (NHS) is running randomized controlled trials (RCTs) to see whether the app could help those on NHS mental health waiting lists.

In Singapore, Wysa was licensed by the government at the peak of the Covid-19 pandemic in 2020. As of June this year, the app has received device designation from the US Food and Drug Administration (FDA) to treat anxiety and depressive disorders.

The market is currently unregulated

It is still unclear exactly how a mental health chatbot helps a patient, and the research on these apps is limited and often conducted by the companies that created them. The therapy chatbot market is unregulated and may be creating little more than the illusion of help. Most therapy chatbots are not required to obtain government approval, and the FDA even loosened its rules around mental health apps during the pandemic to enable remote mental health support.

Clearly, there need to be stricter regulations and rules about what these bots can and cannot say. Woebot, a chatbot app based on both clinical research and AI, has had one of the most controversial launches. In 2018, when a user told the app that they were a minor being forced to have intercourse and asked for help, it simply replied, "It shows me how much you care about connection and that's beautiful."

The release of mental health chatbots should be limited until there is empirical evidence to support their use and effectiveness. At present, these platforms may be doing more harm than good. But with better research and regulation, mental health chatbots could play a bigger role in the mental health care system. Perhaps their effectiveness could be enhanced by combining AI with human intelligence.
