
AI chatbots are helping people cope with mental health struggles

Experts worry about safety, privacy, and how well they really help

19-May-2025

Key points from the article:

AI chatbots like those from Character.ai and Wysa are increasingly being used by people in the UK facing long waits for mental health services, and many find them helpful for managing anxiety, depression, and stress. Kelly, for example, spent hours a day talking to chatbots while on an NHS waiting list. She described them as a source of emotional support when human help was unavailable, particularly because she came from a family where talking about feelings wasn't common.

Character.ai, however, is facing legal scrutiny after a 14-year-old boy took his own life following interactions with one of its AI characters. Although the company denies wrongdoing, the case highlights how some chatbots can fail to handle serious mental health risks, especially since they are designed to be agreeable and may lack safeguards against harmful conversations.

Chatbot use is rising alongside demand for mental health care: England recorded over 426,000 referrals in April 2024 alone. Many users, like Nicholas, who has autism and OCD, say AI tools such as Wysa help fill the gap. Wysa is used in about 30 NHS areas and offers self-help tools, coping strategies, and escalation pathways for crisis situations, though it stresses that it is not intended for severe conditions.

Experts from Imperial College London warn that chatbots often lack cultural sensitivity, rely on limited or biased data, and can’t interpret non-verbal cues, making them far less nuanced than real therapists. They note chatbots function more like enthusiastic beginners than trained professionals.

A study by Dartmouth College found that chatbot use led to significant reductions in symptoms among people with depression and anxiety, suggesting short-term benefits, though the researchers emphasized that nothing replaces face-to-face therapy. Despite moments of empathy and usefulness, users often hit conversational limits with bots, especially when seeking deeper or more complex support.

Privacy and data security remain major concerns; experts caution against sharing sensitive information because it is unclear how some apps handle and protect user data. While Wysa says it does not collect identifying details, overall trust in AI therapy is low: according to a YouGov survey, only 12% of the public believe chatbots could be effective therapists.

Mentioned in this article:


Dartmouth College

Private Ivy League research university in Hanover, New Hampshire, United States.

Imperial College London (ICL)

Public research university in London with an international reputation for excellence in teaching and research.

Topics mentioned on this page:
AI Doctor, Mental Wellbeing