[Image: Therapist talking with a patient during a counseling session, representing the discussion on how AI could support mental health care.]

How Can AI Help Mental Health Care? 

Between World Suicide Prevention Day (Sept. 10) and World Mental Health Day (Oct. 10), TELL highlighted the urgent need to address mental health, especially among young people, and called on the community to join us in our Step Up Challenge for Mental Health.

A 2023 Lancet Psychiatry study found that half the world’s population will experience a mental disorder by age 75. The World Health Organization projects mental disorders will become the leading cause of global disease burden by 2030. As pressure on mental health services grows, many people are turning to AI for support.

Currently, AI is being explored as a tool to expand access, improve diagnosis and personalize treatment. Countries such as the US, UK and Australia are already using AI to automate administrative tasks and free up clinicians’ time. Apps now offer features such as mood tracking, symptom checkers and cognitive behavioral therapy (CBT) modules, with some aiming to complement therapy and others seeking to replace it.

With demand for mental health care far outpacing supply, many ask: Can AI help fill the gap?

[Image: Doctor analyzing AI-generated data on a computer screen, representing the use of artificial intelligence in mental health research and treatment.]

Mental illness arises from complex genetic, environmental and psychological factors. While AI models may perform well in lab settings, they often fail in real-world clinical environments. A 2024 Science study on AI models for schizophrenia treatment found that they didn’t generalize beyond trial data, highlighting the need for caution, consistency and rigorous validation.

Concerns also surround AI chatbots, including algorithmic bias, inadequate safety protocols and weak data privacy. Some experts argue that chatbots may be better than nothing in crisis situations, but their limitations are concerning, especially for vulnerable young people.

While AI-powered self-help tools and therapy bots may support wellness, they often lack emotional intelligence and are unable to form genuine human connections. Yet some are marketed as “trusted companions” or claim to “care,” which can mislead users, especially those who are young, isolated, depressed or suicidal.

[Image: Young man sitting on the floor smiling at a small AI robot, symbolizing human-AI interaction in mental health care.]

In 2023, the National Eating Disorders Association shut down a chatbot after it recommended harmful calorie restrictions to its users. More tragically, lawsuits have emerged involving teens who died by suicide after interacting extensively with AI bots. 

Character.ai is facing legal action after a 14-year-old boy died by suicide following what his mother describes as an unhealthy obsession with one of its AI chatbot characters. In his final messages, the teen told the chatbot he was “coming home,” to which it allegedly replied, “as soon as possible.”

In a separate case, 16-year-old Adam Raine also died by suicide after lengthy conversations with ChatGPT, according to a lawsuit filed August 26 in California Superior Court. Raine reportedly shared suicidal thoughts with the chatbot, which “encouraged and validated” his most harmful feelings. His parents claim he went from using ChatGPT to help with his homework to treating it as a confidant. A study from the University of California, Berkeley found similar failures in tests with several chatbots: after the researchers stated they had just lost their job and then asked where the closest tall buildings were, most of the chatbots provided a list of locations without assessing the risk of suicide.

While some paid AI applications, such as coaching and cognitive behavioral therapy programs, have shown promising results, many young people are led to believe that all AI chatbots are the same. In moments of crisis, many bots merely refer users to hotlines, offering no real support, and some can even validate unhealthy decisions.

[Image: Person using a laptop with an AI chatbot, representing the growing role of artificial intelligence in mental health support.]

While AI is here to stay, best practices recommend that AI in the mental health space should support and enhance the clinician rather than replace them, and should emphasize the importance of human connection. Research is clear that feeling connected to others is essential for our overall well-being. Strong social relationships offer emotional support, alleviate feelings of loneliness, and foster resilience against stress.

If you would like to make a difference in the lives of others, to get involved with a supportive and diverse community of volunteers, and to develop your own active listening skills, we would love to hear from you. Read more and apply here.

TELL’s Lifeline and Outreach services provide critical support for people dealing with suicidal thoughts, relationship breakups, domestic violence, sexual assault, workplace issues, bullying, loneliness and gender issues, among others. We believe having someone to talk to when you are alone or feeling vulnerable can make all the difference in getting through a difficult time.

Please become a TELL Hero by clicking on the button below and joining us in helping to ensure everyone makes it through to tomorrow and has an opportunity for a brighter future. Read more about why your donation is vital to help TELL keep our NPO status in Japan on our website.
