AI therapy may help with mental health, but innovation should never outpace ethics

May 15, 2025

Written by Ben Bond

Mental health services around the world are stretched thinner than ever. Long wait times, barriers to accessing care and rising rates of depression and anxiety have made it harder for people to get timely help.

As a result, governments and healthcare providers are looking for new ways to address this problem. One emerging solution is the use of AI chatbots for mental health care.

A recent study explored whether a new type of AI chatbot, named Therabot, could treat people with mental illness effectively. The findings were promising: not only did participants with clinically significant symptoms of depression and anxiety benefit, but those at high risk for eating disorders also showed improvement. While early, this study may represent a pivotal moment in the integration of AI into mental health care.

AI mental health chatbots are not new – tools like Woebot and Wysa have already been released to the public and studied for years. These platforms follow rules based on a user’s input to produce a predefined approved response.
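
To make that distinction concrete, here is a minimal, hypothetical sketch of the rule-based pattern: every reply is selected from a fixed set of pre-approved responses, so the system can never say anything its designers did not vet. This illustrates the general approach only, not how Woebot or Wysa are actually implemented.

```python
# Minimal sketch of a rule-based chatbot: every reply comes from a fixed,
# pre-approved set, so nothing unvetted can ever be said.
# Illustrative only -- not Woebot's or Wysa's actual implementation.

APPROVED_RESPONSES = {
    "anxious": "It sounds like you're feeling anxious. Would you like to try a breathing exercise?",
    "sad": "I'm sorry you're feeling low. Could you tell me more about what's been happening?",
}
FALLBACK = "I want to make sure I understand. Could you say more about how you're feeling?"

def reply(user_input: str) -> str:
    text = user_input.lower()
    for keyword, response in APPROVED_RESPONSES.items():
        if keyword in text:
            return response  # predefined, designer-approved wording
    return FALLBACK          # safe default when no rule matches

print(reply("I've been feeling really anxious lately"))
```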

What makes Therabot different is that it uses generative AI – a technique where a program learns from existing data to create new content in response to a prompt. Consequently, Therabot can produce novel responses to a user’s input, much like popular chatbots such as ChatGPT, allowing for more dynamic and personalised interaction.
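
A generative chatbot works differently: it assembles the conversation into a prompt and returns whatever the model writes. In the sketch below, `call_language_model` is a hypothetical stand-in for any large language model API; the point is that the output is open-ended rather than drawn from an approved list.

```python
# Contrast: a generative chatbot builds a prompt and returns whatever the
# model produces -- the response is novel, not drawn from an approved list.
# `call_language_model` is a hypothetical stand-in for any LLM API.

def call_language_model(prompt: str) -> str:
    # Placeholder: a real system would call a hosted model over an API;
    # here it just echoes, so the sketch runs without any service.
    return f"[model-generated reply to: {prompt!r}]"

def generative_reply(user_input: str, history: list[str]) -> str:
    prompt = (
        "You are a supportive mental-health companion.\n"
        + "\n".join(history)
        + f"\nUser: {user_input}\nAssistant:"
    )
    return call_language_model(prompt)  # output is open-ended by design

print(generative_reply("I've been feeling really anxious lately", history=[]))
```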

This isn’t the first time generative AI has been examined in a mental health setting. In 2024, researchers in Portugal conducted a study where ChatGPT was offered as an additional component of treatment for psychiatric inpatients.

The research findings showed that just three to six sessions with ChatGPT led to a significantly greater improvement in quality of life than standard therapy, medication and other supportive treatments alone.

Together, these studies suggest that both general and specialised generative AI chatbots hold real potential for use in psychiatric care. But there are some serious limitations to keep in mind. For example, the ChatGPT study involved only 12 participants – far too few to draw firm conclusions.

In the Therabot study, participants were recruited through a Meta Ads campaign, likely skewing the sample toward tech-savvy people who may already be open to using AI. This could have inflated the chatbot’s apparent effectiveness and engagement levels.

Ethics and exclusion

Beyond methodological concerns, there are critical safety and ethical issues to address. One of the most pressing is whether generative AI could worsen symptoms in people with severe mental illnesses, particularly psychosis.

A 2023 article warned that generative AI’s lifelike responses, combined with most people’s limited understanding of how these systems work, might feed into delusional thinking. Perhaps for this reason, both the Therabot and ChatGPT studies excluded participants with psychotic symptoms.

But excluding these people also raises questions of equity. People with severe mental illness often face cognitive challenges – such as disorganised thinking or poor attention – that might make it difficult to engage with digital tools.

Ironically, these are the people who may benefit the most from accessible, innovative interventions. If generative AI tools are only suitable for people with strong communication skills and high digital literacy, then their usefulness in clinical populations may be limited.

There’s also the possibility of AI “hallucinations” – a known flaw that occurs when a chatbot confidently makes things up – like inventing a source, quoting a nonexistent study, or giving an incorrect explanation. In the context of mental health, AI hallucinations aren’t just inconvenient; they can be dangerous.

Imagine a chatbot misinterpreting a prompt and validating someone’s plan to self-harm, or offering advice that unintentionally reinforces harmful behaviour. While the studies on Therabot and ChatGPT included safeguards – such as clinical oversight and professional input during development – many commercial AI mental health tools do not offer the same protections.
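
For illustration, a safeguard of this kind might sit between the user and the model, screening each message for crisis language and escalating to a human before any generated reply is sent. The sketch below is deliberately simplified: real deployments rely on clinically validated risk assessment rather than keyword matching, and `notify_clinician` is a hypothetical escalation hook.

```python
# Simplified illustration of one safeguard layer: screen each message for
# crisis language and route it to a human before any model reply is sent.
# Real systems use clinically validated risk models, not keyword lists.

CRISIS_TERMS = ("hurt myself", "end my life", "self-harm", "suicide")
HELPLINE_MESSAGE = (
    "It sounds like you may be in distress. A human counsellor is being "
    "notified, and you can reach a crisis helpline right now."
)

def notify_clinician(message: str) -> None:
    # Hypothetical escalation hook: page an on-call clinician, log the event.
    print(f"[ALERT] flagged for human review: {message!r}")

def safeguarded_reply(user_input: str, model_reply: str) -> str:
    if any(term in user_input.lower() for term in CRISIS_TERMS):
        notify_clinician(user_input)
        return HELPLINE_MESSAGE  # never let the model improvise here
    return model_reply           # otherwise pass the generated reply through

print(safeguarded_reply("I want to hurt myself", "model text"))
```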

That’s what makes these early findings both exciting and cautionary. Yes, AI chatbots might offer a low-cost way to support more people at once, but only if we fully address their limitations.

Effective implementation will require more robust research with larger and more diverse populations, greater transparency about how models are trained and constant human oversight to ensure safety. Regulators must also step in to guide the ethical use of AI in clinical settings.

With careful, patient-centred research and strong guardrails in place, generative AI could become a valuable ally in addressing the global mental health crisis – but only if we move forward responsibly.

Ben Bond, PhD Candidate in Digital Psychiatry, RCSI University of Medicine and Health Sciences

This article is republished from The Conversation under a Creative Commons license. Read the original article.

