

The AI therapist will see you now: Can chatbots really improve mental health?

July 15, 2025

Written by Pooja Shree Chettiar

Recently, I found myself pouring my heart out, not to a human, but to a chatbot named Wysa on my phone. It nodded – virtually – asked me how I was feeling and gently suggested trying breathing exercises.

As a neuroscientist, I couldn’t help but wonder: Was I actually feeling better, or was I just being expertly redirected by a well-trained algorithm? Could a string of code really help calm a storm of emotions?

Artificial intelligence-powered mental health tools are becoming increasingly popular – and increasingly persuasive. But beneath their soothing prompts lie important questions: How effective are these tools? What do we really know about how they work? And what are we giving up in exchange for convenience?

Of course, it’s an exciting moment for digital mental health. But understanding the trade-offs and limitations of AI-based care is crucial.

Stand-in meditation and therapy apps and bots

AI-based therapy is a relatively new player in the digital therapy field. But the U.S. mental health app market has been booming for the past few years, ranging from free apps with tools that text you back to premium versions with added features such as guided breathing prompts.

Headspace and Calm are two of the most well-known meditation and mindfulness apps, offering guided meditations, bedtime stories and calming soundscapes to help users relax and sleep better. 

Talkspace and BetterHelp go a step further, offering actual licensed therapists via chat, video or voice. The apps Happify and Moodfit aim to boost mood and challenge negative thinking with game-based exercises.


Somewhere in the middle are chatbot therapists like Wysa and Woebot, using AI to mimic real therapeutic conversations, often rooted in cognitive behavioral therapy. These apps typically offer free basic versions, with paid plans ranging from US$10 to $100 per month for more comprehensive features or access to licensed professionals.

While not designed specifically for therapy, conversational tools like ChatGPT have sparked curiosity about AI’s emotional intelligence.

Some users have turned to ChatGPT for mental health advice, with mixed outcomes, including a widely reported case in Belgium where a man died by suicide after months of conversations with a chatbot. Elsewhere, a father is seeking answers after his son was fatally shot by police, alleging that distressing conversations with an AI chatbot may have influenced his son’s mental state. These cases raise ethical questions about the role of AI in sensitive situations.

Where AI comes in

Whether your brain is spiraling, sulking or just needs a nap, there’s a chatbot for that. But can AI really help your brain process complex emotions? Or are people just outsourcing stress to silicon-based support systems that sound empathetic?

And how exactly does AI therapy work inside our brains?

Most AI mental health apps promise some flavor of cognitive behavioral therapy, which is basically structured self-talk for your inner chaos. Think of it as Marie Kondo-ing your mind, after Marie Kondo, the Japanese tidying expert known for helping people keep only what “sparks joy.” You identify unhelpful thought patterns like “I’m a failure,” examine them, and decide whether they serve you or just create anxiety.

But can a chatbot help you rewire your thoughts?

Surprisingly, there’s science suggesting it’s possible. Studies have shown that digital forms of talk therapy can reduce symptoms of anxiety and depression, especially for mild to moderate cases. In fact, Woebot has published peer-reviewed research showing reduced depressive symptoms in young adults after just two weeks of chatting.

These apps are designed to simulate therapeutic interaction, offering empathy, asking guided questions and walking you through evidence-based tools. The goal is to help with decision-making and self-control, and to help calm the nervous system.
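
To make that concrete, here is a deliberately simplified sketch of the identify-and-examine loop these apps automate. The keyword rules, the DISTORTIONS table and the cbt_reply function are all invented for illustration; this is not the code of Wysa, Woebot or any real product, which rely on trained language models and clinically reviewed scripts.

```python
import re

# Invented keyword rules mapping possible cognitive distortions to
# Socratic follow-up questions. Real apps use trained NLP models and
# clinician-reviewed content rather than regex matching.
DISTORTIONS = {
    r"\b(always|never)\b": "That sounds absolute. Can you recall one exception?",
    r"\bfailure\b": "What would you say to a friend who called themselves a failure?",
    r"\b(should|must)\b": "Is that a rule you chose, or one you feel stuck with?",
}

def cbt_reply(message: str) -> str:
    """Return a guided question for the first distortion pattern found."""
    for pattern, prompt in DISTORTIONS.items():
        if re.search(pattern, message, re.IGNORECASE):
            return prompt
    return "Tell me more about what's on your mind."

print(cbt_reply("I'm a failure and I always mess things up."))
# -> "That sounds absolute. Can you recall one exception?"
```

However crude, the skeleton is the same one a CBT worksheet uses: spot a potentially distorted thought, then answer it with a question that invites the user to examine it rather than accept it.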

The neuroscience behind cognitive behavioral therapy is solid: It’s about activating the brain’s executive control centers, helping us shift our attention, challenge automatic thoughts and regulate our emotions.

The question is whether a chatbot can reliably replicate that, and whether our brains actually believe it.

A user’s experience, and what it might mean for the brain

“I had a rough week,” a friend told me recently. I asked her to try out a mental health chatbot for a few days. She told me the bot replied with an encouraging emoji and a prompt generated by its algorithm to try a calming strategy tailored to her mood. Then, to her surprise, it helped her sleep better by week’s end.

As a neuroscientist, I couldn’t help but ask: Which neurons in her brain were kicking in to help her feel calm?

This isn’t a one-off story. A growing number of user surveys and clinical trials suggest that cognitive behavioral therapy-based chatbot interactions can lead to short-term improvements in mood, focus and even sleep. In randomized studies, users of mental health apps have reported reduced symptoms of depression and anxiety – outcomes that closely align with how in-person cognitive behavioral therapy influences the brain.


Several studies show that therapy chatbots can actually help people feel better. In one clinical trial, a chatbot called “Therabot” helped reduce depression and anxiety symptoms by nearly half – similar to what people experience with human therapists. Other research, including a review of over 80 studies, found that AI chatbots are especially helpful for improving mood, reducing stress and even helping people sleep better. In one study, a chatbot outperformed a self-help book in boosting mental health after just two weeks.

While people often report feeling better after using these chatbots, scientists haven’t yet confirmed exactly what’s happening in the brain during those interactions. In other words, we know they work for many people, but we’re still learning how and why.

Red flags and risks

Apps like Wysa have earned FDA Breakthrough Device designation, a status that fast-tracks promising technologies for serious conditions, suggesting they may offer real clinical benefit. Woebot, similarly, runs randomized clinical trials showing improved depression and anxiety symptoms in new moms and college students.

While many mental health apps boast labels like “clinically validated” or “FDA approved,” those claims are often unverified. A review of top apps found that most made bold claims, but fewer than 22% cited actual scientific studies to back them up.

In addition, chatbots collect sensitive information about your mood metrics, triggers and personal stories. What if that data winds up in third-party hands such as advertisers, employers or hackers, a scenario that has occurred with genetic data? In a 2023 breach, nearly 7 million users of the DNA testing company 23andMe had their DNA and personal details exposed after hackers used previously leaked passwords to break into their accounts. Regulators later fined the company more than $2 million for failing to protect user data.

Unlike clinicians, bots aren’t bound by counseling ethics or privacy laws regarding medical information. You might be getting a form of cognitive behavioral therapy, but you’re also feeding a database.

And sure, bots can guide you through breathing exercises or prompt cognitive reappraisal, but when faced with emotional complexity or crisis, they’re often out of their depth. Human therapists tap into nuance, past trauma, empathy and live feedback loops. Can an algorithm say “I hear you” with genuine understanding? Neuroscience suggests that supportive human connection activates social brain networks that AI can’t reach.

So while bot-delivered cognitive behavioral therapy may offer short-term symptom relief in mild to moderate cases, it’s important to be aware of these tools’ limitations. For the time being, pairing bots with human care, rather than replacing it, is the safest move.

Pooja Shree Chettiar, Ph.D. Candidate in Medical Sciences (Digital Psychiatry), Texas A&M University.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

