

Do bots understand trauma? Inside the world of AI in mental health

May 1, 2025

Written by Sumit Singla

Trigger warning: Mentions of anxiety, grief, and suicidal thoughts.

The glow of her phone screen was the only light piercing through the dark. Shadows clung to the walls. Rachita* couldn’t breathe. Her chest tightened. Panic, familiar and relentless, curled its way back into her throat.

Not again.

Her fingers hovered over the phone, over the app. Serene AI promised comfort. No questions. No judgment. No shame. Just help and understanding. She hesitated. But desperation won. The app felt like a lifeline. Yet even in that moment, a quiet doubt surfaced:

Can an algorithm really understand pain?

One in every seven Indians is estimated to be living with a mental health disorder (Shidhaye, 2020). That works out to nearly 2.5 million people in need of help in the city of Delhi alone.

But with an abysmal practitioner-patient ratio of 0.75 per 100,000, those 2.5 million people are likely served by fewer than 20 mental health professionals. And even when help is available, stigma remains a massive barrier. Nearly three out of four people believe individuals with mental illnesses shouldn’t be given responsibility, and blame mental health conditions on a lack of willpower (Thomas, 2018).

No wonder people are reluctant to reach out for help. And the ones who do don’t always find it.

A growing body of research from institutions like Stanford and MIT suggests AI can replicate therapeutic techniques like cognitive reframing and mood tracking with reasonable consistency. But researchers warn: these tools function more like first aid than long-term treatment.


Can AI evolve enough to lead folks like Rachita back from the brink? AI-powered mental health apps promise support to our underserved millions. They’re accessible 24×7. No traffic to deal with. No lengthy explanations to parents, colleagues, or friends.

But is AI the hero we need for our mental health crisis? Or just a temporary Band-Aid on a deep wound?

A lifeline in the dark?

“I had tried therapy before. It was expensive, required too much coordination, and… made me uncomfortable. Sitting across from a stranger and unpacking my deepest fears? No, thanks.”

Aakash, a 34-year-old consultant in Delhi, wasn’t sold on traditional therapy.

“Plus, I was always on the move, traveling to client locations and working long hours. Consulting someone in person sounded more stressful than helpful.”

AI, on the other hand, was easy. No awkward silences. No feeling like a burden. The VOS.health app became his go-to on bad days. It helped him structure his thoughts, journal his emotions, and track patterns in his anxiety. Like Wysa, Calm, or Amaha, it offered self-guided support through journaling and mood tracking. He convinced himself he didn’t need a human therapist. The app was enough.

“That’s what I told myself. Until my father passed away.”

The loss hit harder than he expected. He turned to the app, pouring his grief into the chat. He typed about his guilt and his regrets.

“The app responded with soft reassurances: Loss can be difficult. Be kind to yourself.”

It offered meditation exercises. Sent articles on grief. Reminded him that healing takes time.

“But its responses left me cold. There was nothing wrong with the words, but they weren’t right.”

It didn’t ask who his father was. What memories hurt the most. What he wished he could say. It responded to inputs, not pain. It didn’t grasp the texture of his grief.

“That night, I wished I had a real person to talk to. I knew… AI could be a crutch, not a cure.”

This reflects a broader limitation of Large Language Models (LLMs) in AI therapy: while they can generate coherent responses, they cannot process emotional subtext or provide attuned responses based on previous sessions or individual history. In a 2023 paper published in Nature Human Behaviour, scientists found that while AI chatbots like Woebot can reduce immediate anxiety levels in users, they lack the ability to build the “therapeutic alliance” that is a strong predictor of recovery.

When AI opens the door

For 52-year-old Meenal*, a self-employed architect in Mumbai, AI was less of a solution, more of a starting point.

“In our times, we didn’t have all this talk around mental health. We just handled things,” she says.

When she tried talking to friends about her anxiety and low moods, most of them scoffed. Some offered the usual advice:

“Go get a drink, yaaaaaar. Or maybe, go on a date.”

She didn’t think a therapist would understand. Most seemed to be in their mid-twenties. Opening up to someone half her age felt like another hurdle.

So when she kept hearing people say, “There’s an app for everything,” she decided to download one. Out of curiosity. Just to see.

At first, the responses felt generic, robotic. But over time, Meenal found herself thinking more deeply about long-buried thoughts. The app nudged her gently. “Maybe it’s time to talk to someone.” Eventually, she booked an appointment at the Amaha Mental Health Centre in Bandra.

“I was so nervous I nearly walked out. But I stayed. I wouldn’t have come if it weren’t for AI. It made therapy feel… less scary.”

According to the 2022 Deloitte Global Health Survey, digital apps are often a “first step” toward formal care for many users, especially those from older or marginalized groups. They help bridge the accessibility gap and ease apprehension about therapy.

When insight wasn’t enough

Sneha*, based in Germany, had been in therapy for years. But she was curious. Could ChatGPT become a therapist on demand?

AI seemed to offer everything she needed: low cost, high convenience, and zero logistics.

It felt promising at first. She got neat summaries of coping mechanisms and behavioral patterns. But soon, the lack of depth began to show. The advice was accurate, sure. But the voice behind it? Empty.


“The conversations lacked memory, nuance, and personalization. It never felt… real.”

She also used Mindsera, a journaling and mood-tracking app that included crisis support. Still, when she shared deeply personal moments, the replies felt mechanical.

“It’s an algorithm – it gives what you feed it. That can be counterproductive. It does prompt you to seek human help. But that isn’t enough.”

AI, Sneha realised, could guide her. But it couldn’t truly see her.

“I still use ChatGPT. But the nuance and skill of a human therapist can’t be replicated. At least, not yet.”

The therapist’s dilemma

Dr Meera Khanna* has spent nearly 15 years helping people unpack trauma, anxiety, and grief.

“In each session, I listen between the words. For silence. For a trembling voice. For fidgety hands.”

She prides herself on understanding the subtleties that make each person unique. So when a patient casually mentioned using an AI-powered mental health app, her first reaction was one of alarm.

“Therapy is about connection. About knowing when silence is more powerful than words, when to push, and when to pause. An algorithm can’t replicate that.”

Dr Khanna isn’t alone. Many mental health professionals worry AI will lure people away from real help.

“Imagine someone with childhood trauma,” she says. “They tell the app they feel unworthy. The app replies with a positive affirmation. But what if that reinforces their feeling that no one truly sees them?”

What if they stop seeking real help? Dr Khanna acknowledges the usefulness of AI. It can offer immediate support, especially when professionals aren’t available. But she worries: will people treat it as a stepping stone to real therapy or as a substitute for it?

She also raises concerns around confidentiality. “Mental health data is sensitive. Who sees it? Who owns it?”

A 2022 World Economic Forum briefing warned that many apps fail to clarify data privacy protocols, raising concerns about user confidentiality, especially among vulnerable populations.

A helping hand, not a replacement

AI-powered mental health platforms aren’t the enemy. But they aren’t the answer, either.

They are tools. Tools that can offer a hand in the dark but not a way out of it. Tools that can support, but not substitute. Tools that may nudge someone toward therapy, but not walk them through healing.

In India, a 2021 review by the National Institute of Mental Health and Neurosciences (NIMHANS) pointed to the rise in digital-first interventions but emphasized that AI tools should be viewed as triage, not treatment. They are most effective when used to increase access, reduce stigma, and funnel users into the formal care system.

For Aakash, it was a temporary crutch. For Meenal, a gentle bridge. For Sneha, a helpful tool that fell short. For Dr Khanna, a cautionary tale. And for Rachita, it was something to hold onto in the dark, even if only for a while.

The question isn’t whether we choose between AI and human therapy. It’s how we build a system where both work together. For Rachita, in that moment of panic, the app was a light. Not the sun, maybe, but enough to get through the night.

And sometimes, that is enough.

*Names changed for privacy.

📱 Thinking of using an AI app for your mental health? Here’s what to keep in mind

From journaling prompts to mindfulness meditations to AI-powered chatbots, there’s no shortage of digital mental health tools in India today. Popular names include:

  • Wysa – A chatbot that helps you track emotions and build emotional resilience.
  • Calm / Headspace – Mindfulness and meditation platforms, useful for managing stress.
  • Amaha (InnerHour) – Offers structured programs and therapist access.
  • MANAS – A government-led app for mental well-being and awareness.

Before choosing one, ask yourself:

  • Who built it? Is it backed by mental health professionals or research?
  • Where does your data go? Read the privacy policy before you sign up.
  • What do you need? A mood tracker? A journaling tool? A therapist in your pocket?
  • How much does it cost? Some are free. Others are pricey subscriptions.


Tags: AI therapy, mental health, therapy, tools and tech
