When 31-year-old Rhea Sharma was going through a rough patch at work, she downloaded a chatbot for help. “It helped me immediately. I could rant about my boss, my anxiety, my sleep issues,” says the Mumbai-based communications professional. The bot always reassured her that her reactions made sense. Over time, it became part of her routine.
Six months later, when she finally began therapy, her psychologist noticed something telling. She was good at expressing her emotions but was stuck in the same patterns, almost always seeking reassurance rather than challenging her behaviour.
The pattern is common in therapy rooms. Experts say that, like Rhea, many clients now arrive with expectations shaped by chatbot-style interactions: minimal friction, instant reassurance, and quick answers.
A 2025 Youth Pulse Survey of around 500 Indians aged 13 to 35 found that nearly 57 per cent use AI tools such as chatbots for emotional support on matters they consider too sensitive for drawing-room discussion.
From a clinical perspective, the pattern is unsurprising and the appeal is easy to understand. India faces a severe shortage of mental health professionals, especially outside urban centres. Therapy can cost anywhere from ₹1,500 to ₹4,000 per session, out of reach for many. Chatbots, by contrast, are always available, often free, judgement-free, and endlessly patient.
For generations more comfortable typing feelings into screens, these tools feel intuitive rather than intrusive. “Many users are typically looking for quick coping strategies, empathetic listening, and anonymity rather than deep clinical intervention,” says psychologist Dhara Ghuntla.
Therapy isn’t just about feeling heard. It involves being challenged, gently reflecting on patterns, and taking responsibility. Dr Santosh Bangar, consultant psychiatrist at Gleneagles Hospital, says, “When emotional support is reduced to constant validation, therapy can lose its role in building strength.”
The distinction between support and treatment is critical. Chatbots cannot offer nuanced understanding, especially for complex or long-standing issues. Many chatbots are designed to maximise engagement. Challenging users too much risks losing them.
Healing demands reflection and discomfort, things no algorithm, at least in its current form, is designed to offer.