

In late August, parents of American teenager Adam Raine sued OpenAI over the death of their son. They alleged that the AI firm’s chatbot, ChatGPT, had encouraged the 16-year-old to take his own life.
The couple produced chat logs showing that when Adam confided in the chatbot about his suicidal thoughts, it, instead of offering help, discouraged him from talking to his parents about his state of mind or from seeking help.
Even more disturbing, the logs reveal, was how it aided the teenager in drafting a suicide note.
The story, no doubt, made headlines across the world. In Kerala, where psychiatrists and mental health professionals were already discussing the unchecked spread of AI into the domain of mental health, the news made those concerns seem as prescient as they were pressing.
Many of us are already using chatbots, whether to find answers to life’s problems and the big mysteries of the universe, or for something as trivial as building a grocery list.
Therapy, too, is in the mix.
This is not a new phenomenon. Even before the advent of AI, people were leaning on Google searches to self-diagnose, experts point out.
“It was a trend, wasn’t it? Your digital personas are out of the reach of social stigma. They aid you in your quest to identify yourself and your nuances, and to build yourself up better,” says Nisha M S, psychologist and assistant professor at Chinmaya Vishwa Vidyapeeth.
But accessibility didn’t always mean correctness.
“There was a period when feeling even a tinge of sadness got many to diagnose themselves as being in a state of depression. The same is the case with being identified as neurodivergent or having ADHD. Now, as more celebrities come forward to talk about their autism, several more have identified themselves as autistic,” she adds.
In addition, people began analysing every break-up, every crisis, every character trait with the help of Google. This is not wrong, but it is merely the first step on a long road. “The problems compound when people miss the crucial next step, i.e. seeking help from a trained therapist, especially a psychiatrist,” Nisha says.
According to her, only a small percentage of people who embark on this therapy journey come through her doors, or those of another trained therapist. “The state of the others remains obscure,” she adds.
This is precisely what worries many mental health professionals — when self-diagnosis wades into dangerous territory.
“Numerous ‘therapy’ centres have mushroomed, both online and offline, since the Covid pandemic. Most are without trained staff or even the right credentials,” points out psychiatrist Dr Chitra Venkateswaran, founder of the Mehak Foundation.
“Therapy can be done online. There’s no problem with it, nor is it secondary to an offline experience. The only matter of concern is: who is providing the therapy,” Dr Chitra explains.
Nisha concurs. “The concerns stem from misdiagnosis, conditions left untreated, or needless treatment pushed where no intervention is necessary. Sadly, India does not yet have a governing or licensing body for mental health therapists,” she says.
This, she points out, affects quality control, monitoring, and research in the field.
Now, coupled with all that, we have AI. Nisha is in the early stages of a study on just that: why people choose AI for therapy, and what the outcomes are.
“One thing I have noticed is that there is no age barrier as such. Youngsters, the elderly, middle-aged — everyone seeks out AI. However, they are not using the dedicated tools built for therapy. Instead, they just lean on generic generative AI chatbots,” she says.
AI chatbots generate responses based on patterns in the queries and tasks put to them so far. “If you use a generic AI chatbot regularly, its responses might be what you want to hear. So it might diagnose you with what you want to be diagnosed with,” Nisha explains.
Recently, a 20-year-old student approached psychiatrist Dr Arun B Nair of Thiruvananthapuram Medical College for a consultation. Earlier, the youngster had sought the help of AI to understand why he felt anxious during class presentations and why he didn’t have many friends. The chatbot diagnosed him with autism. On learning this, the student developed depression and stopped going to college.
Dr Arun categorically says that mental health issues cannot be diagnosed by ticking a few check boxes. “But that is what an AI chatbot does. Autism, social anxiety disorder, depression, ADHD, etc., have overlapping symptoms. AI cannot observe your lifestyle, your behaviour, talk to you in detail about your childhood, observe how you respond, and correctly diagnose you. It can only tick a few boxes and give you an answer,” he says.
The young student, Dr Arun explains, had symptoms of social anxiety, which are easily treatable. “What AI ended up doing was aggravating and adding to his existing problem,” he says.
Dr Chitra chimes in, “It’s not wrong to use AI or Google. It helps you self-identify in some ways. However, you have to seek help from a trained professional afterwards. And the AI should ideally suggest just that.”
A healthy way of using AI, experts suggest, is as an adjunct to therapy with a professional. “For some, it might be helpful, especially under the supervision of a trained therapist. But beware! Using it in isolation might end up doing more damage,” says Dr Arun.
Lack of resources
According to the World Health Organization, there should be 1.7 psychiatrists per 100,000 people. However, as per the Indian Journal of Psychiatry, India has only about 0.75 psychiatrists per 100,000.
There are many reasons why people might seek online therapy or AI, says Nisha. One is the lack of resources, including trained and skilled professionals. “Sometimes, the waiting list for an appointment with a well-known therapist might be as long as three months,” she says. “Then, there is the issue of cost. Therapy is not cheap, especially as experts are very much in demand.” Add to that the social stigma of visiting a mental health professional or a facility.
According to Dr Arun B Nair, psychiatrists and therapists are concentrated mainly in the cities. “The reason many cite is that in rural areas and villages, the stigma remains high, and people are reluctant to approach a therapist. In urban areas, the stigma is considerably lower, as is the fear of someone recognising you,” he says.
How to choose a therapist?
“This is the age of Instagram gurus and life coaches. So, it is important to choose wisely, according to the problem,” says Dr Edwin Peter, founder of the NGO Sex Education Kerala (SEK). He suggests first identifying whether the symptoms have an impact on your daily life. “If not, and you’re able to complete your daily tasks, then you can visit a consultant psychologist,” he says.
To become a consultant psychologist in India, one must first have a bachelor’s degree in psychology, followed by a specialised master’s degree.
“However, if you find that you, or someone near you, have symptoms that severely impact your daily life, mood swings, etc., visit a clinical psychologist or a psychiatrist,” he says. Only a psychiatrist can prescribe medications or other physical treatments, he adds.
In Kerala, there are currently only 136 clinical psychologists registered with the Kerala Mental Health Authority, he says. A clinical psychologist is required to have at least an MPhil in Clinical Psychology, while a psychiatrist requires an MD (Doctor of Medicine) or DNB (Diplomate of National Board) in Psychiatry.
AI Psychosis
Overdependence on AI may itself become a mental health issue, one that experts have named AI psychosis or chatbot psychosis. After using and depending on AI for months, an individual may become unable to separate what is real from what is not. “While some users report psychological benefits, several cases of concern are emerging, including reports of suicide, violence, and delusional thinking linked to perceived emotional relationships with chatbots,” notes one study by Cornell University researchers. “To understand the risk, we need more research. Especially now, given how problems are surfacing. We need to catch up quickly,” says Nisha, who has begun her own study in the field.