Can a bot fix your blues?

When you are stressed or anxious, you could try chatting with this new app. It may help you relax or direct you to a therapist, but it is a work in progress

There is a shortage of therapists; a quick calculation shows there is just one for every 20 lakh people. Add to this the stigma and fear people have of approaching a mental health professional, and the app Wysa could not have come at a better time.


The app, now in its beta version, has been launched by Jyotsana Aggarwal and Ramakant Vempati, founders of Touchkin, a startup that develops technologies for behavioural health support, particularly for people with chronic illness and depression. Wysa, their latest product, is essentially a chatbot that provides algorithm-driven counselling, and Jyotsana says there is also a provision to connect users to professional counsellors and healthcare providers.


The app and its interactions are constantly reviewed by an advisory board of counsellors and clinical psychologists, the founders say. The lead psychologist on the panel is Smriti Sawhney, who is certified in tele-mental health techniques. “At the moment, Wysa targets behavioural health issues like stress, anxiety and depression which are widespread, unacknowledged or untreated,” she says.

Illustration  Saai
Illustration  Saai

Who is Behind This?
Jyotsana and Ramakant graduated from top engineering and B-schools – IIT, IIM and London Business School – and were working for leading corporates till six years ago. Jyotsana was MD of Pearson Learning Solutions, and Ramakant was at Goldman Sachs in London. Both wanted to be part of social change and moved to a UN-backed enterprise. “Ramakant helped set up lending platforms – including Somalia’s first bank and Yemen’s first youth microfinance product,” says Jyotsana.

 
When the couple returned to India to care for Ramakant’s ageing father, they began seriously engaging with mental health care. They developed StayCare to help family members monitor the health of loved ones and foresee depressive tendencies. “We could alert people, but what then? Very few people would actually seek professional support,” she says. Wysa is meant to be an answer to that.

Man vs Technology
Wysa cannot replicate a session with a therapist, though. “As with a real therapist, you will have to put in the work – there are no easy answers,” says Jyotsana. “Unlike a real therapist, it won’t analyse your thoughts for you, but it will help you do this for yourself. For instance, if you are feeling anxious it will help you analyse your worries, get some perspective with mindfulness, or address it physically through breathing and exercise... Over time the algorithm learns what works better for you, and encourages you to build those techniques into your daily routine.”

Counsellors are more guarded when asked to comment on such an app’s efficacy. Mahesh Natarajan, a counsellor at InnerSight, says that our online habits have generated enough data to mimic therapy. “At the outset, with growing AI capability and availability of enormous personal data through one’s online presence, there is much that is available for a bot-based chat model of therapy,” he says. The app itself, he says, “does not inspire any confidence. It is way too chatty and algorithm-based.”


Mahesh adds: “I remember reading about ELIZA which was a pretty successful experiment in this vein even if all it did was paraphrase.” ELIZA, created in the sixties at MIT, was built to illustrate how superficial communication between a machine and a person can be. People who interacted with it were deluded into believing that the machine understood them when all it did was replicate conversation patterns.


He says that in time such systems can provide a “semblance of support and help guide people through coded therapeutic activity”. Mahesh hopes the app will catch the moments when a user needs more personal support.

Helpline Support
What if the app misses this cue, and a chronic condition spirals out of control? “We may not catch all the issues, and will keep getting better at it, but the interaction is designed to avoid areas where we may worsen the situation,” says Jyotsana. “If someone is suicidal or thinking of extreme steps, we try to listen for it and connect them to a helpline.”

The app aims to help people manage negative emotions with cognitive behavioural techniques or breathing exercises. “People who are dealing with severe depression, with abuse or self-harm will need in-person support,” she says. “Our algorithms try to identify these situations and provide information about therapists and helplines that they can contact.”


Mahesh also raises the concern that such tech solutions could distance people from their immediate reality. “To have a non-human play the role of a therapist leaves me wondering if it can deepen disengagement with society and worsen inter-personal relationships,” he says, pointing to a situation like that in the movie Her.
