An increasing number of young people in the city are consulting and confiding in artificial intelligence in times of emotional turmoil, alongside therapy
“I was having a really stressful day and I put down an emotional problem I was facing on ChatGPT. Surprisingly, it gave me a really interesting perspective, where I took action and changed the problem I was facing. I thought that was amazing and started using it more and more,” says Kimsuka Iyer, a 35-year-old creative consultant who began using ChatGPT a few months ago to document her thoughts, dreams, and even therapy notes in between her monthly therapy sessions.
Iyer is not alone in this experience. With the rise of AI chatbots and generative AI in the last year or two, Dr Venkatesh Babu, a psychiatrist and founder of the health-tech company Compathy Health, has noted an increasing number of young people consulting AI for therapeutic purposes. “We have noticed more and more people using AI – around 5-10 people among the 50 we see every day, mostly Gen Z. There’s a good proportion using AI before meeting a therapist, and also using it in sync with an ongoing therapy process after a couple of sessions,” he says. The data backs this up: the recently released Annual Student Quest Survey by the IC3 Institute and FLAME University reveals that 85 per cent of students turn to AI for counselling, particularly career counselling, 62 per cent of counsellors are incorporating AI into their work, and 74 per cent believe it enhances the counselling process.
Why AI?
But why exactly would someone confide in a machine with their deepest fears before seeking out a human being? In a country with only 0.75 mental health professionals per 1 lakh people, according to the National Mental Health Survey, the draw for most people seems to be easy accessibility, affordability (per-session therapy costs can run upwards of ₹2,000), and freedom from judgement. “With AI, you don’t have to book an appointment, you don’t have to call – it’s just one click away. And people from my generation, we prefer chatting compared to a phone call,” says Anne Ananya, a student who used ChatGPT to ‘vent it all out’ at a time when she felt overwhelmed and had suicidal thoughts.
According to mental health professionals in the city, AI seems to be effective for issues that can be resolved with simple interventions. “For now at least, I’d think that common mental health and day-to-day functioning related concerns that would be amenable to low grade counselling interventions have a potential for AI based therapy support,” says Dr Eesha Sharma, a child and adolescent psychiatrist at NIMHANS.
A nuanced approach
For more complex issues, Shantha P, a psychology Master’s student, remains skeptical. “Therapy is not a momentary thing; you have to get deeply involved in it, and I don’t think AI can be that comprehensive. Gut feelings, intuition, and sensing things – only a human can do that,” she says.
The future of AI in therapy seems to lie in integration into therapeutic settings rather than complete replacement, notes Dr Babu. “AI can help with clinical decision support in many ways. It could also help analyse and treat complex cases by identifying patterns, markers and decision-making processes,” he says.
Exercise caution
When it comes to individuals using AI for mental health concerns, Dr Sharma stresses that a therapist’s presence remains the cornerstone of the process, cautioning against overdependence and advocating for taking time to reflect instead of immediately consulting AI. “Time for reflection is also needed alongside therapy. That part is missing when you have your therapist in your pocket,” she says.
What happens to your data?
Confidentiality, a crucial condition for effective therapy, is also a concern, since companies use information entered into AI models to train them. In the context of therapy, this is extremely private and sensitive information. “Sensitive mental health data is at risk of breaches, misuse or unauthorised access, especially if it is stored improperly in small startups without robust frameworks in place. The Digital Personal Data Protection (DPDP) Act passed in 2023 is in place and the draft rules are out. The new rules, set to be notified soon, will provide additional protection. Right now, the IT Act and the Bharatiya Nyaya Sanhita provide some protection for data privacy,” says Maj. Vineet Kumar, founder and global president of the NGO CyberPeace.
He adds, “AI models can inherit bias from their training data, leading to unfair or harmful advice, particularly towards marginalised users. There is also a lack of regulation and standardisation, with no universal guidelines for AI therapy, leading to inconsistencies in quality.”