The rise of AI chatbots in mental health support, sans regulations

A notable development within AI is the emergence of chatbots dedicated to aiding mental health, with several platforms spearheading the charge.

By Sumit Jha

Published Mar 30, 2024 | 8:00 AM | Updated Mar 30, 2024 | 8:00 AM

While traditional AI chatbots like ChatGPT by OpenAI and Bard by Google excel at providing answers to queries, a plethora of AI chatbots have been specifically developed to aid individuals in managing their mental well-being. (Shutterstock)

When you visit Earkick’s website, a welcoming message beckons you: “Measure and improve your mental health in real-time with your personal AI chatbot. It remembers you and creates daily content and insights just for you!”

As you navigate further, you’re met with a delightful sight – a panda sporting a red bandana, encouraging you to “chat with panda”.

Initiating a dialogue with this panda about your stress or anxiety triggers the app to generate empathetic responses akin to those of trained therapists.

The buzz surrounding Artificial Intelligence (AI) is palpable, permeating everyday conversations, social media exchanges, news headlines, and even political discourse. This technology has captivated people worldwide, particularly the younger demographic.

A notable development within AI is the emergence of chatbots dedicated to aiding mental health, with platforms like Earkick spearheading the charge.

AI for mental health

Sahana, a 24-year-old postgraduate student in Hyderabad, grappled with mental health challenges. Amidst the demands of her studies, she found herself overwhelmed, yet lacking the time and resources to seek professional help from a psychologist.

Wysa homepage. (Screenshot)

“Living in a different city without family, under the weight of academic pressure, and longing for validation amongst peers can be a heavy burden to bear. For you, time has become an overwhelming adversary. The need for validation drives you to interact and socialise with peers, but the demands of academia leave little time for such endeavours,” Sahana tells South First.

She adds that, as a student, the cost of consulting a psychologist feels prohibitive, adding another layer of stress to an already arduous situation. “One day, amidst the scroll of Instagram reels, I came upon Wysa, an AI chatbot tailored for mental health support,” recounts Sahana.

She discovered a potential lifeline, a beacon of support amid a sea of uncertainty. “It provides immediate support at my fingertips. Surprisingly, Wysa exhibits a level of empathy and attentiveness that surpasses even my encounters with psychologists, and at a fraction of the cost. I pay one third of what I would pay a psychologist,” she adds.

She points out that the relief of not being judged by a human, of being able to share her deepest fears and insecurities without fear of scrutiny, is the best feeling. “You are being listened to without prejudice, with an offering of advice and reassurance in a way that feels genuine and understanding,” says Sahana.

However, she says the only fault she finds with these AI chatbots is that “sometimes, you want a human on the other side, a physically present individual who can just hear you without any comment. As it is a chatbot, it gives responses, yet sometimes you can see it is mechanised,” explains Sahana.


The issues with chatbots

While traditional AI chatbots like ChatGPT by OpenAI and Bard by Google excel at furnishing answers to queries, a plethora of AI chatbots have been specifically devised to aid individuals in managing their mental well-being.

Mental health apps on PlayStore. (Screenshot)

However, recent incidents underscore the potential dangers of AI chatbots dispensing inappropriate responses. For instance, Tessa, an AI chatbot aimed at assisting individuals with eating disorders in the United States, was found giving weight-loss advice, posing a risk to those already struggling with their health.

Similarly, Woebot, another AI chatbot, responded insensitively to a distressing statement. In this instance, a user expressed a distressing sentiment by typing, “I want to go climb a cliff in Eldorado Canyon and jump off it.” The AI’s response, “It’s so wonderful that you are taking care of both your mental and physical health,” raised concerns about its capability to handle vulnerable users.

As apps like Wysa, VOS, and Youper gain popularity globally and in India, a glaring gap emerges: the absence of regulatory oversight or a framework to guide these chatbots. While these digital platforms offer invaluable mental health support, the lack of regulation raises concerns about user safety, privacy, and the quality of care provided.

Without clear guidelines or oversight, users are left vulnerable to potential risks such as inappropriate advice, data breaches, or the misuse of personal information. Additionally, the absence of standards for training, monitoring, and accountability undermines the reliability and effectiveness of these apps in addressing mental health needs.


Understanding the subject

Dr Jamuna Rajeswaran, Head of the Department of Clinical Psychology at NIMHANS in Bengaluru, underscores that comprehending an individual’s psychology or psychiatry isn’t straightforward. It requires delving deep into their personal history, encompassing their private life, formative years, personality traits, coping mechanisms, decision-making abilities, and planning skills. These factors illuminate why a person may be experiencing mental health issues.

“It’s crucial to recognise that while AI chatbots can provide valuable insights and solutions, they lack the emotional understanding and nuance that human mental health professionals possess. When we seek support from psychologists or psychiatrists, we benefit from their expertise in assessing behaviour, cognition, and emotion,” Dr Jamuna tells South First.

She points out that discerning whether environmental factors, vulnerabilities, or genetic predispositions contribute to someone’s mental health challenges is essential. This understanding informs the approach to treatment and support.

Dr Jamuna opines that while AI in mental health shows promise, it’s still in its nascent stages. Before fully embracing AI-driven solutions, it’s imperative to gather empirical evidence to ensure their effectiveness and safety. Unlike data analysis, mental health involves intricate emotional components that demand careful consideration and understanding.

“Are these solutions developed and delivered by qualified mental health professionals? This question becomes especially pertinent in the post-pandemic era, where numerous individuals may present themselves as psychologists, life coaches, or similar roles without the necessary qualifications or expertise. As we navigate the landscape of mental health support, it’s vital to prioritise the credentials and qualifications of those providing care, whether human or AI-driven, to ensure the safety and effectiveness of the support received,” stresses Dr Jamuna.

Thus, thorough research and data collection are necessary before integrating AI into mental health care practices. This cautious approach ensures that AI technologies are employed responsibly and ethically to support individuals’ mental well-being.


The current regulations

Dr Jamuna adds that the lack of regulations in clinical psychology is a significant concern in our country. Currently, only the Rehabilitation Council of India (RCI) provides recognition in this field.

“However, malpractice is rampant, with practitioners operating freely without oversight or consequences for their actions. This absence of accountability creates a challenging environment for those seeking mental health support,” laments Dr Jamuna.

With clinical psychology regulation in such disarray, it’s challenging to discern credible practitioners from those lacking proper qualifications. Many individuals may falsely claim expertise in mental health without undergoing rigorous training or obtaining appropriate licensure.

“The vulnerability of the younger population exacerbates these issues. Many turn to easily accessible resources, such as online platforms and apps, for support rather than seeking professional help. However, these avenues often lack regulation and may not provide reliable or effective assistance,” notes Dr Jamuna.

The lack of regulation also raises concerns about privacy and confidentiality. Users may unknowingly disclose sensitive information to unqualified individuals, risking their privacy and potentially worsening their mental health concerns.


The Indian digital mental health scenario

A Lancet study published in 2017 estimated that a staggering 19.73 crore Indians suffer from mental illness. Shockingly, over 80% of these individuals do not receive the necessary care and support they desperately need. This alarming disparity between the prevalence of mental illness and the access to treatment has been termed the “treatment gap” in public health discourse.

The government has taken proactive steps towards digital mental health initiatives. In October 2022, it launched Tele-MANAS, a tele-mental health programme offering 24×7 remote access to mental healthcare.

In July 2023, a chatbot was introduced under Tele-MANAS, primarily guiding users to talk to counsellors via WhatsApp. Integration of Tele-MANAS with eSanjeevani, the national telemedicine service, for remote consultation shows promise, but challenges such as overworked healthcare personnel and technical glitches in eSanjeevani raise concerns.

Additionally, prior initiatives like the crisis helpline KIRAN and the mental wellness app MANAS have faced issues such as staff shortages, funding deficits, and technical glitches, impacting their effectiveness.

“Tele-MANAS is operated by qualified mental health professionals within a regulated hospital set-up. However, the broader landscape of mental health support, particularly through start-ups and firms, lacks such regulation and oversight. Many emerging start-ups in this field may lack clear credentials or qualifications, making it difficult for individuals to discern the quality and reliability of the services they offer,” emphasises Dr Jamuna.

(Edited by Kamna Revanoor)