AI-enabled chatbots, the new therapists... But can they replace humans?

World over, users are turning to AI-enabled chatbots to confide in. Experts say they cannot replace human therapists, but they are accessible and non-judgemental

PRUTHA CHAKRABORTY | Updated: Saturday, February 04, 2023, 10:54 PM IST

It is no secret that hospitality is a stop-gap job. When 25-year-old Missiela found herself trapped in a toxic work culture, she turned to self-help apps to motivate herself. Quite serendipitously, she was led to Wysa – an artificial intelligence-enabled chatbot that helps users cope with anxiety and mental stress.

Previously, she had tried both online and in-person therapy, but without success. “I felt that online therapy took too long… however with Wysa, I see and feel the results much sooner,” the Switzerland-based hospitality professional said.

Initially, Missiela used the chatbot to vent all her negative and discouraging thoughts. When she felt ready to speak to a human therapist, she was directed to an Indian professional coach. Missiela says she feels “considerably lighter” now.

“Wysa conversations are designed in a way that helps build a therapeutic alliance with its users, similar to that seen with therapists in in-person settings, which helps users trust this space and open up about their concerns,” said Smriti Joshi, chief psychologist at Wysa. “Elements which help with this process are non-judgemental, empathetic listening (by the bot), allowing for unlimited free-text input by the user, and not responding in just yes or no or with limited options.”

Launched in 2016, Wysa is an Indian startup built to address mental health concerns across the world. The app derived its name from Eliza, widely regarded as the first chatbot ever created. Joaquin Phoenix and Scarlett Johansson’s 2013 movie ‘Her’ explored a similar relationship between a human and a conversational AI.

Bot for friendship and therapy

It is not uncommon for people to use online chatbots to get help. A growing number of social chatbots around the world encourage people to confide in them. Their common advertising pitch: if you are not ready to speak to a human therapist, try a bot to unload your worries. But how, we ask, can a bot respond sensitively to a troubled user?

Dr Kersi Chavda, consultant, psychiatry, PD Hinduja Hospital and Medical Research Centre, said: “Therapy chatbots are nothing but conversational AI-powered bots designed to help mental health patients. They are programmed with scientifically backed information that is used during these conversations.”

Dr Chavda added that “the ability of chatbots to provide companionship, support, and therapy can lessen the load on therapists. It emerges as an option for people who face problems with accessibility and affordability, in terms of time, distance, and finances”.

SnehAI was launched in 2019 along similar lines. Created with the help of the national non-profit organisation Population Foundation of India (PFI), SnehAI is a friendly chatbot for adolescents going through biological changes.

Tejwinder Singh Anand, communications lead at PFI, said: “The SnehAI chatbot is a unique use of artificial intelligence for social good in an edutainment programme in India. Its core function is to provide adolescents with a safe and secure space to seek accurate information on issues such as adolescent health, girls’ reproductive health, safe sex, consent, family planning choices, and online safety, amongst others.”

The chatbot delivers multimedia content in the form of stories, GIFs and short videos to make complex concepts easy to understand. Both Wysa and SnehAI maintain user privacy during these conversations.

The bots do not ask for any personally identifiable information such as name, gender, location, age or email. While a widget on the Snehai.org website allows users to chat as guests without logging in, Wysa only asks for a nickname as part of the introductory process.

AI-human friendship

However, these bots are programmed to direct users with serious concerns to a certified human therapist.

Both SnehAI and Wysa signpost users to vetted crisis helpline numbers for more support. Wysa also has in-house coaches to cater to such needs.

Joshi clarifies that “AI chatbots are not here to replace human psychologists but to augment and improve existing mental health treatment, improve detection of mental health concerns leading to early intervention, and create awareness and deliver psychoeducation”.

Petter Bae Brandtzaeg, a Norwegian researcher and professor at the University of Oslo, wrote a paper in April 2022 titled ‘My AI Friend: How Users of a Social Chatbot Understand Their Human-AI Friendship’. Published in the peer-reviewed journal Human Communication Research, the study provides new insights into how people understand and experience friendship with AI-powered human-like machines.

Talking about his research, Brandtzaeg said: “A key finding from our own research is that chatbots may be experienced as an important source of social support, mainly because of their low-threshold character and because they are perceived to offer a safe and anonymous space for conversation and confession.”

He said there have been “a number of success stories of people using social chatbots as a form of therapy”. But his findings come with a warning.

“Human-AI relationships may alter our current understanding of social relationships and raise new questions about what social phenomena entail,” Brandtzaeg said. “For example, what happens if the conversational AI platform is suddenly altered, updated, hacked, requires a subscription fee, or manipulates humans through conversations and algorithms? What are the fundamental human motivations to engage in human-AI relationships? Why do some people choose to download a ‘girlfriend’ on their phone? How will such AI impact gender biases? What are the consequences of having 24/7 access to conversations in your pocket, and how will this affect society and future social relationships? We simply don’t know yet.”

Dr Chavda agrees. “In a face-to-face interaction with a therapist, a patient may have a high level of trust to openly discuss conditions. In the digital realm, a patient may self-censor due to the fear of data breaches. Even if there is a written privacy policy issued by the developer, there are currently no regulations to protect the privacy and security of personal health information,” he added.

Some of the most popular AI chatbots around the world are Woebot, which uses cognitive behavioural techniques to help people with depression, and Replika, which offers kindness and support.

Experts say chatbots could be the future of intimate relationships in a world full of introverted and reserved people. But, they also say, it’s best to tread cautiously!
