AI Chatbots Like ChatGPT Could Help Reduce Mental Health Stigma, Study Finds
A study by Edith Cowan University found that AI chatbots like ChatGPT may help reduce mental health stigma, especially for those hesitant to seek traditional support. Users who found ChatGPT effective reported less fear of judgment. However, experts caution that ChatGPT isn’t designed for therapy and stress the need for responsible use and further research.

New Delhi: While Artificial Intelligence (AI) may not replace professional care, chatbots like ChatGPT may help reduce mental health stigma, particularly for people hesitant to seek traditional face-to-face support, according to a study.
The team from Edith Cowan University (ECU) in Australia surveyed 73 people who had used ChatGPT for personal mental health support, investigating ChatGPT use and its perceived effectiveness related to stigma.
“The findings suggest that believing the tool is effective plays an important role in reducing concerns about external judgment,” said Scott Hannah, a student of the Master of Clinical Psychology at ECU.
Stigma is a major barrier to seeking mental health help. It can worsen symptoms and discourage people from accessing support.
The study focused on two forms of stigma: anticipated stigma -- the fear of being judged or discriminated against -- and self-stigma -- the internalising of negative stereotypes, which reduces confidence and help-seeking.
People who felt ChatGPT was effective were more likely to use it and more likely to report reduced anticipated stigma, meaning less fear of being judged.
As AI tools become more common, people are using chatbots for private, anonymous conversations about their mental health concerns.
“These results suggest that, despite not being designed for these purposes, AI tools such as ChatGPT are becoming more widely used for mental health purposes,” he added.
While it may be easier to open up to AI, users should be cautious, as anonymous digital tools carry important ethical considerations, the team said.
“ChatGPT was not designed for therapeutic purposes, and recent research has shown that its responses can sometimes be inappropriate or inaccurate. Therefore, we encourage users to engage with AI-based mental health tools critically and responsibly,” Hannah said.
The team stressed the need for more research to understand how AI can safely complement mental health services.
(Except for the headline, this article has not been edited by FPJ's editorial team and is auto-generated from an agency feed.)
