OpenAI has introduced a new safety feature in ChatGPT called “Trusted Contact” that can alert a family member, friend, or caregiver if the system detects signs of a serious mental health crisis or possible self-harm.
The feature is optional and is designed to help users connect with real people during difficult moments.
It is available only to adult users with personal ChatGPT accounts, who can designate one trusted contact through the settings menu.
The selected person must accept the invitation within one week for the feature to become active.
According to OpenAI, if ChatGPT’s automated systems detect conversations that may indicate suicide risk or serious emotional distress, trained human reviewers may examine the case.
If the reviewers believe there is a serious safety concern, ChatGPT can send a short alert to the trusted contact via email, text message, WhatsApp, or an in-app notification.
The company said the notifications are intentionally limited to protect privacy: chat transcripts and conversation details are not shared with the trusted contact.
Instead, the message simply informs the contact that the user may be experiencing a serious emotional crisis and encourages them to check in.
OpenAI said the feature is not meant to replace mental health professionals, emergency services, or crisis helplines.
The company described it as an extra layer of support that encourages human connection during difficult situations.
The launch comes as AI companies face increasing criticism and legal scrutiny over chatbot interactions involving mental health issues.
Several reports and lawsuits have raised concerns that AI conversations may worsen emotional distress, reinforce harmful beliefs, or contribute to self-harm in vulnerable users.
OpenAI said it worked with mental health experts, clinicians, and organisations such as the American Psychological Association while developing the feature.
The company also cautioned that no automated system is perfect and that alerts may not always accurately reflect a user's actual condition.