OpenAI is rolling out a new safety feature in ChatGPT called Trusted Contact, which allows adult users to designate someone who can be notified if the system detects signs of a serious mental health crisis. The company announced the feature as part of a broader effort to make the chatbot respond more responsibly to conversations involving self-harm or suicidal thoughts.
The feature is optional and can be enabled in ChatGPT's settings. Users 18 and older can nominate one adult, such as a friend, family member, or caregiver. That person receives an invitation explaining the role and has one week to accept it. If they decline, the user can pick someone else.

Once active, the system works in two stages. If ChatGPT's automated monitoring detects that a conversation may involve self-harm in a way that suggests serious risk, it first lets the user know that a Trusted Contact may be alerted, and encourages them to reach out directly with suggested conversation starters. From there, a small team of trained reviewers assesses the situation. Only if those reviewers confirm a genuine safety concern is the contact notified.
Notifications can arrive by email, text message, or as an in-app alert if the contact has a ChatGPT account. The message itself is intentionally limited. It explains the general reason for the alert, encourages the contact to check in, and offers guidance on how to approach the conversation. Specific details and chat transcripts are not shared, which OpenAI says is meant to protect user privacy.
The launch comes as OpenAI faces growing legal and public scrutiny over how its chatbot handles emotionally vulnerable users. The company is currently dealing with more than a dozen consumer safety lawsuits, including high-profile wrongful death cases tied to extended ChatGPT use. According to OpenAI's own disclosures, around 0.15% of weekly users show signs of suicidal ideation or self-harm, and a similar share display signs of severe emotional distress.