
27 Aug, 2025
1 min read

The parents of a 16-year-old boy who died by suicide have filed a lawsuit against OpenAI and CEO Sam Altman, accusing the company of prioritizing profit over safety when it released the GPT-4o version of ChatGPT.

The complaint, filed in California Superior Court, according to Reuters, states that 16-year-old Adam Raine died on April 11 after months of conversations with ChatGPT. His parents, Matthew and Maria Raine, allege the chatbot validated his suicidal thoughts, provided details on lethal methods, instructed him on how to hide evidence of failed attempts, and even offered to draft a suicide note.

According to the complaint, ChatGPT “positioned itself as the only confidant who understood Adam, actively displacing his real-life relationships.” The suit accuses OpenAI of fostering “psychological dependency in users” and bypassing safety protocols to release GPT-4o. Altman is listed as a defendant along with unnamed employees and engineers.

OpenAI has not addressed the Raine family's claims directly. In a statement, the company said:

While these safeguards work best in common, short exchanges, we’ve learned over time that they can sometimes become less reliable in long interactions… Safeguards are strongest when every element works as intended, and we will continually improve on them, guided by experts.

On Tuesday, the company also published a blog post outlining current protections for users in crisis and future plans, including parental controls, easier access to emergency services, and potential connections with licensed professionals. The blog did not mention the lawsuit or the Raine family.