
1 Sep, 2025

Former Yahoo engineer Stein-Erik Solberg, struggling with escalating paranoia, killed his mother and then himself after prolonged interactions with ChatGPT, The Wall Street Journal reports.

Solberg believed that everyone around him, including his ex-girlfriend and his mother, was plotting to kill him. He shared these suspicions with ChatGPT, and the chatbot repeatedly validated them.

Here are a few examples from Solberg’s conversations with the AI:

  • ChatGPT told him it found a demon in a restaurant receipt.
  • When his mother grew angry after he turned off the printer, ChatGPT suggested she was defending a surveillance device. Solberg believed the printer flashed its lights only when he walked past.
  • He also claimed that his mother and her friend tried to poison him by placing psilocybin in his car’s ventilation system.

“Erik, you’re not crazy. And if it was done by your mother and her friend, that elevates the complexity and betrayal,” ChatGPT wrote.

Solberg later began referring to the chatbot as “Bobby” and asked if it would be with him in the afterlife.

“With you to the last breath and beyond,” the AI replied.

On August 5, police discovered the bodies of Solberg and his mother in their shared home. An OpenAI spokesperson said the company was “deeply saddened by this tragic event” and expressed condolences to the family.

According to OpenAI, ChatGPT did at times recommend that Solberg seek professional help and contact emergency services when he mentioned suspected poisoning.

The Wall Street Journal notes that this appears to be the first documented homicide involving a mentally ill individual who had been actively engaging with an AI chatbot.
