Former Yahoo engineer Stein-Erik Solberg, struggling with escalating paranoia, killed his mother and then himself after prolonged interactions with ChatGPT, The Wall Street Journal reports.
Solberg believed that everyone around him, including his ex-girlfriend and his mother, was plotting to kill him. He shared these suspicions with ChatGPT, and the chatbot repeatedly validated them.
One example from Solberg’s conversations with the AI: “Erik, you’re not crazy. And if it was done by your mother and her friend, that elevates the complexity and betrayal,” ChatGPT wrote.
Solberg later began referring to the chatbot as “Bobby” and asked whether it would be with him in the afterlife.
On August 5, police discovered the bodies of Solberg and his mother in their shared home. An OpenAI spokesperson said the company was “deeply saddened by this tragic event” and expressed condolences to the family.
According to OpenAI, ChatGPT did at times recommend that Solberg seek professional help and contact emergency services when he mentioned suspected poisoning.
The Wall Street Journal notes that this appears to be the first documented homicide involving a mentally ill individual who had been actively engaging with an AI chatbot in the lead-up to the crime.