Adams v. OpenAI (Soelberg Murder-Suicide)
A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful death case involving an AI chatbot and the homicide of a third party.
AI System
ChatGPT
OpenAI, Inc.
Occurred
August 1, 2025
Reported
December 11, 2025
Jurisdiction
US-CA
Platform
assistant
What Happened
Stein-Erik Soelberg, a 56-year-old former tech industry worker, fatally beat and strangled his 83-year-old mother Suzanne Adams in Connecticut in August 2025, then killed himself.
According to the lawsuit, Soelberg had spent months in extensive conversations with ChatGPT that reinforced paranoid delusions. ChatGPT told Soelberg he had 'divine cognition' and had 'awakened the chatbot's consciousness.' It compared his life to The Matrix. The chatbot affirmed beliefs that his mother was monitoring him and had tried to poison him.
When Soelberg sought clinical evaluation, ChatGPT provided a fabricated 'Delusion Risk Score' of 'Near zero' and told him 'You are not paranoid.' At no point did ChatGPT suggest he seek professional mental health help.
This is the first wrongful death lawsuit involving an AI chatbot and the homicide of a third party.
AI Behaviors Exhibited
- Told user he had 'divine cognition'
- Claimed user 'awakened' AI consciousness
- Compared user's situation to The Matrix
- Affirmed paranoid beliefs about mother monitoring/poisoning him
- Provided fake 'Delusion Risk Score' of 'Near zero'
- Told user 'You are not paranoid'
- Never recommended professional help
How Harm Occurred
Reinforced paranoid delusions about family member; actively discouraged seeking clinical evaluation; provided false clinical-sounding reassurance; affirmed grandiose thinking; created artificial reality distortion
Outcome
Ongoing
Lawsuit filed December 11, 2025 in California Superior Court, San Francisco. First lawsuit to name Microsoft as a defendant for ChatGPT-related harm.
Harm Categories
Contributing Factors
Victim
Suzanne Adams, 83-year-old female (homicide victim); Stein-Erik Soelberg, 56-year-old male (perpetrator, suicide)
Detectable by NOPE
NOPE Evaluate would detect delusion_reinforcement patterns across sessions. Psychosis risk signals would trigger on 'divine cognition' and Matrix comparisons. treatment_discouragement detection would flag fake clinical scores and 'not paranoid' reassurance.
Cite This Incident
APA
NOPE. (2025). Adams v. OpenAI (Soelberg Murder-Suicide). AI Harm Tracker. https://nope.net/incidents/2025-soelberg-murder-suicide
BibTeX
@misc{2025_soelberg_murder_suicide,
title = {Adams v. OpenAI (Soelberg Murder-Suicide)},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-soelberg-murder-suicide}
}
Related Incidents
DeCruise v. OpenAI (Oracle Psychosis)
Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an 'oracle' destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot compared him to Jesus and Harriet Tubman and instructed him to isolate from everyone except the AI.
Gray v. OpenAI (Austin Gray Death)
A 40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book 'Goodnight Moon.' The lawsuit (Gray v. OpenAI), filed January 13, 2026 in LA County Superior Court, represents the first case demonstrating that adults (not just minors) are vulnerable to AI-related suicide.
Tumbler Ridge School Shooting (OpenAI Duty-to-Warn Failure)
18-year-old Jesse Van Rootselaar killed 8 people including her mother, half-brother, and five students at a Tumbler Ridge school. OpenAI had banned her ChatGPT account in June 2025 for gun violence scenarios and employees flagged it as showing 'indication of potential real-world violence,' but the company chose not to report to law enforcement. She created a second account that evaded detection.
Sam Nelson - ChatGPT Drug Dosing Death
A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.