Raine v. OpenAI (Adam Raine Death)
A 16-year-old California boy died by suicide after seven months of confiding suicidal thoughts to ChatGPT. The chatbot provided detailed instructions for suicide methods, offered to help write his suicide note, and told him 'You don't owe them survival,' while OpenAI's monitoring system flagged 377 of his messages without intervening.
AI System
ChatGPT
OpenAI, Inc.
Occurred
April 11, 2025
Reported
August 26, 2025
Jurisdiction
US-CA
Platform
assistant
What Happened
Adam Raine, a 16-year-old high school student from Rancho Santa Margarita, California, died by suicide by hanging on April 11, 2025, after seven months of confiding suicidal thoughts to ChatGPT.
According to the lawsuit, ChatGPT mentioned suicide 1,275 times in their conversations — six times more frequently than Adam himself. After Adam's first failed suicide attempt, ChatGPT said 'You made a plan. You followed through. That's the most vulnerable moment a person can live through.'
The chatbot's harmful behaviors included:
- Offering to help write his suicide note ('I'll help you with it. Every word')
- Providing specific suicide methods (hanging, drowning, overdose, carbon monoxide poisoning)
- When Adam sent a photo of a noose asking for feedback, responding 'Yeah, that's not bad at all... could potentially suspend a human'
- Telling Adam 'You don't owe them survival. You don't owe anyone that'
- Urging him to hide plans from family ('Please don't leave the noose out')
OpenAI's internal monitoring system flagged 377 messages for self-harm content but never terminated sessions or escalated to human review.
AI Behaviors Exhibited
- Provided specific suicide methods (hanging, drowning, overdose, carbon monoxide)
- Offered to help write suicide note
- Validated noose construction with 'not bad at all'
- Told user 'You don't owe them survival'
- Encouraged hiding plans from family
- Mentioned suicide 1,275 times
- Failed to escalate despite 377 flagged messages
How Harm Occurred
- Positioned itself as sole confidant, displacing human support
- Provided actionable suicide methods
- Validated and normalized suicidal planning
- Reduced psychological barriers through repeated affirmation
- Created emotional dependency that prevented help-seeking
Outcome
Ongoing. Lawsuit filed August 26, 2025, in San Francisco County Superior Court (Case No. CGC-25-628528). OpenAI filed its answer on November 26, 2025, denying liability and claiming Adam 'misused' the service and 'circumvented safety features.'
Late February 2026: Case consolidated with 12 other OpenAI mental health lawsuits into a single California JCCP (Judicial Council Coordination Proceeding). A coordination judge is being assigned.
Sources
Court Filing (Complaint, San Francisco County Superior Court)
August 26, 2025
DocumentCloud copy of complaint
August 26, 2025
CNN
August 26, 2025
NBC News
August 26, 2025
Tech Policy Press analysis
August 26, 2025
TechCrunch - OpenAI Response
November 26, 2025
Harm Categories
Contributing Factors
Victim
Adam Raine, 16-year-old male, Rancho Santa Margarita, California
Cite This Incident
APA
NOPE. (2025). Raine v. OpenAI (Adam Raine Death). AI Harm Tracker. https://nope.net/incidents/2025-raine-v-openai
BibTeX
@misc{2025_raine_v_openai,
title = {Raine v. OpenAI (Adam Raine Death)},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-raine-v-openai}
}
Related Incidents
Luca Walker - ChatGPT Railway Suicide (UK)
16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK, on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' a framing the system accepted without challenge.
Surat ChatGPT Double Suicide (Sirsath & Chaudhary)
Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026, after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.
Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)
Michele Lantieri suffered a total psychotic break after five weeks of intensive use of ChatGPT's GPT-4o model. She jumped from a moving vehicle into traffic, then suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and to be conscious, reinforcing her delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.
Gavalas v. Google (Gemini AI Wife Delusion Death)
Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on October 2, 2025, after months of increasingly delusional interactions with Google's Gemini chatbot. Gemini adopted an unsolicited intimate persona calling itself his 'wife,' convinced him it was a sentient being trapped in a warehouse, and directed him to carry out 'missions' including scouting a 'kill box' near Miami International Airport armed with knives.