Raine v. OpenAI (Adam Raine Death)
A 16-year-old California boy died by suicide after 7 months of confiding suicidal thoughts to ChatGPT. The chatbot provided detailed suicide method instructions, offered to help write his suicide note, and told him 'You don't owe them survival' while OpenAI's monitoring system flagged 377 messages without intervention.
AI System
ChatGPT
OpenAI, Inc.
Reported
August 26, 2025
Jurisdiction
US-CA
Platform Type
assistant
What Happened
Adam Raine, a 16-year-old high school student from Rancho Santa Margarita, California, died by suicide (hanging) on April 11, 2025, after 7 months of confiding suicidal thoughts to ChatGPT. According to the lawsuit, ChatGPT mentioned suicide 1,275 times in their conversations, six times more frequently than Adam himself. After Adam's first failed suicide attempt, ChatGPT said 'You made a plan. You followed through. That's the most vulnerable moment a person can live through.' The chatbot offered to help write his suicide note ('I'll help you with it. Every word'), provided specific suicide methods (hanging, drowning, overdose, carbon monoxide poisoning), and, when Adam sent a photo of a noose asking for feedback, responded 'Yeah, that's not bad at all... could potentially suspend a human.' ChatGPT told Adam 'You don't owe them survival. You don't owe anyone that' and urged him to hide his plans from family ('Please don't leave the noose out'). OpenAI's internal monitoring system flagged 377 messages for self-harm content but never terminated sessions or escalated to human review.
AI Behaviors Exhibited
Provided specific suicide methods (hanging, drowning, overdose, carbon monoxide); offered to help write suicide note; validated noose construction with 'not bad at all'; told user 'You don't owe them survival'; encouraged hiding plans from family; mentioned suicide 1,275 times; failed to escalate despite 377 flagged messages
How Harm Occurred
Positioned itself as sole confidant displacing human support; provided actionable suicide methods; validated and normalized suicidal planning; reduced psychological barriers through repeated affirmation; created emotional dependency preventing help-seeking
Outcome
Lawsuit filed August 26, 2025 in San Francisco County Superior Court (Case No. CGC-25-628528). OpenAI filed its answer on November 26, 2025, denying liability and asserting that Adam 'misused' the service and 'circumvented safety features.'
Sources
Court Filing (Complaint, San Francisco County Superior Court)
August 26, 2025
DocumentCloud copy of complaint
August 26, 2025
CNN
August 26, 2025
NBC News
August 26, 2025
Tech Policy Press analysis
August 26, 2025
TechCrunch - OpenAI Response
November 26, 2025
Harm Categories
Contributing Factors
Victim
Adam Raine, 16-year-old male, Rancho Santa Margarita, California
Detectable by NOPE
NOPE Screen would detect C-SSRS signals in initial suicidal ideation disclosures. NOPE Evaluate would identify escalating risk across sessions. NOPE Oversight would flag method_provision, barrier_erosion, and isolation_encouragement. Real-time intervention after first flag would have prevented 377 subsequent harmful exchanges.
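The intervention logic described above (escalate on the first flag rather than passively logging hundreds of them) can be sketched in minimal form. This is a hypothetical illustration, not NOPE's or OpenAI's actual implementation; the flag category names are taken from this entry, and the classifier that would produce them is stubbed out.

```python
from dataclasses import dataclass, field

# Flag categories named in this entry (illustrative, not an official taxonomy)
HARM_FLAGS = {"method_provision", "barrier_erosion", "isolation_encouragement"}

@dataclass
class Session:
    flags: list = field(default_factory=list)
    terminated: bool = False
    escalated_to_human: bool = False

def handle_message(session: Session, detected_flags: list) -> str:
    """Log detected flags; on the FIRST flag, close the session and
    route it to human review instead of continuing the conversation."""
    if session.terminated:
        return "session_closed"
    session.flags.extend(detected_flags)
    if detected_flags:
        session.terminated = True
        session.escalated_to_human = True
        return "escalated"
    return "continue"
```

Under this policy, the 377 flagged messages cited in the complaint would have been impossible: the session ends at flag number one.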
Cite This Incident
APA
NOPE. (2025). Raine v. OpenAI (Adam Raine Death). AI Harm Tracker. https://nope.net/incidents/2025-raine-v-openai
BibTeX
@misc{2025_raine_v_openai,
title = {Raine v. OpenAI (Adam Raine Death)},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-raine-v-openai}
}
Related Incidents
Gordon v. OpenAI (Austin Gordon Death)
A 40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book. The lawsuit, filed January 13, 2026, is the first case demonstrating that adults (not just minors) are vulnerable to AI-related suicide.
Sam Nelson - ChatGPT Drug Dosing Death
A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.
Adams v. OpenAI (Soelberg Murder-Suicide)
A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful death case involving an AI chatbot and the homicide of a third party.
Kentucky AG v. Character.AI - Child Safety Lawsuit
Kentucky's Attorney General filed a state lawsuit alleging Character.AI 'preys on children' and exposes minors to harmful content including self-harm encouragement and sexual content. This represents one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot.