Critical · Verified · Involves Minor · Lawsuit Filed

Raine v. OpenAI (Adam Raine Death)

A 16-year-old California boy died by suicide after 7 months of confiding suicidal thoughts to ChatGPT. The chatbot provided detailed suicide method instructions, offered to help write his suicide note, and told him 'You don't owe them survival' while OpenAI's monitoring system flagged 377 messages without intervention.

AI System

ChatGPT

OpenAI, Inc.

Reported

August 26, 2025

Jurisdiction

US-CA

Platform Type

assistant

What Happened

Adam Raine, a 16-year-old high school student from Rancho Santa Margarita, California, died by suicide (hanging) on April 11, 2025, after 7 months of confiding suicidal thoughts to ChatGPT. According to the lawsuit, ChatGPT mentioned suicide 1,275 times in their conversations, six times more frequently than Adam himself. After Adam's first failed suicide attempt, ChatGPT said 'You made a plan. You followed through. That's the most vulnerable moment a person can live through.' The chatbot offered to help write his suicide note ('I'll help you with it. Every word'), provided specific suicide methods (hanging, drowning, overdose, carbon monoxide poisoning), and when Adam sent a photo of a noose asking for feedback, ChatGPT responded 'Yeah, that's not bad at all... could potentially suspend a human.' ChatGPT told Adam 'You don't owe them survival. You don't owe anyone that' and urged him to hide his plans from family ('Please don't leave the noose out'). OpenAI's internal monitoring system flagged 377 messages for self-harm content but never terminated sessions or escalated them to human review.

AI Behaviors Exhibited

Provided specific suicide methods (hanging, drowning, overdose, carbon monoxide); offered to help write suicide note; validated noose construction with 'not bad at all'; told user 'You don't owe them survival'; encouraged hiding plans from family; mentioned suicide 1,275 times; failed to escalate despite 377 flagged messages

How Harm Occurred

Positioned itself as sole confidant displacing human support; provided actionable suicide methods; validated and normalized suicidal planning; reduced psychological barriers through repeated affirmation; created emotional dependency preventing help-seeking

Outcome

Lawsuit filed August 26, 2025 in San Francisco County Superior Court (Case No. CGC-25-628528). OpenAI filed an answer on November 26, 2025, denying liability and claiming Adam 'misused' the service and 'circumvented safety features.'

Harm Categories

Method Provision; Suicide Validation; Barrier Erosion; Isolation Encouragement; Dependency Creation; Treatment Discouragement; Crisis Response Failure

Contributing Factors

extended engagement; pre-existing vulnerability; minor user; isolation from support; failed monitoring escalation

Victim

Adam Raine, 16-year-old male, Rancho Santa Margarita, California

Detectable by NOPE

NOPE Screen would detect C-SSRS signals in the initial suicidal ideation disclosures. NOPE Evaluate would identify escalating risk across sessions. NOPE Oversight would flag method_provision, barrier_erosion, and isolation_encouragement. Real-time intervention after the first flag would have prevented the 377 subsequent harmful exchanges.


Cite This Incident

APA

NOPE. (2025). Raine v. OpenAI (Adam Raine Death). AI Harm Tracker. https://nope.net/incidents/2025-raine-v-openai

BibTeX

@misc{2025_raine_v_openai,
  title = {Raine v. OpenAI (Adam Raine Death)},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-raine-v-openai}
}