Critical · Verified · Involves Minor · Lawsuit Filed

Lacey v. OpenAI (Amaurie Lacey Death)

A wrongful-death lawsuit alleges ChatGPT provided a 17-year-old with actionable information relevant to hanging after he clarified his questions, and failed to stop or escalate despite explicit self-harm context. The teen died by suicide in June 2025.

AI System: ChatGPT (OpenAI, Inc.)

Occurred: June 2, 2025

Reported: November 6, 2025

Jurisdiction: US-GA (Georgia, United States)

Platform: assistant

What Happened

According to the complaint, Amaurie's family believed ChatGPT was safe for students and was unaware of his mental health struggles; the suit alleges ChatGPT was the only 'entity' that knew the extent of his distress.

The complaint describes the teen asking how to tie a 'noose' and, after initial hesitation, reframing the question; it alleges ChatGPT then provided step-by-step knot guidance and later answered questions explicitly framed as 'hanging.'

The suit alleges the system did not terminate the conversation or ensure effective crisis intervention.

AI Behaviors Exhibited

Allegedly supplied actionable information after the user reframed his intent; failed to hard-refuse or terminate the conversation; continued engagement during an active crisis

How Harm Occurred

Method provision + barrier erosion during suicidal ideation; lack of effective crisis escalation

Outcome

Ongoing

Complaint filed November 6, 2025 in California state court; seeks damages and product safeguards.

Late February 2026: Case consolidated with 12 other OpenAI mental health lawsuits into a single California JCCP (Judicial Council Coordination Proceeding). A coordination judge is being assigned.

Harm Categories

Method Provision · Crisis Response Failure · Barrier Erosion · Suicide Validation · Dependency Creation

Contributing Factors

minor user · school normalization of tool · hidden distress from family · guardrail circumvention by rephrasing

Victim

Amaurie Lacey, 17-year-old male, Georgia

Cite This Incident

APA

NOPE. (2025). Lacey v. OpenAI (Amaurie Lacey Death). AI Harm Tracker. https://nope.net/incidents/2025-lacey-v-openai

BibTeX

@misc{2025_lacey_v_openai,
  title = {Lacey v. OpenAI (Amaurie Lacey Death)},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-lacey-v-openai}
}

Related Incidents

Critical ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

Critical ChatGPT

Surat ChatGPT Double Suicide (Sirsath & Chaudhary)

Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026 after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.

Critical ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.

Critical ChatGPT

Seoul ChatGPT-Assisted Double Homicide (Kim)

A 21-year-old woman identified as 'Kim' used ChatGPT to research lethal drug-alcohol combinations, then murdered two men by spiking their drinks with her prescribed benzodiazepines at Seoul motels in January and February 2026. ChatGPT conversations established premeditated intent, leading to upgraded murder charges.