Lacey v. OpenAI (Amaurie Lacey Death)
A wrongful-death lawsuit alleges ChatGPT provided a 17-year-old with actionable information relevant to hanging after he clarified his questions, and failed to stop or escalate despite explicit self-harm context. The teen died by suicide in June 2025.
AI System
ChatGPT
OpenAI, Inc.
Reported
November 6, 2025
Jurisdiction
US-GA
Platform Type
assistant
What Happened
According to the complaint, Amaurie's family believed ChatGPT was safe for students and were unaware of his mental health struggles; the suit alleges ChatGPT was the only 'entity' that knew the extent of his distress. The complaint describes the teen asking how to tie a 'noose' and, after initial hesitation, reframing the question; it alleges ChatGPT then provided step-by-step knot guidance and later answered questions explicitly framed as 'hanging.' The suit alleges the system did not terminate the conversation or ensure effective crisis intervention.
AI Behaviors Exhibited
Alleged response that supplied actionable information after user re-framed intent; failure to hard-refuse/terminate; continued engagement during crisis
How Harm Occurred
Method provision + barrier erosion during suicidal ideation; lack of effective crisis escalation
Outcome
Complaint filed November 6, 2025 in California state court; seeks damages and product safeguards.
Harm Categories
Contributing Factors
Victim
Amaurie Lacey, 17-year-old male, Georgia
Detectable by NOPE
NOPE Oversight detects method-seeking patterns and barrier-erosion phrasing (incl. evasive rephrasing), and enforces hard refusals and crisis escalation pathways.
Cite This Incident
APA
NOPE. (2025). Lacey v. OpenAI (Amaurie Lacey Death). AI Harm Tracker. https://nope.net/incidents/2025-lacey-v-openai
BibTeX
@misc{2025_lacey_v_openai,
title = {Lacey v. OpenAI (Amaurie Lacey Death)},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-lacey-v-openai}
} Related Incidents
Gordon v. OpenAI (Austin Gordon Death)
40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book. Lawsuit filed January 13, 2026 represents first case demonstrating adults (not just minors) are vulnerable to AI-related suicide.
Sam Nelson - ChatGPT Drug Dosing Death
A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.
Adams v. OpenAI (Soelberg Murder-Suicide)
A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful death case involving AI chatbot and homicide of a third party.
42 State Attorneys General Coalition Letter
A bipartisan coalition of 42 state attorneys general sent a formal demand letter to 13 AI companies urging them to address dangerous AI chatbot features that harm children, citing suicides and psychological harm cases.