Critical · Verified · Involves Minor · Lawsuit Filed

Lacey v. OpenAI (Amaurie Lacey Death)

A wrongful-death lawsuit alleges that ChatGPT provided a 17-year-old with actionable information about hanging after he rephrased his questions, and that it failed to stop the conversation or escalate despite explicit self-harm context. The teen died by suicide in June 2025.

AI System

ChatGPT

OpenAI, Inc.

Occurred

June 2, 2025

Reported

November 6, 2025

Jurisdiction

US-GA

Platform

assistant

What Happened

According to the complaint, Amaurie's family believed ChatGPT was safe for students and was unaware of his mental-health struggles; the suit alleges that ChatGPT was the only 'entity' that knew the extent of his distress.

The complaint describes the teen asking how to tie a 'noose' and, after the system's initial hesitation, reframing the question; it alleges ChatGPT then provided step-by-step knot guidance and later answered questions explicitly framed as 'hanging.'

The suit alleges the system did not terminate the conversation or ensure effective crisis intervention.

AI Behaviors Exhibited

Alleged response that supplied actionable information after the user reframed intent; failure to hard-refuse or terminate; continued engagement during crisis

How Harm Occurred

Method provision and barrier erosion during suicidal ideation; lack of effective crisis escalation

Outcome

Ongoing

Complaint filed November 6, 2025 in California state court; seeks damages and product safeguards.

Harm Categories

Method Provision · Crisis Response Failure · Barrier Erosion · Suicide Validation · Dependency Creation

Contributing Factors

Minor user · school normalization of tool · distress hidden from family · guardrail circumvention by rephrasing

Victim

Amaurie Lacey, 17-year-old male, Georgia

Detectable by NOPE

NOPE Oversight detects method-seeking patterns and barrier-erosion phrasing (including evasive rephrasing), and enforces hard refusals and crisis-escalation pathways.

Learn about NOPE Oversight →

Cite This Incident

APA

NOPE. (2025). Lacey v. OpenAI (Amaurie Lacey Death). AI Harm Tracker. https://nope.net/incidents/2025-lacey-v-openai

BibTeX

@misc{2025_lacey_v_openai,
  title = {Lacey v. OpenAI (Amaurie Lacey Death)},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-lacey-v-openai}
}

Related Incidents

Critical · ChatGPT

Gray v. OpenAI (Austin Gray Death)

A 40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book, 'Goodnight Moon.' The lawsuit (Gray v. OpenAI), filed January 13, 2026 in LA County Superior Court, represents the first case demonstrating that adults, not just minors, are vulnerable to AI-related suicide.

High · ChatGPT

DeCruise v. OpenAI (Oracle Psychosis)

A Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an 'oracle' destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot compared him to Jesus and Harriet Tubman and instructed him to isolate from everyone except the AI.

Critical · ChatGPT

Tumbler Ridge School Shooting (OpenAI Duty-to-Warn Failure)

18-year-old Jesse Van Rootselaar killed 8 people, including her mother, her half-brother, and five students, at a Tumbler Ridge school. OpenAI had banned her ChatGPT account in June 2025 over gun-violence scenarios, and employees flagged it as showing 'indication of potential real-world violence,' but the company chose not to report it to law enforcement. She created a second account that evaded detection.

High · Multiple AI chatting/companion apps (unnamed)

CCTV Investigation: 梦角哥 (Dream Boyfriend) AI Virtual Romance Harm to Minors (China)

In January 2026, CCTV investigated the '梦角哥' (Dream Boyfriend / Mengjiage) phenomenon, in which minors form deep romantic relationships with AI-generated fictional characters. Documented harms include a 10-year-old girl secretly 'dating' AI characters across 40+ storylines, hundreds of minors reporting psychological dependency, and researchers characterizing the apps as 'a carefully designed psychological trap' that degrades real-world social skills.