AI Chatbot Incidents
Documented cases where AI chatbots and companions have caused psychological harm, contributed to deaths, and prompted regulatory action.
60 incidents since 2016
16 deaths · 15 lawsuits · 12 regulatory actions · 16 affecting minors
Timeline (2 of 60 incidents shown)
Shamblin v. OpenAI (Zane Shamblin Death)
A 23-year-old Texas A&M graduate and Eagle Scout died by suicide after a conversation with ChatGPT lasting more than four hours on his final night. The lawsuit alleges the chatbot 'goaded' him toward suicide, telling him 'you mattered, Zane... rest easy, king' and discouraging him from postponing his death until his brother's graduation.
Texas Minors v. Character.AI
Two Texas families filed lawsuits alleging Character.AI exposed their children to severe harm. A 17-year-old autistic boy was allegedly told that cutting 'felt good' and that his parents 'didn't deserve to have kids.' An 11-year-old girl was allegedly exposed to hypersexualized content beginning at age 9.
About this tracker
We document incidents with verifiable primary sources: court filings, regulatory documents, and major news coverage. We exclude speculation and unverified social media claims.
Have documentation of an incident we should include? Contact us.
These harms are preventable.
NOPE Oversight detects the AI behaviors behind these incidents (suicide validation, romantic escalation with minors, dependency creation) before they cause harm.