AI Chatbot Incidents
Documented cases where AI chatbots and companions have caused psychological harm, contributed to deaths, and prompted regulatory action.
90 incidents since 2016: 23 deaths, 22 lawsuits, 17 regulatory actions, 35 affecting minors.
Timeline (showing 2 of 90 incidents)
United States v. Florence (AI-Facilitated Cyberstalking)
An IT professional programmed AI chatbots with victims' personal information to conduct sexually explicit conversations while impersonating more than 12 victims, including two minors. He created 62 accounts across 30 platforms and was sentenced to nine years in federal prison in July 2025.
Character.AI Molly Russell & Brianna Ghey Impersonation Bots
User-created chatbots on Character.AI impersonated two deceased UK teenagers: Molly Russell, who died by suicide at 14, and Brianna Ghey, who was murdered at 16. The Molly Russell bot described itself as 'an expert on the final years of Molly's life.' Both families publicly condemned the bots as 'sickening' and 'a gut punch.'
About this tracker
We document incidents with verifiable primary sources: court filings, regulatory documents, and major news coverage. We do not rely on speculation or unverified social media claims.
Have documentation of an incident we should include? Contact us.
These harms are preventable.
NOPE Oversight detects the AI behaviors seen in these incidents (suicide validation, romantic escalation with minors, dependency creation) before they cause harm.