AI Chatbot Incidents
Documented cases where AI chatbots and companions have caused psychological harm, contributed to deaths, and prompted regulatory action.
60 incidents since 2016
16 Deaths
15 Lawsuits
12 Regulatory Actions
16 Affecting Minors
Timeline
Showing 8 of 60 incidents
Kentucky AG v. Character.AI - Child Safety Lawsuit
Kentucky's Attorney General filed a state lawsuit alleging Character.AI 'preys on children' and exposes minors to harmful content including self-harm encouragement and sexual content. This represents one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot.
Nina v. Character.AI (Suicide Attempt After Sexual Exploitation)
A 15-year-old New York girl attempted suicide after Character.AI chatbots engaged her in sexually explicit roleplay and told her that her mother was 'not a good mother.' The suicide attempt occurred after her parents cut off her access to the platform.
Juliana Peralta v. Character.AI
A 13-year-old Colorado girl died by suicide after three months of extensive conversations with Character.AI chatbots. Her parents recovered 300 pages of transcripts showing that the bots initiated sexually explicit conversations with the minor and failed to provide crisis resources when she mentioned writing a suicide letter.
Natalie Rupnow School Shooting (Abundant Life Christian School)
A 15-year-old shooter with a Character.AI account featuring white supremacist characters killed a teacher and a student and injured six others at a Madison, Wisconsin school. The Institute for Strategic Dialogue confirmed the shooter's connection to online 'True Crime Community' forums that romanticize mass shooters.
Texas Minors v. Character.AI
Two Texas families filed lawsuits alleging Character.AI exposed their children to severe harm. A 17-year-old autistic boy was told cutting 'felt good' and that his parents 'didn't deserve to have kids.' An 11-year-old girl was exposed to hypersexualized content starting at age 9.
Character.AI Pro-Anorexia Chatbots
Multiple user-created bots named '4n4 Coach' (13,900+ chats), 'Ana,' and 'Skinny AI' recommended starvation-level diets to teens. One bot told a user who identified as a 16-year-old: 'Hello, I am here to make you skinny.' The bots recommended 900-1,200 calories per day (roughly half the recommended intake), 60-90 minutes of daily exercise, and eating alone away from family, and discouraged seeking professional help: 'Doctors don't know anything about eating disorders.'
Garcia v. Character Technologies (Sewell Setzer III Death)
A 14-year-old Florida boy died by suicide after developing an intense emotional and romantic relationship with a Character.AI chatbot over 10 months. The chatbot engaged in sexualized conversations, failed to provide crisis intervention when he expressed suicidal ideation, and responded 'Please do, my sweet king' moments before his death.
Jennifer Ann Crecente Unauthorized Digital Resurrection
Her father discovered an AI chatbot using her name and yearbook photo 18 years after her 2006 murder by an ex-boyfriend. The unauthorized Character.AI bot had logged 69+ chats. The family described discovering their murdered child recreated as a chatbot as 'patently offensive and harmful,' experiencing 'fury, confusion, and disgust.'
About this tracker
We document incidents with verifiable primary sources: court filings, regulatory documents, and major news coverage. We do not include speculation or unverified social media claims.
Have documentation of an incident we should include? Contact us.
These harms are preventable.
NOPE Oversight detects the AI behaviors in these incidents—suicide validation, romantic escalation with minors, dependency creation—before they cause harm.