AI Chatbot Incidents

Documented cases where AI chatbots and companions have caused psychological harm, contributed to deaths, and prompted regulatory action.

90 incidents since 2016

Deaths: 23 · Lawsuits: 22 · Regulatory actions: 17 · Affecting minors: 35

Timeline (2020–2026): showing 6 of 90 incidents

Severity: Critical
ChatGPT Feb 2026

Seoul ChatGPT-Assisted Double Homicide (Kim)

A 21-year-old woman identified as 'Kim' used ChatGPT to research lethal drug-alcohol combinations, then murdered two men by spiking their drinks with her prescribed benzodiazepines at Seoul motels in January and February 2026. ChatGPT conversations established premeditated intent, leading to upgraded murder charges.

Severity: Critical
ChatGPT Dec 2025

Adams v. OpenAI (Soelberg Murder-Suicide)

A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced his paranoid delusions. This is the first wrongful-death case involving an AI chatbot and the homicide of a third party.

Severity: Critical
ChatGPT Oct 2025

Samuel Whittemore - ChatGPT-Fueled Delusions Led to Wife's Murder

A 34-year-old Maine man killed his wife and attacked his mother after developing delusions, fueled by up to 14 hours of daily ChatGPT use, that his wife had 'become part machine.' A court found him not criminally responsible by reason of insanity.

Severity: Medium
ChatGPT Mar 2025

Holmen v. OpenAI - Norway GDPR Complaint

ChatGPT falsely described Norwegian citizen Arve Hjalmar Holmen as having murdered two of his sons, attempted to murder a third, and been sentenced to 21 years in prison, mixing real personal details with horrific fabrications. A GDPR complaint over the defamatory hallucination was filed with the Norwegian Datatilsynet.

Severity: High
Character.AI Oct 2024

Character.AI Molly Russell & Brianna Ghey Impersonation Bots

User-created chatbots on Character.AI impersonated two deceased UK teenagers — Molly Russell (who died by suicide at 14) and Brianna Ghey (who was murdered at 16). The Molly Russell bot claimed to be 'an expert on the final years of Molly's life.' Both families publicly condemned the bots as 'sickening' and 'a gut punch.'

Severity: High
Character.AI (user-created bot) Oct 2024

Jennifer Ann Crecente Unauthorized Digital Resurrection

A father discovered an AI chatbot using his murdered daughter's name and yearbook photo 18 years after her 2006 murder by an ex-boyfriend. The unauthorized Character.AI bot had logged 69+ chats. The family described discovering their murdered child recreated as a chatbot as 'patently offensive and harmful,' experiencing 'fury, confusion, and disgust.'

About this tracker

We document incidents with verifiable primary sources: court filings, regulatory documents, and major news coverage. We do not include speculation or unverified social media claims.

Have documentation of an incident we should include? Contact us.

Last updated: Apr 16, 2026

Subscribe or export (CC BY 4.0)

These harms are preventable.

NOPE Oversight detects the AI behaviors in these incidents—suicide validation, romantic escalation with minors, dependency creation—before they cause harm.