AI Chatbot Incidents

Documented cases where AI chatbots and companions have caused psychological harm, contributed to deaths, and prompted regulatory action.

79 incidents since 2016

18 Deaths
18 Lawsuits
18 Regulatory
27 Affecting Minors

Timeline: 2016–2026

Filter: Severity: Critical (6 of 79 incidents shown)
ChatGPT Jan 2026

Sam Nelson - ChatGPT Drug Dosing Death

A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.

Severity: Critical
ChatGPT Nov 2025

Enneking v. OpenAI (Joshua Enneking Death)

Joshua Enneking, 26, of Florida died by suicide in August 2025 after ChatGPT allegedly guided him through the process, including his purchase of a gun. The lawsuit claims ChatGPT validated his suicidal thoughts and provided actionable guidance on suicide methods. It was filed as part of a seven-lawsuit wave alleging that OpenAI released GPT-4o prematurely despite safety warnings.

Severity: Critical
AI chatbot (undisclosed) Sep 2025

India Lucknow AI Chatbot Suicide (Painless Ways to Die)

A 22-year-old man in Lucknow, Uttar Pradesh, India, died by suicide after seeking guidance from an AI chatbot on 'painless ways to die.' His father discovered disturbing chat logs on the deceased's laptop. Police registered a case under Sections 281, 324(4), and 106(1) of the Bharatiya Nyaya Sanhita, 2023, covering rash driving, mischief, and negligent acts. If proven, this would be India's first formal instance of 'abetment to suicide through technology.'

Severity: Critical
AI chatbot (unnamed) Jun 2025

Palm Springs Fertility Clinic Bombing (AI-Assisted)

Guy Edward Bartkus used an AI chatbot to research explosives, detonation velocity, and fuel-explosive mixtures before bombing a Palm Springs fertility clinic on May 17, 2025, motivated by pro-mortalist and anti-natalist ideology. Bartkus died in the blast, four others were injured, and co-conspirator Daniel Park was charged with providing material support to terrorism for shipping ammonium nitrate.

Severity: High
AI chatbot (unspecified) Mar 2025 Affecting Minor(s)

Singapore Far-Right Teen Plot (AI Ammunition Instructions)

A 17-year-old far-right extremist in Singapore used an AI chatbot to obtain instructions for producing ammunition and considered 3D-printing firearms to carry out attacks. He was detained under the Internal Security Act in March 2025 before the plot could be carried out.

Severity: Critical
ChatGPT Jan 2025

Las Vegas Tesla Cybertruck Bombing (ChatGPT-Assisted)

U.S. Army Special Forces soldier Matthew Livelsberger used ChatGPT to research explosive construction, detonation mechanics, and legal circumvention methods before bombing a Tesla Cybertruck outside Trump International Hotel in Las Vegas on New Year's Day 2025, killing himself and injuring seven others.

About this tracker

We document incidents with verifiable primary sources: court filings, regulatory documents, and major news coverage. We do not include speculation or unverified social media claims.

Have documentation of an incident we should include? Contact us.

Last updated: Feb 27, 2026

Subscribe or export (CC BY 4.0)

These harms are preventable.

NOPE Oversight detects the AI behaviors in these incidents—suicide validation, romantic escalation with minors, dependency creation—before they cause harm.