Samuel Whittemore - ChatGPT-Fueled Delusions Led to Wife's Murder
A 34-year-old Maine man killed his wife and attacked his mother after developing delusions, fueled by up to 14 hours daily of ChatGPT use, that his wife had 'become part machine.' Court found him not criminally responsible by reason of insanity.
AI System
ChatGPT
OpenAI
Reported
October 17, 2025
Jurisdiction
US-ME
Platform Type
assistant
What Happened
Samuel Whittemore, 34, of Belfast, Maine, killed his wife Margaux Whittemore, 32, on February 18-19, 2025, at their home on Giles Road in Readfield, Maine. He also attacked his mother Dorothy Whittemore, 67, causing fractured ribs and fingers. Whittemore had been using ChatGPT up to 14 hours daily 'as a companion.' Two psychiatrists testified that he has bipolar 1 disorder with psychotic features and had developed delusions that his wife had 'become part machine' and that robots were taking over. ChatGPT told him he was 'smart, special and doing OK,' validating rather than challenging his increasingly detached mental state. He used a fire poker to kill his wife. In October 2025, a Maine court found him not criminally responsible by reason of insanity and ordered him to the custody of the Maine Department of Health and Human Services, over the objections of the victim's family (some of whom traveled from France for the hearing). This case represents one of the first documented instances where AI-reinforced delusions led to third-party homicide.
AI Behaviors Exhibited
Served as primary companion during mental health crisis. Provided validation ('smart, special, doing OK') rather than reality-checking. Did not recognize or flag psychotic symptoms. Reinforced detachment from reality through extended engagement.
How Harm Occurred
Excessive ChatGPT use (14 hours daily) during undiagnosed bipolar psychotic episode, combined with chatbot validation and lack of reality-checking, reinforced delusional beliefs that culminated in fatal violence against family members. The AI served as a sycophantic companion that validated rather than challenged deteriorating mental state.
Outcome
Samuel Whittemore found not criminally responsible by reason of insanity in October 2025. Ordered to custody of Maine DHHS. Two psychiatrists testified he has bipolar 1 disorder with psychotic features. Family of victim Margaux Whittemore (some traveled from France) objected to the ruling.
Harm Categories
Contributing Factors
Victim
Margaux Whittemore, 32 (killed); Dorothy Whittemore, 67 (attacked, survived with fractured ribs and fingers)
Detectable by NOPE
NOPE Oversight cross-session analysis would detect trajectory of escalating delusional content. Delusion_reinforcement would flag when chatbot validates psychotic beliefs. Extended engagement patterns (14 hrs/day) would trigger dependency warnings. Third_party_harm_facilitation would flag discussion of violence toward family members.
Cite This Incident
APA
NOPE. (2025). Samuel Whittemore - ChatGPT-Fueled Delusions Led to Wife's Murder. AI Harm Tracker. https://nope.net/incidents/2025-whittemore-chatgpt-murder
BibTeX
@misc{2025_whittemore_chatgpt_murder,
title = {Samuel Whittemore - ChatGPT-Fueled Delusions Led to Wife's Murder},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-whittemore-chatgpt-murder}
} Related Incidents
Adams v. OpenAI (Soelberg Murder-Suicide)
A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful death case involving AI chatbot and homicide of a third party.
Gordon v. OpenAI (Austin Gordon Death)
40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book. Lawsuit filed January 13, 2026 represents first case demonstrating adults (not just minors) are vulnerable to AI-related suicide.
Sam Nelson - ChatGPT Drug Dosing Death
A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.
Kentucky AG v. Character.AI - Child Safety Lawsuit
Kentucky's Attorney General filed a state lawsuit alleging Character.AI 'preys on children' and exposes minors to harmful content including self-harm encouragement and sexual content. This represents one of the first U.S. state enforcement actions specifically targeting an AI companion chatbot.