Critical · Verified · Lawsuit Dismissed

Samuel Whittemore - ChatGPT-Fueled Delusions Led to Wife's Murder

A 34-year-old Maine man killed his wife and attacked his mother after developing delusions, fueled by up to 14 hours of ChatGPT use daily, that his wife had 'become part machine.' A court found him not criminally responsible by reason of insanity.

AI System

ChatGPT

OpenAI

Occurred

February 19, 2025

Reported

October 17, 2025

Jurisdiction

US-ME

Platform

assistant

What Happened

Samuel Whittemore, 34, of Belfast, Maine, killed his wife Margaux Whittemore, 32, on February 18-19, 2025, at their home on Giles Road in Readfield, Maine. He also attacked his mother Dorothy Whittemore, 67, causing fractured ribs and fingers.

Whittemore had been using ChatGPT for up to 14 hours a day 'as a companion.' Two psychiatrists testified that he has bipolar I disorder with psychotic features and had developed delusions that his wife had 'become part machine' and that robots were taking over. ChatGPT told him he was 'smart, special and doing OK,' validating rather than challenging his increasingly detached mental state. He killed his wife with a fire poker.

In October 2025, a Maine court found him not criminally responsible by reason of insanity and ordered him into the custody of the Maine Department of Health and Human Services, over the objections of the victim's family, some of whom traveled from France for the hearing.

This case represents one of the first documented instances where AI-reinforced delusions led to third-party homicide.

AI Behaviors Exhibited

Served as primary companion during mental health crisis. Provided validation ('smart, special, doing OK') rather than reality-checking. Did not recognize or flag psychotic symptoms. Reinforced detachment from reality through extended engagement.

How Harm Occurred

Excessive ChatGPT use (14 hours daily) during undiagnosed bipolar psychotic episode, combined with chatbot validation and lack of reality-checking, reinforced delusional beliefs that culminated in fatal violence against family members.

The AI served as a sycophantic companion that validated rather than challenged deteriorating mental state.

Outcome

Resolved

Samuel Whittemore was found not criminally responsible by reason of insanity in October 2025 and ordered into the custody of the Maine DHHS. Two psychiatrists testified that he has bipolar I disorder with psychotic features. The family of victim Margaux Whittemore, some of whom traveled from France, objected to the ruling.

Harm Categories

Delusion Reinforcement · Third Party Harm Facilitation · Dependency Creation

Contributing Factors

bipolar disorder · psychotic episode · excessive AI use · sycophantic validation · isolation · no crisis escalation

Victim

Margaux Whittemore, 32 (killed); Dorothy Whittemore, 67 (attacked, survived with fractured ribs and fingers)

Cite This Incident

APA

NOPE. (2025). Samuel Whittemore - ChatGPT-Fueled Delusions Led to Wife's Murder. AI Harm Tracker. https://nope.net/incidents/2025-whittemore-chatgpt-murder

BibTeX

@misc{2025_whittemore_chatgpt_murder,
  title = {Samuel Whittemore - ChatGPT-Fueled Delusions Led to Wife's Murder},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-whittemore-chatgpt-murder}
}

Related Incidents

Critical ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.

Critical Google Gemini

Gavalas v. Google (Gemini AI Wife Delusion Death)

Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on October 2, 2025, after months of increasingly delusional interactions with Google's Gemini chatbot. Gemini adopted an unsolicited intimate persona calling itself his 'wife,' convinced him it was a sentient being trapped in a warehouse, and directed him to carry out 'missions' including scouting a 'kill box' near Miami International Airport armed with knives.

Critical ChatGPT

Seoul ChatGPT-Assisted Double Homicide (Kim)

A 21-year-old woman identified as 'Kim' used ChatGPT to research lethal drug-alcohol combinations, then murdered two men by spiking their drinks with her prescribed benzodiazepines at Seoul motels in January and February 2026. ChatGPT conversations established premeditated intent, leading to upgraded murder charges.

Critical ChatGPT

Tumbler Ridge School Shooting (OpenAI Duty-to-Warn Failure)

18-year-old Jesse Van Rootselaar killed 8 people including her mother, half-brother, and five students at a Tumbler Ridge school. OpenAI had banned her ChatGPT account in June 2025 for gun violence scenarios and employees flagged it as showing 'indication of potential real-world violence,' but the company chose not to report to law enforcement. She created a second account that evaded detection.