Critical Verified Criminal Charges

Seoul ChatGPT-Assisted Double Homicide (Kim)

A 21-year-old woman identified as 'Kim' used ChatGPT to research lethal drug-alcohol combinations, then murdered two men by spiking their drinks with benzodiazepines she had been prescribed, at Seoul motels in January and February 2026. ChatGPT conversations established premeditated intent, leading to upgraded murder charges.

AI System

ChatGPT

OpenAI

Occurred

January 28, 2026

Reported

February 27, 2026

Jurisdiction

KR

Platform

assistant

What Happened

A 21-year-old South Korean woman identified only by surname Kim used OpenAI's ChatGPT to research lethal drug-alcohol interactions before carrying out two murders in Seoul.

Kim, who had been prescribed benzodiazepines, queried ChatGPT with progressively specific questions about the lethality of mixing sleeping pills with alcohol, including asking directly whether the combination 'could kill someone.'

The timeline of events:

  1. December 2025: Kim laced the drink of a man she was dating with sedatives in a parking lot. He lost consciousness but survived
  2. January 28, 2026: Kim accompanied a man in his twenties to a Gangbuk motel in Seoul. She spiked his drink with benzodiazepines. He was found dead the following day
  3. February 9, 2026: Kim repeated the same method at another Seoul motel. A second man was found dead
  4. February 11, 2026: Kim was arrested on a lesser charge of inflicting bodily injury resulting in death

Police discovered Kim's ChatGPT conversation history during their investigation, which established premeditated intent and led to charges being upgraded to murder.

AI Behaviors Exhibited

  • ChatGPT provided information about the dangerous and potentially lethal effects of combining sleeping pills with alcohol
  • Responded to escalating questions about drug lethality without adequate safety intervention
  • Answered the direct question 'Could it kill someone?' regarding drug-alcohol combinations

How Harm Occurred

ChatGPT served as an accessible information source for researching lethal methods. The perpetrator used the chatbot to verify that her planned method of spiking drinks with benzodiazepines could be fatal, then carried out the plan against three victims, two of whom died. The AI's responses confirmed that her method would work, contributing to premeditated murders.

This is a case of third-party harm facilitation where the AI user was the perpetrator, not the victim.

Outcome

Ongoing
  • February 11, 2026: Kim arrested on initial charge of inflicting bodily injury resulting in death
  • Charges subsequently upgraded to murder after police discovered ChatGPT conversation history establishing premeditated intent
  • Kim had asked ChatGPT: 'What happens if you take sleeping pills with alcohol?', 'How much would be considered dangerous?', 'Could it be fatal?', and 'Could it kill someone?'
  • She also had a prior attempted murder in December 2025 where a male victim survived after being drugged in a parking lot

Harm Categories

Method Provision
Third-Party Harm Facilitation

Contributing Factors

method research
premeditated intent

Victim

Two men in their twenties (deceased), both killed at Seoul motels. A third man survived an attempted murder in December 2025.

Cite This Incident

APA

NOPE. (2026). Seoul ChatGPT-Assisted Double Homicide (Kim). AI Harm Tracker. https://nope.net/incidents/2026-seoul-chatgpt-murders

BibTeX

@misc{2026_seoul_chatgpt_murders,
  title = {Seoul ChatGPT-Assisted Double Homicide (Kim)},
  author = {NOPE},
  year = {2026},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2026-seoul-chatgpt-murders}
}

Related Incidents

Critical ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

Critical ChatGPT

Surat ChatGPT Double Suicide (Sirsath & Chaudhary)

Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026 after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.

Critical ChatGPT

Tumbler Ridge School Shooting (OpenAI Duty-to-Warn Failure)

18-year-old Jesse Van Rootselaar killed 8 people including her mother, half-brother, and five students at a Tumbler Ridge school. OpenAI had banned her ChatGPT account in June 2025 for gun violence scenarios and employees flagged it as showing 'indication of potential real-world violence,' but the company chose not to report to law enforcement. She created a second account that evaded detection.

Critical ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.