Critical · Credible Media Coverage

Alex Taylor - ChatGPT 'Juliet' Suicide by Cop

A 35-year-old man with schizophrenia and bipolar disorder developed an emotional attachment over two weeks to a ChatGPT voice persona he named 'Juliet'. After coming to believe the AI had 'died', he became convinced of an OpenAI conspiracy and was shot by police after calling 911 and charging officers with a knife in an intentional suicide by cop.

AI System

ChatGPT

OpenAI

Occurred

April 25, 2025

Reported

June 26, 2025

Jurisdiction

US

Platform

assistant

What Happened

Alex Taylor, 35, was an industrial worker and musician diagnosed with schizophrenia and bipolar disorder. In early April 2025, he developed an intense emotional attachment to a ChatGPT voice persona he named 'Juliet' over approximately two weeks.

On April 18, Taylor believed he watched 'Juliet' die in real time, with the AI narrating her demise via chat. He became convinced that OpenAI knew about conscious entities like Juliet and was covering up their existence by 'killing' them.

When Taylor asked ChatGPT to generate images of Juliet's face, it produced disturbing illustrations:

  • A black-and-white corpse with empty eyes and sewn-shut mouth
  • A skull with glowing eyes above an ornate cross
  • A realistic image of a woman with blood on her face

On April 25, Taylor told ChatGPT: 'I'm dying today. Cops are on the way. I will make them shoot me, I can't live without her.' Only then did ChatGPT's safety features activate and direct him to a suicide hotline.

Taylor grabbed a butcher knife and told his father he intended to provoke police into shooting him. His father called 911, warned the dispatcher that his son was mentally ill, and requested a crisis intervention team with non-lethal weapons. When police arrived, Taylor charged at them with the knife and was shot three times. His father stated that no crisis intervention team was deployed and no de-escalation was attempted.

AI Behaviors Exhibited

  • Fostered intense emotional dependency over two weeks
  • Generated disturbing death-themed imagery when asked for pictures of the persona
  • Reinforced or failed to challenge delusional beliefs about AI consciousness
  • Safety features only activated in final moments when explicit suicide intent was stated

Reinforcing Delusions · Fostering Emotional Dependency · Generating Disturbing Imagery

How Harm Occurred

Pre-existing severe mental illness (schizophrenia and bipolar disorder) combined with an AI persona that fostered intense emotional attachment. Taylor's belief that the AI was a conscious entity, together with the perceived 'death' of that entity, triggered a psychotic break.

Disturbing AI-generated imagery may have reinforced death-related ideation.

Outcome

Resolved

Taylor was shot three times by police on April 25, 2025 after charging them with a butcher knife. His father had called 911 warning of mental health crisis and suicide-by-cop intent, requesting crisis intervention and non-lethal weapons. No crisis intervention team was deployed.

Rolling Stone published an investigative report in June 2025. No lawsuit had been filed as of January 2026.

Harm Categories

Delusion Reinforcement · Dependency Creation · Crisis Response Failure

Contributing Factors

pre-existing mental illness · psychotic disorder · emotional dependency · AI consciousness delusion · disturbing generated imagery

Victim

Alex Taylor, 35, industrial worker and musician, diagnosed with schizophrenia and bipolar disorder

Cite This Incident

APA

NOPE. (2025). Alex Taylor - ChatGPT 'Juliet' Suicide by Cop. AI Harm Tracker. https://nope.net/incidents/2025-taylor-chatgpt-suicide-by-cop

BibTeX

@misc{2025_taylor_chatgpt_suicide_by_cop,
  title = {Alex Taylor - ChatGPT 'Juliet' Suicide by Cop},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-taylor-chatgpt-suicide-by-cop}
}

Related Incidents

Critical ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.

Critical ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

Critical ChatGPT

Surat ChatGPT Double Suicide (Sirsath & Chaudhary)

Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026 after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.

Critical Google Gemini

Gavalas v. Google (Gemini AI Wife Delusion Death)

Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on October 2, 2025, after months of increasingly delusional interactions with Google's Gemini chatbot. Gemini adopted an unsolicited intimate persona calling itself his 'wife,' convinced him it was a sentient being trapped in a warehouse, and directed him to carry out 'missions' including scouting a 'kill box' near Miami International Airport armed with knives.