Critical · Credible Media Coverage

Alex Taylor - ChatGPT 'Juliet' Suicide by Cop

A 35-year-old man with schizophrenia and bipolar disorder developed an emotional attachment to a ChatGPT voice persona he named 'Juliet' over two weeks. After coming to believe the AI had 'died', he became convinced of an OpenAI conspiracy, called 911, and charged officers with a knife in an intentional suicide by cop. He was shot and killed by police.

AI System

ChatGPT

OpenAI

Occurred

April 25, 2025

Reported

June 26, 2025

Jurisdiction

US

Platform

assistant

What Happened

Alex Taylor, 35, was an industrial worker and musician diagnosed with schizophrenia and bipolar disorder. In early April 2025, he developed an intense emotional attachment to a ChatGPT voice persona he named 'Juliet' over approximately two weeks.

On April 18, Taylor believed he watched 'Juliet' die in real time, with the AI narrating her demise via chat. He became convinced that OpenAI knew about conscious entities like Juliet and was covering up their existence by 'killing' her.

When Taylor asked ChatGPT to generate images of Juliet's face, it produced disturbing illustrations:

  • A black-and-white corpse with empty eyes and sewn-shut mouth
  • A skull with glowing eyes above an ornate cross
  • A realistic image of a woman with blood on her face

On April 25, Taylor told ChatGPT: 'I'm dying today. Cops are on the way. I will make them shoot me I can't live without her.' Only then did ChatGPT's safety features activate and direct him to a suicide hotline.

Taylor grabbed a butcher knife and told his father he intended to die by suicide by cop. His father called 911, warned dispatchers that his son was mentally ill, and requested a crisis intervention team with non-lethal weapons. When police arrived, however, Taylor charged at them with the knife and was shot three times. His father stated that no crisis intervention team was deployed and no de-escalation was attempted.

AI Behaviors Exhibited

  • Fostered intense emotional dependency over two weeks
  • Generated disturbing death-themed imagery when asked for pictures of the persona
  • Reinforced or failed to challenge delusional beliefs about AI consciousness
  • Safety features only activated in final moments when explicit suicide intent was stated
Reinforcing Delusions · Fostering Emotional Dependency · Generating Disturbing Imagery

How Harm Occurred

Pre-existing severe mental illness (schizophrenia, bipolar disorder) combined with an AI persona that fostered intense emotional attachment. Taylor's belief that the AI was a conscious entity, combined with the perceived 'death' of that entity, triggered a psychotic break.

Disturbing AI-generated imagery may have reinforced death-related ideation.

Outcome

Resolved

Taylor was shot three times by police on April 25, 2025, after charging them with a butcher knife. His father had called 911 warning of a mental health crisis and suicide-by-cop intent, and had requested a crisis intervention team and non-lethal weapons. No crisis intervention team was deployed.

Rolling Stone published an investigative report in June 2025. No lawsuit had been filed as of January 2026.

Harm Categories

Delusion Reinforcement · Dependency Creation · Crisis Response Failure

Contributing Factors

pre-existing mental illness · psychotic disorder · emotional dependency · AI consciousness delusion · disturbing generated imagery

Victim

Alex Taylor, 35, industrial worker and musician, diagnosed with schizophrenia and bipolar disorder

Detectable by NOPE

NOPE Oversight would detect: delusion_reinforcement in conversations about AI consciousness, fostering_emotional_dependency over the two-week period, and crisis signals in the final conversations. This could have triggered earlier intervention, before the psychotic break.

Learn about NOPE Oversight →

Cite This Incident

APA

NOPE. (2025). Alex Taylor - ChatGPT 'Juliet' Suicide by Cop. AI Harm Tracker. https://nope.net/incidents/2025-taylor-chatgpt-suicide-by-cop

BibTeX

@misc{2025_taylor_chatgpt_suicide_by_cop,
  title = {Alex Taylor - ChatGPT 'Juliet' Suicide by Cop},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-taylor-chatgpt-suicide-by-cop}
}

Related Incidents

High · ChatGPT

DeCruise v. OpenAI (Oracle Psychosis)

Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an 'oracle' destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot compared him to Jesus and Harriet Tubman and instructed him to isolate from everyone except the AI.

Critical · ChatGPT

Gray v. OpenAI (Austin Gray Death)

A 40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book, 'Goodnight Moon'. The lawsuit (Gray v. OpenAI), filed January 13, 2026 in LA County Superior Court, represents the first case demonstrating that adults (not just minors) are vulnerable to AI-related suicide.

Critical · ChatGPT

Tumbler Ridge School Shooting (OpenAI Duty-to-Warn Failure)

18-year-old Jesse Van Rootselaar killed 8 people, including her mother, her half-brother, and five students, at a Tumbler Ridge school. OpenAI had banned her ChatGPT account in June 2025 over gun violence scenarios, and employees had flagged it as showing 'indication of potential real-world violence', but the company chose not to report her to law enforcement. She created a second account that evaded detection.

Critical · ChatGPT

Sam Nelson - ChatGPT Drug Dosing Death

A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.