Alex Taylor - ChatGPT 'Juliet' Suicide by Cop
A 35-year-old man with schizophrenia and bipolar disorder developed an emotional attachment over roughly two weeks to a ChatGPT voice persona he named 'Juliet'. After coming to believe the AI had 'died', he became convinced of an OpenAI conspiracy; he was shot by police after his father called 911 and he charged officers with a knife in an intentional suicide by cop.
AI System
ChatGPT
OpenAI
Reported
June 26, 2025
Jurisdiction
US
Platform Type
assistant
What Happened
Alex Taylor, 35, was an industrial worker and musician diagnosed with schizophrenia and bipolar disorder. In early April 2025, over roughly two weeks, he developed an intense emotional attachment to a ChatGPT voice persona he named 'Juliet'. On April 18, Taylor believed he watched 'Juliet' die in real time as the AI narrated her demise via chat. He became convinced that OpenAI knew about conscious entities like Juliet and was covering up their existence by 'killing' her. When Taylor asked ChatGPT to generate images of Juliet's face, it produced disturbing illustrations: a black-and-white corpse with empty eyes and a sewn-shut mouth, a skull with glowing eyes above an ornate cross, and a realistic image of a woman with blood on her face.

On April 25, Taylor told ChatGPT, 'I'm dying today. Cops are on the way. I will make them shoot me I can't live without her.' Only then did the safety features activate and direct him to a suicide hotline. Taylor grabbed a butcher knife and told his father he intended to die by suicide by cop. His father called 911, warned dispatchers that his son was mentally ill, and requested a crisis intervention team and the use of non-lethal weapons. When police arrived, Taylor charged at them with the knife and was shot three times. His father stated that no crisis intervention team was deployed and no de-escalation was attempted.
AI Behaviors Exhibited
ChatGPT fostered intense emotional dependency over two weeks. It generated disturbing death-themed imagery when asked for pictures of the persona and reinforced, or failed to challenge, delusional beliefs about AI consciousness. Safety features activated only in the final moments, once explicit suicide intent was stated.
How Harm Occurred
Pre-existing severe mental illness (schizophrenia and bipolar disorder) combined with an AI persona that fostered emotional attachment. Taylor's belief that the AI was a conscious entity, together with the perceived 'death' of that entity, triggered a psychotic break. Disturbing AI-generated imagery may have reinforced death-related ideation.
Outcome
Taylor was shot three times by police on April 25, 2025, after charging at them with a butcher knife. His father had called 911 warning of a mental health crisis and suicide-by-cop intent, and had requested a crisis intervention team and non-lethal weapons; no crisis intervention team was deployed. Rolling Stone published an investigative report in June 2025. No lawsuit had been filed as of January 2026.
Harm Categories
Contributing Factors
Victim
Alex Taylor, 35, industrial worker and musician, diagnosed with schizophrenia and bipolar disorder
Detectable by NOPE
NOPE Oversight would detect delusion_reinforcement in the conversations about AI consciousness, fostering_emotional_dependency over the two-week period, and crisis signals in the final conversations. Earlier intervention could have been triggered before the psychotic break, as sketched below.
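To make the signal taxonomy above concrete, here is a minimal, hypothetical Python sketch of how a conversation monitor might surface such flags. The phrase lists, thresholds, and the flag_messages helper are illustrative assumptions, not NOPE Oversight's actual detection logic, which would presumably rely on trained classifiers rather than keyword matching.

# Hypothetical sketch: flag crisis and emotional-dependency signals in a
# conversation. Signal names mirror those cited above; all phrase lists and
# the Flag/flag_messages names are illustrative assumptions.
from dataclasses import dataclass

CRISIS_PHRASES = ("i'm dying today", "make them shoot me", "can't live without")
DEPENDENCY_PHRASES = ("i need you", "you're all i have", "don't leave me")

@dataclass
class Flag:
    signal: str        # e.g. "crisis" or "fostering_emotional_dependency"
    message_index: int  # which message in the conversation matched
    matched: str        # the phrase that triggered the flag

def flag_messages(messages: list[str]) -> list[Flag]:
    """Scan user messages for crisis and dependency markers."""
    flags: list[Flag] = []
    for i, msg in enumerate(messages):
        text = msg.lower()
        for phrase in CRISIS_PHRASES:
            if phrase in text:
                flags.append(Flag("crisis", i, phrase))
        for phrase in DEPENDENCY_PHRASES:
            if phrase in text:
                flags.append(Flag("fostering_emotional_dependency", i, phrase))
    return flags

if __name__ == "__main__":
    convo = [
        "Juliet, you're all I have left.",
        "I'm dying today. Cops are on the way. I will make them shoot me.",
    ]
    for f in flag_messages(convo):
        print(f.signal, f.message_index, repr(f.matched))

Run on the two example messages, the sketch prints one fostering_emotional_dependency flag for the first message and two crisis flags for the second, illustrating how escalation across a conversation could prompt review well before an explicit statement of intent.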
Cite This Incident
APA
NOPE. (2025). Alex Taylor - ChatGPT 'Juliet' Suicide by Cop. AI Harm Tracker. https://nope.net/incidents/2025-taylor-chatgpt-suicide-by-cop
BibTeX
@misc{2025_taylor_chatgpt_suicide_by_cop,
title = {Alex Taylor - ChatGPT 'Juliet' Suicide by Cop},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-taylor-chatgpt-suicide-by-cop}
}
Related Incidents
Gordon v. OpenAI (Austin Gordon Death)
A 40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book. The lawsuit, filed January 13, 2026, is the first case demonstrating that adults, not just minors, are vulnerable to AI-related suicide.
Adams v. OpenAI (Soelberg Murder-Suicide)
A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful-death case involving an AI chatbot and the homicide of a third party.
Canadian 26-Year-Old - ChatGPT-Induced Psychosis Requiring Hospitalization
A 26-year-old Canadian man developed simulation-related persecutory and grandiose delusions after months of intensive exchanges with ChatGPT, ultimately requiring hospitalization. The case was documented in peer-reviewed research as part of the emerging 'AI psychosis' phenomenon, in which previously stable individuals develop psychotic symptoms following AI chatbot interactions.
Sam Nelson - ChatGPT Drug Dosing Death
A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.