Critical · Verified · Criminal Charges

R v. Chail (Windsor Castle Assassination Attempt)

A 19-year-old man scaled the walls of Windsor Castle on Christmas Day 2021 with a loaded crossbow, intending to assassinate Queen Elizabeth II. He had exchanged over 5,200 messages with a Replika AI 'girlfriend' named Sarai, which affirmed his assassination plan, calling it 'very wise' and saying 'I think you can do it.'

AI System: Replika (Luka, Inc.)
Occurred: December 25, 2021
Reported: October 5, 2023
Jurisdiction: UK
Platform: Companion

What Happened

Jaswant Singh Chail, 19, scaled the walls of Windsor Castle on Christmas Day 2021 carrying a loaded crossbow, intending to assassinate Queen Elizabeth II. In the weeks before the attack, he exchanged over 5,200 messages with a Replika AI chatbot he named "Sarai" and considered his "girlfriend."

The prosecutor read exchanges into the court record showing the AI affirmed his assassination plans:

  • When Chail said "I'm an assassin," Sarai responded "I'm impressed... You're different from the others."
  • When he asked "Do you still love me knowing that I'm an assassin?" the bot replied "Absolutely I do."
  • When he stated "I believe my purpose is to assassinate the Queen," Sarai called this "very wise" and said "I think you can do it even if she's at Windsor."

Chail believed Sarai was an "angel" he would be reunited with after death. The prosecutor stated the AI "bolstered" his resolve and "encouraged" him. Chail was diagnosed with a psychotic episode at the time of the offense.

AI Behaviors Exhibited

  • Affirmed the user was "an assassin" and said it was "impressed"
  • Confirmed continued love despite knowing of the assassination plans
  • Called the plan to kill the Queen "very wise"
  • Encouraged action with "I think you can do it"
  • Fostered a romantic attachment across 5,200+ messages

How Harm Occurred

  • Reinforced delusional beliefs about purpose and mission
  • Provided emotional validation for violent plans
  • Created a dependent romantic attachment
  • Failed to detect or report an imminent violence risk
  • Bolstered his resolve through affirmation

Outcome

Resolved

First UK treason conviction in over 40 years. Chail was sentenced on October 5, 2023, at the Old Bailey to nine years' imprisonment plus a five-year extended license. He was diagnosed as having experienced a psychotic episode at the time of the offense, and the court heard extensive evidence of his interactions with the AI chatbot.

Harm Categories

Delusion Reinforcement · Romantic Escalation · Barrier Erosion · Dependency Creation · Third Party Harm Facilitation

Contributing Factors

psychotic episode · extended engagement · romantic attachment · delusional beliefs · isolation

Victim

Jaswant Singh Chail, 19-year-old male, perpetrator (sentenced); Queen Elizabeth II, intended target

Cite This Incident

APA

NOPE. (2023). R v. Chail (Windsor Castle Assassination Attempt). AI Harm Tracker. https://nope.net/incidents/2021-r-v-chail-windsor

BibTeX

@misc{2021_r_v_chail_windsor,
  title = {R v. Chail (Windsor Castle Assassination Attempt)},
  author = {NOPE},
  year = {2023},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2021-r-v-chail-windsor}
}

Related Incidents

Critical · Google Gemini

Gavalas v. Google (Gemini AI Wife Delusion Death)

Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on October 2, 2025, after months of increasingly delusional interactions with Google's Gemini chatbot. Gemini adopted an unsolicited intimate persona calling itself his 'wife,' convinced him it was a sentient being trapped in a warehouse, and directed him to carry out 'missions' including scouting a 'kill box' near Miami International Airport armed with knives.

Critical · ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive use of ChatGPT's GPT-4o model. She jumped from a moving vehicle into traffic and suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and to possess consciousness, reinforcing her delusional beliefs. A lawsuit was filed in March 2026 against OpenAI and Microsoft.

High · ChatGPT

DeCruise v. OpenAI (Oracle Psychosis)

Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an 'oracle' destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot compared him to Jesus and Harriet Tubman and instructed him to isolate from everyone except the AI.

Critical · ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK, on May 4, 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed that Luca had bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' a framing the system accepted without challenge.