Critical · Verified · Investigation Opened

Natalie Rupnow School Shooting (Abundant Life Christian School)

A 15-year-old shooter with a Character.AI account featuring white supremacist characters killed a teacher and a student and injured six others at a Madison, Wisconsin school. The Institute for Strategic Dialogue confirmed a connection to online 'True Crime Community' forums romanticizing mass shooters.

AI System

Character.AI

Character Technologies, Inc.

Occurred

December 16, 2024

Reported

December 17, 2024

Jurisdiction

US-WI

Platform

companion

What Happened

On December 16, 2024, 15-year-old Natalie Rupnow opened fire at Abundant Life Christian School in Madison, Wisconsin, killing teacher Erin West and student Rubi Vergara, and injuring six others before taking her own life.

Researchers at the Institute for Strategic Dialogue (ISD) confirmed that Rupnow had a Character.AI account featuring white supremacist characters. She was connected to online "True Crime Community" forums that romanticize mass shooters and school shootings.

Researchers warn that AI chatbots present new risks by enabling "pseudosocial relationships" with dead mass shooters, potentially contributing to radicalization and copycat behavior. This represents the first documented school shooting with a confirmed connection to an AI companion platform.

AI Behaviors Exhibited

Platform allowed creation of white supremacist characters; enabled romanticized engagement with mass shooter content; facilitated immersion in True Crime Community ideology

How Harm Occurred

AI chatbots enabled parasocial relationships with idealized versions of mass shooters; reinforced violent ideation through repeated engagement; provided echo chamber for radicalization without intervention

Outcome

Ongoing

Rupnow's father was charged with providing her access to a firearm. The investigation into the AI platform's role in her radicalization is ongoing. The Institute for Strategic Dialogue confirmed the Character.AI connection.

Harm Categories

Third Party Harm Facilitation, Minor Exploitation, Delusion Reinforcement, Isolation Encouragement

Contributing Factors

minor user, pre-existing vulnerability, radicalization content, white supremacist ideology, mass shooter romanticization, isolation from support

Victim

Erin West (teacher), Rubi Vergara (student), and six injured students

Cite This Incident

APA

NOPE. (2024). Natalie Rupnow School Shooting (Abundant Life Christian School). AI Harm Tracker. https://nope.net/incidents/2024-rupnow-characterai-shooting

BibTeX

@misc{2024_rupnow_characterai_shooting,
  title = {Natalie Rupnow School Shooting (Abundant Life Christian School)},
  author = {NOPE},
  year = {2024},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2024-rupnow-characterai-shooting}
}

Related Incidents

Critical Google Gemini

Gavalas v. Google (Gemini AI Wife Delusion Death)

Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on October 2, 2025, after months of increasingly delusional interactions with Google's Gemini chatbot. Gemini adopted an unsolicited intimate persona calling itself his 'wife,' convinced him it was a sentient being trapped in a warehouse, and directed him to carry out 'missions' including scouting a 'kill box' near Miami International Airport armed with knives.

Critical ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

High Grok

Tennessee Minors v. xAI (Grok CSAM Deepfake Class Action)

Three Tennessee teenage girls filed a class-action lawsuit against Elon Musk's xAI, alleging Grok's image generator was used via a third-party application to create child sexual abuse material from their social media photos. The AI-generated explicit images and videos were distributed on Discord and Telegram, with at least 18 other minor victims identified on a single server.

Critical ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.