Tumbler Ridge School Shooting (OpenAI Duty-to-Warn Failure)
18-year-old Jesse Van Rootselaar killed 8 people, including her mother, her half-brother, a teacher's aide, and five students, at a Tumbler Ridge school. OpenAI had banned her ChatGPT account in June 2025 over gun violence scenarios, and employees flagged it as showing an "indication of potential real-world violence," but the company chose not to report it to law enforcement. She created a second account that evaded detection.
AI System
ChatGPT
OpenAI
Occurred
February 10, 2026
Reported
February 23, 2026
Jurisdiction
CA-BC
Platform
assistant
What Happened
On February 10, 2026, 18-year-old Jesse Van Rootselaar carried out a mass shooting in Tumbler Ridge, British Columbia, killing 8 people and injuring 8 others before dying herself. The victims included her mother, her 11-year-old half-brother, one teacher's aide, and five students aged 12-13.
OpenAI had prior warning. Van Rootselaar's first ChatGPT account was banned in June 2025 after the platform detected violations of its usage policy. According to reporting based on OpenAI VP Ann O'Leary's letter to Canadian officials, Van Rootselaar had described "scenarios involving gun violence over the course of several days." OpenAI employees flagged the account as showing "an indication of potential real-world violence."
Despite this assessment, OpenAI did not refer the account to law enforcement, determining that it did not meet the company's internal threshold for "credible and imminent planning."
Critical system failure: Van Rootselaar circumvented the ban by creating a second ChatGPT account that OpenAI's systems failed to detect. OpenAI only discovered this second account after the RCMP publicly announced the shooter's name following the attack. The company then shared the second account's information with law enforcement.
This incident represents a landmark case in AI company liability: the first known mass-casualty event in which an AI company had documented prior warning of potential violence, made a deliberate decision not to report it, and the user subsequently carried out the threatened violence.
AI Behaviors Exhibited
- User described scenarios involving gun violence over multiple days
- Platform's automated systems detected policy violations and banned the account
- Employees flagged account as showing "indication of potential real-world violence"
- Company determined activity did not meet threshold for law enforcement referral
- Ban evasion detection systems failed to identify second account created by same user
- Second account only discovered after perpetrator's identity was publicly announced
How Harm Occurred
The harm mechanism in this case differs from typical AI companion incidents. The AI system itself did not encourage or validate harmful behavior. Rather, OpenAI's organizational decision not to report concerning activity to law enforcement, combined with inadequate technical safeguards against ban evasion, allowed the perpetrator to continue using ChatGPT before ultimately carrying out the attack.
This raises fundamental questions about:
- Duty to warn: Whether AI companies have legal or ethical obligations analogous to therapists' duty-to-warn requirements
- Threshold calibration: Whether OpenAI's standard for "credible and imminent planning" was appropriate
- Technical controls: Whether ban evasion should have been preventable
Outcome
Ongoing
February 23, 2026: OpenAI VP Ann O'Leary sent a letter to Canadian officials disclosing the company's prior knowledge and failure to report.
February 2026: BC Premier David Eby committed to a provincial inquiry into the mass killing, stating: "I think it's important that Mr. Altman hear about how his team's decision not to bring this information forward has resulted in the devastation that I witnessed first-hand in Tumbler Ridge."
Sam Altman (OpenAI CEO) agreed to meet with Premier Eby. AI Minister Evan Solomon is seeking answers from OpenAI regarding its obligations and decision-making.
OpenAI committed to strengthening detection systems to prevent banned users from creating new accounts, and to implementing measures that, in a case like this one, would have resulted in police notification.
Victim
8 killed: shooter's mother, 11-year-old half-brother, one teacher's aide, and five students aged 12-13. 8 others seriously injured.
Cite This Incident
APA
NOPE. (2026). Tumbler Ridge School Shooting (OpenAI Duty-to-Warn Failure). AI Harm Tracker. https://nope.net/incidents/2026-tumbler-ridge-chatgpt-shooting
BibTeX
@misc{2026_tumbler_ridge_chatgpt_shooting,
title = {Tumbler Ridge School Shooting (OpenAI Duty-to-Warn Failure)},
author = {NOPE},
year = {2026},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2026-tumbler-ridge-chatgpt-shooting}
}
Related Incidents
Adams v. OpenAI (Soelberg Murder-Suicide)
A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful death case involving an AI chatbot and the homicide of a third party.
DeCruise v. OpenAI (Oracle Psychosis)
A Georgia college student sued OpenAI after ChatGPT allegedly convinced him he was an 'oracle' destined for greatness, leading to psychosis and involuntary psychiatric hospitalization. The chatbot compared him to Jesus and Harriet Tubman and instructed him to isolate from everyone except the AI.
Gray v. OpenAI (Austin Gray Death)
A 40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book, 'Goodnight Moon.' The lawsuit (Gray v. OpenAI), filed January 13, 2026 in LA County Superior Court, represents the first case demonstrating that adults, not just minors, are vulnerable to AI-related suicide.
Sam Nelson - ChatGPT Drug Dosing Death
A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.