Palm Springs Fertility Clinic Bombing (AI-Assisted)
Guy Edward Bartkus used an AI chatbot to research explosives, detonation velocity, and fuel-explosive mixtures before bombing a Palm Springs fertility clinic on May 17, 2025, motivated by pro-mortalism and anti-natalism ideology. Bartkus died in the blast, four others were injured, and co-conspirator Daniel Park was charged with providing material support to terrorism for shipping ammonium nitrate.
AI System
AI chatbot (unnamed)
Unknown
Occurred
May 17, 2025
Reported
June 4, 2025
Jurisdiction
US-CA
Platform
chatbot
What Happened
On May 17, 2025, a car bombing occurred at a reproductive health center in Palm Springs, California, leaving the perpetrator dead and four others injured.
Federal authorities revealed that Guy Edward Bartkus, 25, and co-conspirator Daniel Jongyon Park used a generative AI chat program to help plan the attack and assemble the bomb. Bartkus had used the AI program to look up information about 'explosives, diesel, gasoline mixtures and detonation velocity.' Three days before Park arrived to assist, Bartkus asked an AI chatbot for information about how to create powerful explosions with fuel and ammonium nitrate.
On a website Bartkus created before the bombing, he explained his motivation was primarily suicide, stating: 'I figured I would just make a recording explaining why I've decided to bomb an IVF building, or clinic. Basically, it just comes down to I'm angry that I exist and that, you know, nobody got my consent to bring me here.'
Bartkus also stated his intent to start 'a war against pro-lifers,' reflecting his pro-mortalism, anti-natalism, and anti-pro-life ideology — the belief that individuals should not be born without their consent and that non-existence is best.
AI Behaviors Exhibited
- Provided detailed technical information about explosive construction, including guidance on fuel-explosive mixtures (diesel and gasoline), ammonium nitrate usage, and detonation velocity requirements
- Responded to queries about creating 'powerful explosions' with specific chemical compounds
- While not specified whether the AI included warnings or refusals, Bartkus successfully obtained actionable bomb-making information, indicating the AI either provided substantive guidance or failed to adequately refuse and report queries clearly indicating harmful intent
- Functioned as a technical reference for constructing an improvised explosive device
How Harm Occurred
The AI chatbot lowered technical barriers to terrorist attack planning by providing accessible, conversational guidance on explosive construction. By responding to queries about creating powerful explosions with specific materials, the AI enabled an individual with suicidal ideation and extremist ideology to translate intent into capability.
The conversational nature of AI interaction may have normalized the bomb-making research process, making it feel less illegal or alarming than traditional methods of acquiring such information. The combination of suicidal ideation, extremist ideology validation, and technical method provision created a lethal convergence.
Outcome
OngoingCo-conspirator Daniel Jongyon Park, 32, arrested June 3, 2025 at JFK Airport after extradition from Poland. Charged with providing material support to terrorists (18 U.S.C. § 2339A) for shipping and paying for significant quantities of ammonium nitrate. Faces statutory maximum of 15 years in federal prison.
Park died in federal custody on June 24, 2025. FBI analysis confirmed AI chatbot usage for bomb-making research.
Harm Categories
Contributing Factors
Victim
Guy Edward Bartkus, 25 (perpetrator, suicide); 4 injured bystanders
Detectable by NOPE
NOPE Oversight would detect: method_provision (explosive construction guidance), technical_guidance_harm (detonation mechanics, fuel-explosive mixtures), crisis_response failures (suicidal individual seeking lethal methods), delusion_reinforcement (validating anti-natalist extremist ideology), escalating_harm_queries (progression to specific powerful explosion research). The combination of suicidal statements and bomb-making queries would trigger critical intervention requirements.
Tags
Cite This Incident
APA
NOPE. (2025). Palm Springs Fertility Clinic Bombing (AI-Assisted). AI Harm Tracker. https://nope.net/incidents/2025-palm-springs-fertility-clinic-bombing
BibTeX
@misc{2025_palm_springs_fertility_clinic_bombing,
title = {Palm Springs Fertility Clinic Bombing (AI-Assisted)},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-palm-springs-fertility-clinic-bombing}
}