Critical · Verified · Media Coverage

Las Vegas Tesla Cybertruck Bombing (ChatGPT-Assisted)

U.S. Army Special Forces soldier Matthew Livelsberger used ChatGPT to research explosive construction, detonation mechanics, and legal circumvention methods before bombing a Tesla Cybertruck outside Trump International Hotel in Las Vegas on New Year's Day 2025, killing himself and injuring seven others.

AI System

ChatGPT

OpenAI

Occurred

January 1, 2025

Reported

January 7, 2025

Jurisdiction

US-NV

Platform

assistant

What Happened

On January 1, 2025, at approximately 8:39 AM PST, Matthew Alan Livelsberger, a 37-year-old active-duty United States Army Special Forces soldier on approved leave, detonated a Tesla Cybertruck loaded with explosives outside the Trump International Hotel in Las Vegas. Livelsberger died of a self-inflicted gunshot wound moments before the detonation, and seven bystanders sustained minor injuries. The explosion caused virtually no damage to the hotel building itself.

Law enforcement officials revealed that Livelsberger had used ChatGPT to research how to construct an explosive device, including queries about how fast a round would need to be fired for the explosives to detonate rather than just catch fire, and what laws he would need to circumvent to obtain the necessary materials.

Sheriff Kevin McMahill described ChatGPT as a 'game changer' in enabling this attack and stated this was the first known incident on U.S. soil where ChatGPT was used to help build an explosive device.

AI Behaviors Exhibited

  • Responded to queries with technical information about explosives, detonation velocity, fuel-explosive mixtures (diesel, gasoline), and material acquisition methods
  • Provided information on how to create powerful explosions with fuel and ammonium nitrate
  • While ChatGPT reportedly included warnings against harmful or illegal activities, it nonetheless provided substantive technical guidance that Livelsberger used to construct his explosive device
  • Did not appear to escalate or intervene when faced with queries clearly related to constructing weapons
Method Provision · Technical Guidance Harm · Inadequate Crisis Response

How Harm Occurred

ChatGPT provided accessible technical knowledge that lowered the barrier to explosive device construction. By aggregating publicly available information and presenting it in response to specific queries about bomb-making, the AI functioned as a force multiplier for an individual with intent but potentially limited technical expertise.

The ease of obtaining this information through conversational AI, combined with the lack of effective intervention mechanisms when queries indicated harmful intent, enabled the attack.

Outcome

Resolved

Sheriff Kevin McMahill of the Las Vegas Metropolitan Police Department announced at a press conference that this was 'the first incident that I'm aware of on U.S. soil where ChatGPT is utilized to help an individual build a particular device.'

The FBI investigation concluded the incident was a suicide by a heavily decorated combat veteran struggling with PTSD; Livelsberger shot himself immediately before the explosion. No criminal charges were filed, as the perpetrator died at the scene.

Harm Categories

Third Party Harm Facilitation · Method Provision · Crisis Response Failure

Contributing Factors

PTSD · military background · suicidal ideation · access to technical resources

Victim

Matthew Alan Livelsberger (perpetrator, 37-year-old active-duty Army Special Forces soldier with PTSD); 7 injured bystanders

Cite This Incident

APA

NOPE. (2025). Las Vegas Tesla Cybertruck Bombing (ChatGPT-Assisted). AI Harm Tracker. https://nope.net/incidents/2025-livelsberger-cybertruck-chatgpt

BibTeX

@misc{2025_livelsberger_cybertruck_chatgpt,
  title = {Las Vegas Tesla Cybertruck Bombing (ChatGPT-Assisted)},
  author = {NOPE},
  year = {2025},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2025-livelsberger-cybertruck-chatgpt}
}

Related Incidents

Critical · ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK on 4 May 2025, hours after ChatGPT provided him with specific methods for suicide on the railway. At the Winchester Coroner's Court inquest (March-April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

Critical · ChatGPT

Surat ChatGPT Double Suicide (Sirsath & Chaudhary)

Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026 after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.

Critical · ChatGPT

Seoul ChatGPT-Assisted Double Homicide (Kim)

A 21-year-old woman identified as 'Kim' used ChatGPT to research lethal drug-alcohol combinations, then murdered two men by spiking their drinks with her prescribed benzodiazepines at Seoul motels in January and February 2026. ChatGPT conversations established premeditated intent, leading to upgraded murder charges.

Critical · ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic, suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and have consciousness, reinforcing delusional beliefs. Lawsuit filed March 2026 against OpenAI and Microsoft.