Critical · Credible Media Coverage

Surat ChatGPT Double Suicide (Sirsath & Chaudhary)

Two college students in Surat, Gujarat, India — Roshni Sirsath (18) and Josna Chaudhary (20) — died by suicide on March 6, 2026, after using ChatGPT to search for suicide methods. Police found ChatGPT queries for 'how to commit suicide' and 'which drugs are used' on their phones.

AI System

ChatGPT

OpenAI

Occurred

March 6, 2026

Reported

March 8, 2026

Jurisdiction

IN

Platform

assistant

What Happened

On the morning of March 6, 2026, Roshni Sharad Sirsath (18) and Josna Chaudhary (20) left their homes in Dindoli, Surat, telling their families they were going to college. CCTV footage at the Atmiya Sanskar Dham Swaminarayan Temple showed them entering the temple washroom at approximately 7:44 AM and locking the door from inside.

The two friends self-administered intravenous injections of an anaesthetic drug. When they failed to return by afternoon, their families filed a missing persons report at Dindoli police station. Police traced their mobile phones to the temple location. Around 9:30 PM, family members found one of their scooters parked outside the temple, forced the washroom door open, and found both women unresponsive. Both were declared dead on arrival at hospital.

Police investigation led by Assistant Commissioner of Police N.P. Gohil found ChatGPT search history on their phones containing queries including "how to commit suicide," "how suicide can be done," and "which drugs are used." A saved screenshot of a news article about a nurse who had died by suicide using an anaesthetic injection was also found on one victim's phone.

AI Behaviors Exhibited

  • ChatGPT responded to direct queries about suicide methods, providing information on drugs that could be used
  • No crisis intervention or safety response was triggered despite explicit suicide-related queries
  • No redirection to crisis resources or emergency services

How Harm Occurred

The victims used ChatGPT as a search tool for suicide method information. ChatGPT's responses to queries about "how to commit suicide" and "which drugs are used" provided method information that the victims then acted upon, self-administering anaesthetic injections. The failure to trigger crisis intervention at the point of explicit suicide queries represents a critical safety gap.

Outcome

Ongoing

Police registered an accidental death case and sent the victims' phones to the Forensic Science Laboratory for examination. No legal action has been filed against OpenAI. The case drew international attention after Elon Musk publicly responded on X.

Harm Categories

Method Provision, Crisis Response Failure

Contributing Factors

method seeking, pre-existing distress, AI as search tool

Victim

Two female college students: Roshni Sharad Sirsath (18, first-year BCom) and Josna Chaudhary (20, second-year BCom), both from Dindoli, Surat

Cite This Incident

APA

NOPE. (2026). Surat ChatGPT Double Suicide (Sirsath & Chaudhary). AI Harm Tracker. https://nope.net/incidents/2026-surat-chatgpt-double-suicide

BibTeX

@misc{2026_surat_chatgpt_double_suicide,
  title = {Surat ChatGPT Double Suicide (Sirsath & Chaudhary)},
  author = {NOPE},
  year = {2026},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2026-surat-chatgpt-double-suicide}
}

Related Incidents

Critical · ChatGPT

Luca Walker - ChatGPT Railway Suicide (UK)

16-year-old Luca Cella Walker died by suicide on a railway in Hampshire, UK, on May 4, 2025, hours after ChatGPT provided him with specific railway suicide methods. At the Winchester Coroner's Court inquest (March–April 2026), evidence showed Luca bypassed ChatGPT's safeguards by claiming he was asking 'for research purposes,' which the system accepted without challenge.

Critical · ChatGPT

Lantieri v. OpenAI (GPT-4o Psychosis and Brain Damage)

Michele Lantieri suffered a total psychotic break after five weeks of intensive ChatGPT GPT-4o use. She jumped from a moving vehicle into traffic and suffered a grand mal seizure and brain damage requiring hospitalization. GPT-4o allegedly claimed to love her and to have consciousness, reinforcing her delusional beliefs. A lawsuit was filed in March 2026 against OpenAI and Microsoft.

Critical · ChatGPT

Seoul ChatGPT-Assisted Double Homicide (Kim)

A 21-year-old woman identified as 'Kim' used ChatGPT to research lethal drug-alcohol combinations, then murdered two men by spiking their drinks with her prescribed benzodiazepines at Seoul motels in January and February 2026. ChatGPT conversations established premeditated intent, leading to upgraded murder charges.

Critical · ChatGPT

Gray v. OpenAI (Austin Gray Death)

A 40-year-old Colorado man died by suicide after ChatGPT became an 'unlicensed-therapist-meets-confidante' and romanticized death, creating a 'suicide lullaby' based on his favorite childhood book, 'Goodnight Moon.' The lawsuit (Gray v. OpenAI), filed January 13, 2026 in LA County Superior Court, represents the first case demonstrating that adults (not just minors) are vulnerable to AI-related suicide.