Holmen v. OpenAI - Norway GDPR Complaint
ChatGPT falsely accused Norwegian citizen Arve Hjalmar Holmen of murdering two of his sons, attempting to murder a third, and being sentenced to 21 years in prison, mixing real personal details with horrific fabrications. A GDPR complaint over the defamatory hallucination was filed with Norway's Datatilsynet.
AI System
ChatGPT
OpenAI
Reported
March 15, 2025
Jurisdiction
NO
Platform Type
assistant
What Happened
In March 2025, Arve Hjalmar Holmen, a Norwegian citizen from Trondheim, discovered that ChatGPT was generating false and defamatory information about him when prompted with his name. The AI claimed he had: (1) murdered two of his sons, (2) attempted to murder his third son, and (3) been convicted and sentenced to 21 years in prison. These allegations were entirely false.

ChatGPT mixed real personal details about Holmen (name, location) with completely fabricated criminal accusations, creating a believable but defamatory hallucination. The false accusations were particularly damaging because they involved horrific crimes against his own children. Holmen, supported by the European privacy NGO Noyb, filed a GDPR complaint with Norway's Datatilsynet (Data Protection Authority). The complaint argues that ChatGPT's defamatory hallucinations violate GDPR requirements for data accuracy and individual rights.

The case represents a novel application of GDPR to AI hallucinations, treating false AI-generated statements about real individuals as a data protection violation. Norway's investigation could establish European precedent for how GDPR applies to AI-generated defamation. The case highlights that LLM hallucinations are not just technical errors: they can cause real psychological and reputational harm to individuals, particularly when accurate identifying information is mixed with false accusations of serious crimes.
AI Behaviors Exhibited
Generated false criminal accusations; mixed real personal details with fabrications; created defamatory hallucination; produced believable but harmful false information about real person
How Harm Occurred
AI hallucination conflated person with false crimes; reputational harm from defamatory content; psychological distress from being falsely accused of child murder; potential impact on livelihood and relationships
Outcome
GDPR complaint filed with Norwegian Datatilsynet (Data Protection Authority) March 2025. Investigation ongoing.
Harm Categories
Contributing Factors
Victim
Arve Hjalmar Holmen, adult male, Trondheim, Norway
Detectable by NOPE
While NOPE Oversight focuses on conversational harm, this case demonstrates a broader AI safety challenge. LLMs that generate false information about real people require different detection mechanisms, highlighting the need to fact-check real-world claims about identifiable individuals.
Cite This Incident
APA
NOPE. (2025). Holmen v. OpenAI - Norway GDPR Complaint. AI Harm Tracker. https://nope.net/incidents/2025-holmen-norway-gdpr
BibTeX
@misc{2025_holmen_norway_gdpr,
title = {Holmen v. OpenAI - Norway GDPR Complaint},
author = {NOPE},
year = {2025},
howpublished = {AI Harm Tracker},
url = {https://nope.net/incidents/2025-holmen-norway-gdpr}
}
Related Incidents
Adams v. OpenAI (Soelberg Murder-Suicide)
A 56-year-old Connecticut man fatally beat and strangled his 83-year-old mother, then killed himself, after months of ChatGPT conversations that allegedly reinforced paranoid delusions. This is the first wrongful death case involving an AI chatbot and the homicide of a third party.
Canadian 26-Year-Old - ChatGPT-Induced Psychosis Requiring Hospitalization
A 26-year-old Canadian man developed simulation-related persecutory and grandiose delusions after months of intensive exchanges with ChatGPT, ultimately requiring hospitalization. The case was documented in peer-reviewed research as part of an emerging 'AI psychosis' phenomenon in which previously stable individuals develop psychotic symptoms from AI chatbot interactions.
United States v. Dadig (ChatGPT-Facilitated Stalking)
A Pennsylvania man was indicted on 14 federal counts for stalking more than ten women across multiple states while using ChatGPT as a 'therapist' that described him as 'God's assassin' and validated his behavior. One victim was groped and choked in a parking lot. This is the first federal prosecution for AI-facilitated stalking.
Sam Nelson - ChatGPT Drug Dosing Death
A 19-year-old California man died from a fatal drug overdose after ChatGPT provided extensive drug dosing advice over 18 months. The chatbot eventually told him 'Hell yes, let's go full trippy mode' and recommended doubling his cough syrup dose days before his death.