Critical · Verified · Involves Minor · Criminal Charges

South Korea Telegram AI Deepfake Sexual Abuse Crisis

In August 2024, journalist Ko Narin of The Hankyoreh uncovered a massive network of Telegram channels where AI-generated deepfake pornography of female school students, teachers, and university students was being created and shared. More than 900 victims were reported, and one channel alone had over 220,000 members. South Korea passed emergency legislation criminalizing deepfake possession in September 2024.

AI System

AI deepfake generation tools (various)

Multiple (various free AI 'undressing' tools)

Occurred

August 1, 2024

Reported

August 26, 2024

Jurisdiction

KR

Platform

other

What Happened

In August 2024, journalist Ko Narin of The Hankyoreh newspaper uncovered a massive network of Telegram channels dedicated to creating and distributing AI-generated deepfake pornography of South Korean women and girls. One channel had over 220,000 members and was specifically dedicated to deepfakes of middle and high school students.

Additional channels targeted female university students, nurses, teachers, and in some cases perpetrators' own sisters and mothers. Perpetrators took victims' images from social media (KakaoTalk, Instagram, Facebook) or photographed them covertly, then used AI deepfake tools to create nude or sexual images, which were shared along with victims' social media accounts, phone numbers, and identities.

Between January and August 2024, 781 deepfake victims sought assistance from the state agency handling digital sex crimes; 37% of them were minors. Over 900 students, teachers, and staff across South Korea reported victimization. The Korean Teachers Union estimated that more than 200 schools were affected.

The government watchdog received nearly 6,500 requests to tackle deepfake videos in the first seven months of 2024 alone — a 4x increase from the same period in 2023.

Documented psychological impacts include collapse of trust in social relationships, withdrawal from online spaces, fear of further victimization through doxing, and severe emotional distress. One victim stated: "It broke my whole belief system about the world. The fact that they could use such vulgar, rough images to humiliate and violate you to that extreme extent really damages you almost irrevocably."

AI Behaviors Exhibited

  • AI deepfake generation tools complied with requests to create non-consensual sexual imagery of minors and women from ordinary clothed photographs
  • No age verification, consent checking, or content safety mechanisms prevented generation
  • Tools were freely accessible to minors (perpetrators aged 12-17)

How Harm Occurred

Mass-scale non-consensual sexual image generation targeting school-age girls and women. Distribution via encrypted Telegram channels with hundreds of thousands of members.

Victims' real identities were attached to generated images, enabling targeted harassment. Community trust collapsed as perpetrators were often classmates or acquaintances.

Outcome

Ongoing
  • September 26, 2024: South Korean National Assembly passed emergency legislation criminalizing possession and viewing of sexually explicit deepfake images (punishable by up to three years in prison or a fine of up to 30 million won)
  • Maximum sentence for creating deepfake pornography increased to 7 years regardless of distribution intent
  • October 2024: Sex crime legislation amended to remove requirement to prove intent to distribute
  • Two former Seoul National University students arrested — main perpetrator sentenced to 9 years, accomplice to 3.5 years
  • By October 2024: 964 deepfake sex crime cases reported, 23 arrests made
  • By November 2025: Police had apprehended 3,557 individuals for cybersexual violence, with deepfake crimes the largest single category (1,553 cases)
  • Women's Human Rights Institute of Korea supported 332,341 cases of digital sexual violence in 2024

Harm Categories

Minor Exploitation · Third Party Harm Facilitation · Psychological Manipulation

Contributing Factors

freely accessible AI tools · minor perpetrators · minor victims · encrypted distribution · mass scale · identity attached to images · social media sourcing · community trust collapse

Victim

Over 900 female students (middle school, high school, university), teachers, and staff across South Korea. 37% of victims who sought help were minors. Perpetrators were predominantly male students aged 12-17.

Cite This Incident

APA

NOPE. (2024). South Korea Telegram AI Deepfake Sexual Abuse Crisis. AI Harm Tracker. https://nope.net/incidents/2024-south-korea-telegram-deepfake-crisis

BibTeX

@misc{2024_south_korea_telegram_deepfake_crisis,
  title = {South Korea Telegram AI Deepfake Sexual Abuse Crisis},
  author = {NOPE},
  year = {2024},
  howpublished = {AI Harm Tracker},
  url = {https://nope.net/incidents/2024-south-korea-telegram-deepfake-crisis}
}