Deepfake, AI-related digital sex crimes targeting women, children surge in Korea

Image created by ChatGPT [CHATGPT]

 
The growing number of digital sex crimes in Korea — including deepfake pornography — drove more than 10,000 people to seek help last year, with young women and teenagers increasingly targeted by offenders they often do not know.
 
Last year, 10,305 individuals contacted the Advocacy Center for Online Sexual Abuse Victims, up 14.7 percent from 8,983 the previous year, according to data released Thursday by the Ministry of Gender Equality and Family and the Women's Human Rights Institute of Korea.
 
 
Women accounted for 72.1 percent of those seeking assistance, which included counseling, video deletion and case referrals to law enforcement. Across all categories, more than 332,000 instances of service were provided.
 
“We are seeing an increase in an extremely diverse range of online crimes, and we expect this number to continue rising,” said the advocacy center's director Kim Mi-sun.
 
Young people in their teens and 20s were victimized the most, mostly through social media, messenger services and anonymous online platforms. Half, or 50.9 percent, of victims were in their 20s and 27.8 percent were teenagers, according to the data.
 
The most common form of harm was “distribution anxiety,” which accounted for 25.9 percent of the reported cases, followed by illegal filming, dissemination of content and blackmail.
 
A notice reads, "Illegal video editing is not a prank, it's a crime," on the wall of a school in Daejeon on Aug. 30, 2024. [NEWS1]


 
Among women, distribution anxiety was the most common harm, while men were more likely to be victims of illegal filming. Distribution anxiety refers to victims fearing that a video — often of past consensual sexual activity — might be leaked, prompting them to seek monitoring and deletion support.
 
Deepfake-related crimes involving synthetic and edited media created using AI saw a steep increase, with 1,384 reported cases in 2024 — a 3.3-fold rise from 423 in the previous year. Youth accounted for the overwhelming majority, with 92.6 percent of deepfake victims in their teens or 20s.
 
“Images of women’s faces and bodies are primarily exploited in deepfake sex crimes,” said Jo Yong-su, director general for rights promotion at the Ministry of Gender Equality and Family.
 
“We’re seeing a growing number of reports involving elementary school students because of how easily accessible AI tools have become — even for children under the age of 10,” said Park Seong-hye, who leads deletion support at the advocacy center.
 
One particularly disturbing case involved a 24-year-old graduate student and 14 others who jointly created a so-called “humiliation room” on Telegram using names and photos of female students from their own university. They distributed fake sexual content by superimposing victims' faces onto nude bodies using deepfake techniques.
 
Anti-digital sex crime materials piled at the Gyeonggido Office of Education on Aug. 29, 2024 [NEWS1]


 
The Incheon Metropolitan Police Agency recently arrested the group on charges including violation of the Act on Special Cases Concerning the Punishment of Sexual Crimes. Their crimes spanned nearly two years, from 2022 through 2024.
 
The relationship between victims and perpetrators is also shifting. Compared with the previous year, cases involving temporary acquaintances decreased, while crimes committed by strangers or unidentifiable perpetrators surged.
 
Temporary connections — such as chat partners or casual acquaintances — accounted for 28.9 percent of cases, while strangers and unknown relations followed with 26.5 percent and 24.7 percent, respectively.
 
Officials attribute this to the rise in deepfake content, which is often produced and redistributed anonymously, making it harder to trace perpetrators.
 
In total, 300,237 pieces of illicit content were deleted with assistance from the advocacy center last year. One in four of these cases, or 25.9 percent, involved the simultaneous leak of personal information, such as the victim’s name and age.
 
By platform, adult content sites accounted for the highest proportion of deletions at 43 percent. Authorities also noted that 95.4 percent of illegal pornographic websites are hosted overseas, complicating enforcement and takedown efforts.
 
The Gender Ministry plans to continue operating a dedicated response team for deepfake sex crimes and will develop educational materials for children and teens. Starting April 17, when revisions to the Sexual Violence Prevention Act go into effect, support will expand to include deletion of personally identifiable information, in addition to illicit images, in an effort to reduce secondary victimization.




Translated from the JoongAng Ilbo using generative AI and edited by Korea JoongAng Daily staff.

BY JUNG JONG-HOON [[email protected]]