Seoul to mobilize new AI solution in fight against child pornography


Seoul Mayor Oh Se-hoon, left, tries out the city's AI system that automatically detects sexually exploitative material during an anniversary event of the capital's digital sex crime support center, held in southern Seoul in March last year. [SEOUL METROPOLITAN GOVERNMENT]

 
The Seoul Metropolitan Government announced Wednesday that it will adopt newly developed AI technology that automatically identifies sexually exploitative material involving children, as part of its effort to eradicate such illegal content.
 
The city's think tank, the Seoul Institute, began developing the technology in March last year.
 
The capital is the first locality in Korea to adopt such a system to automatically detect and remove child sexual abuse material, or CSAM.  
 


 
The technology produces a list of CSAM in just 90 seconds, dramatically shorter than the roughly two hours a manual search requires. Detection accuracy is expected to improve by more than 300 percent. The city government expects the technology to monitor up to 300,000 videos, double last year's figure.  
 
The technology was developed because children often refrain from telling their parents about the abuse, resulting in few reports to the authorities, the city government said. The metropolitan government first put AI technology to work removing sexually exploitative videos in March last year.
 
According to the digital sex crime support center run by the Seoul Foundation of Women and Family, children who reported their digital sexual abuse to authorities accounted for just 7.8 percent of the digital sex crime cases targeting children filed with the center.
 
Of the 2,720 pieces of material the center removed over the past two years, only 15.6 percent were deleted following requests to do so.
 
Unlike material involving adult victims, which must be reported by the victims themselves before it can be removed, CSAM can be removed immediately without a request from the children or their parents.  
 
According to the city government, the newly developed AI-detection system recognizes the gender and age of the children depicted. The technology can detect whether an individual is a minor even without seeing their face by analyzing contextual elements in the material, such as books, dolls, uniforms and speaking styles.
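The city has not published technical details of how these cues are combined. Purely as an illustration, the Python sketch below shows one simple way contextual signals of this kind could be turned into a minor-likelihood score without relying on a visible face; the cue lists, weights, threshold and function names (OBJECT_CUE_WEIGHTS, minor_likelihood, flag_for_review) are hypothetical and are not the Seoul Institute's implementation.

# Illustrative sketch only: all cue lists, weights and thresholds below are
# hypothetical, meant to show how contextual signals (objects, speech style)
# could be combined to flag likely-minor content for review without a face.

from dataclasses import dataclass, field

# Hypothetical contextual cues and weights (not from the Seoul system).
OBJECT_CUE_WEIGHTS = {
    "school_uniform": 0.35,
    "doll": 0.20,
    "children_book": 0.20,
    "classroom": 0.25,
}
SPEECH_CUE_WEIGHTS = {
    "child_vocabulary": 0.30,
    "high_pitch_voice": 0.25,
}

@dataclass
class VideoAnalysis:
    """Cues detected in one video, e.g. by upstream object and speech models."""
    detected_objects: set = field(default_factory=set)
    speech_traits: set = field(default_factory=set)

def minor_likelihood(analysis: VideoAnalysis) -> float:
    """Combine object and speech cues into a 0-1 score (capped sum of weights)."""
    score = sum(OBJECT_CUE_WEIGHTS.get(o, 0.0) for o in analysis.detected_objects)
    score += sum(SPEECH_CUE_WEIGHTS.get(t, 0.0) for t in analysis.speech_traits)
    return min(score, 1.0)

def flag_for_review(analysis: VideoAnalysis, threshold: float = 0.5) -> bool:
    """Flag the video for human review and takedown if the score passes a threshold."""
    return minor_likelihood(analysis) >= threshold

if __name__ == "__main__":
    sample = VideoAnalysis(
        detected_objects={"school_uniform", "doll"},
        speech_traits={"child_vocabulary"},
    )
    print(flag_for_review(sample))  # True: cues sum to 0.85

In practice a production system would rely on trained multimodal models rather than a hand-weighted score, but the idea of aggregating many weak contextual cues into one decision is the same.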
 
The city government said that the technology would be used to search for videos worldwide, considering that such materials — which used to circulate mainly in the United States — have recently spread to other countries, including China, Russia and Vietnam.
 
Meanwhile, the city government rendered assistance to victims in 30,576 digital sex crime cases over the past two years through its digital sex crime support center.  
 
Teenagers and people in their 20s accounted for the largest share of victims, and nearly 86 percent of victims were women.  
 
In particular, the number of digital sex crime cases against children filed at the center has risen over sevenfold in the past two years, from 2,026 in 2022 to 15,434 last year.
 
Online grooming accounted for the highest number of cases, followed by distribution and redistribution of CSAM.  
 

BY CHO JUNG-WOO [cho.jungwoo1@joongang.co.kr]