Rushed 'Nth room law' unlikely to actually stop criminals
Last month, Korean lawmakers passed a bill aimed at preventing digital sex crimes and the unauthorized sharing of sexually explicit videos and photos taken without consent. The legislation came in the wake of the "Nth room" case, in which multiple secret chat rooms on the Telegram messaging app were used to sexually exploit dozens of women, many of them teenagers.
The bill was passed in just 16 days, but the rush produced an amended law riddled with loopholes and vague definitions, essentially leaving it to local internet giants such as Naver and Kakao to solve the problem.
The internet service providers are required to come up with “technical and administrative” measures to curb the circulation of illegally filmed videos on their platforms. If they fail to do so, they could be fined up to 3 percent of their sales.
There are obvious holes in this solution. First, the law fails to target the digital platforms where the circulation of such videos and photos predominantly took place — private chat rooms on instant messaging apps such as KakaoTalk, Line and Telegram, and closed online communities.
The law will apply only to open online forums, including Daum and Naver's "cafe" communities, according to the Korea Communications Commission, the state communications regulator. The commission issued a press release in response to concerns that covering private chat rooms could invade users' privacy.
But both the Nth room case and the unauthorized sharing of sexual content by celebrities embroiled in last year's Burning Sun scandal took place in private chat rooms.
But, unsurprisingly, there have been “little to no cases” of illicit file sharing in those open communities or blogs, according to a spokesperson at Naver.
When it comes to chat rooms, law enforcement agencies believe Telegram is the platform most favored for criminal activity because of its vaunted security safeguards. The chat rooms used in the Nth room case were hosted on Telegram.
But Telegram is a foreign company with servers scattered across different countries, putting it beyond the reach of the Korean law.
The limitation prompted three associations representing local internet companies to publish a statement denouncing the passage of the amendment on May 20.
Another issue is that the IT companies have neither the authority nor the capability to determine whether somebody had permission to film or share footage or photos.
Putting aside the potential illegality of "monitoring" content on public online forums, public relations officials at both Kakao and Naver said that proving the illegality of visual content is not only extremely tricky, but should be the responsibility of law enforcement agencies.
As such, the only way service providers can police the sharing of illicit content is through reports from victims.
Reporting systems are already available and widely used at Naver and Daum.
Once a user submits a complaint, the platform provider must immediately take down the reported post. A mediation process then begins, followed by a final decision from the platform operator. This process, designed for public forums, is unlikely to have any impact on the sharing of illicit videos and images in private channels.
This half-baked policy is just the latest instance of Korean regulators burdening corporations with extensive rules in a bid to tackle a problem that may be beyond their control.
Back in 2015, the prosecution indicted former Kakao CEO Lee Sir-goo on charges of negligence in connection with child pornography shared on the Kakao messaging app.
Although Lee was acquitted in 2019, the charge prompted his resignation in 2015.
Acknowledging the doubts surrounding the law, the Korea Communications Commission said it will invite the tech giants to help devise a specific enforcement ordinance detailing the target and scope of the reform. That adjustment process will likely take around six months before the law takes effect, according to a source at Kakao.
But given the limited role IT companies can play in detecting illegal content, the bill — called the "Nth room prevention law" by its advocates — is unlikely to fulfill its promise.
Authorities and lawmakers have taken a big, visible step, but now they need to find a more meaningful way to tackle the issue effectively.
The pressing problem is that the revised law will do little to reduce these crimes. The discussion should instead have been directed toward more effective, viable measures, such as strengthening penalties against perpetrators of online sex crimes.
BY PARK EUN-JEE [email@example.com]