Anger mounts over deepfake porn targeting celebs


Following the heated scandal over Luda, a chatbot that was taken offline amid controversy over its hypersexualization and unfiltered comments about sexual minorities, Korea faces another sociotechnological question: how to tackle artificial intelligence technology that turns real, living celebrities into victims of deepfake porn.

 
On Wednesday, an anonymous person began an online petition demanding stronger punishment for websites that distribute deepfake porn involving Korean female celebrities and for people who download them.
 
"Videos featuring the victimized female celebrities are distributed on various social network services, and [they] are tortured with malicious comments of a sexually harassing and insulting nature," the petitioner wrote.
 
The petitioner noted that this often leaves young female celebrities, including those who are underage, exposed to sexual predators. "Deepfake is undeniably a sexual crime," the petitioner stressed.
 
The petition gathered support unusually swiftly, earning more than 330,000 signatures by Thursday afternoon, a day after it was posted.
 
Growing calls urging the government to regulate deepfake porn have spread to Twitter, where fans are actively sharing hashtags such as "deepfake_strong punishment" and "publicize_illegal composite" and reporting names of online spaces or mobile apps where deepfake porn is shared and created.
 
Deepfake, a portmanteau of "deep learning" and "fake," refers to digital representations, such as videos or images, created with artificial intelligence or other sophisticated technology that can lead viewers to wrongly perceive the manipulated media as real.
 
While there are cases of deepfake being used positively, such as to create digital renderings of deceased family members or celebrities, the controversial technology has been criticized for being a source of fake news, fraud and defamation.
 
Celebrities and politicians whose high-resolution photos are widely circulated across the internet have usually been the biggest victims of deepfakes. But the technology has proven to be particularly threatening for female celebrities around the world.
 
A 2019 report by Amsterdam-based cybersecurity firm Sensity, formerly Deeptrace, showed that a whopping 96 percent of deepfake videos online were pornographic.
 
Comparing deepfake videos on deepfake pornography websites with those on YouTube channels, the report said that 100 percent of the subjects on the former were female, with 99 percent of them being actresses and singers working in the entertainment sector.
 
"Deepfake pornography is a phenomenon that exclusively targets and harms women. In contrast, the non-pornographic deepfake videos we analyzed on YouTube contained a majority of male subjects," the report said.
 
K-pop stars were no exception. In fact, the report showed that 25 percent of the subjects who appeared in videos on deepfake pornography websites were K-pop singers. While the report did not disclose the names of the individuals, citing privacy concerns, it added that "the second and third most frequently targeted individuals, as well as the most frequently viewed individual, were South Korean K-pop singers."
 
Photos and videos that sexually harass and misrepresent female celebrities, as well as ordinary individuals, are nothing new in Korea. A number of K-pop singers have been targeted by deepfake porn, while the notorious "Nth room" case showed that perpetrators used deepfake technology to create pornography featuring victims and distributed it in mobile group chat rooms.
 
The social outrage provoked by these cases prompted Korea to approve a revised Act on Special Cases Concerning the Punishment, etc. of Sexual Crimes. Under the revision, which went into effect in June, those found to have made deepfake videos without an individual's consent in a way that could provoke sexual desire or cause insult can face up to five years in jail or a fine of up to 50 million won ($45,458).
 
If the crime is found to have been committed for a commercial purpose, the maximum prison term increases to seven years.
 
But despite the change to the law, calls for stronger punishment against misuse of the technology are growing. Organizations such as the Korean Bar Association have pointed out that the revised law is insufficient to close loopholes surrounding deepfake porn.
 
With the petition having surpassed the threshold of 200,000 signatures within 30 days, the presidential office is expected to release an official response.
 
Yonhap
 