Deepfake content a new reality of investment scams


A still from the deepfake video featuring actor Zo In-sung presented to investors by an investment fraud gang. [JOONGANG PHOTO]


 
A fraud victim in his 60s lost 160 million won ($120,500) after joining an investment scam group that was promoted through a YouTube video. The victim says he dropped his suspicions after seeing footage featuring renowned actors such as Zo In-sung and Song Hye-kyo.
 
Financial fraud using deepfake videos starring famous figures is increasing in Korea.
 
Deepfake content consists of images or videos manipulated with artificial intelligence (AI) technology, showing digitally generated figures that are difficult to distinguish from real footage.
 
Earlier this year, a group of investment fraudsters in Korea used a deepfake video to deceive over 100 people into investing money, resulting in hundreds of millions of won in damages.
 
One of the scammers, holding the title of group manager, deceived the victims by claiming to be a close aide to "Battery Man" Park Soon-hyeok, a YouTuber well known among EV battery investors.
 
The manager claimed that he would invest on behalf of the group members with the money provided.
 
The scam group also uploaded a video online for investors featuring Zo and Song praising the unauthorized trading group for donating a portion of its investment proceeds.
 
Many investors, including the 60-something victim, believed the fraud was legitimate after seeing the video because they thought famous stars were participating alongside them.
 
In the end, however, everything was revealed to be a scam.
 
The video was a deepfake using AI technology to make the actors look like they were talking in the clip.
 
Nor was the manager actually an aide to Park, and he had never achieved an 800 percent return on investments as claimed.
 
The victim requested a refund from the management team, but it vanished after demanding commissions and taxes.
 
Another victim of the fraud lost 500 million won, saying that he had put the savings intended for his daughter’s wedding and his business into the scheme.
 
“The perpetrators are still on the run,” the victim added. “They may be deceiving somebody else using deepfake videos elsewhere."
 
Last November, a voice phishing gang that conned 149.1 billion won from 1,891 people was busted by the police right before it attempted to scam people using deepfake footage.
 
Police discovered that the gang was in the process of making deepfake videos with an actual prosecutor’s face and voice to deceive people.
 
Financial fraud using deepfake videos, however, is not an issue limited to Korea.
 
According to media reports Wednesday, a Hong Kong company employee transferred 200 million Hong Kong dollars ($25.5 million) early this month after being deceived by a deepfake video impersonating the company’s finance officer.
 
Last December, the likeness of Singapore’s Prime Minister, Lee Hsien Loong, was used for fraud, and in November, Australian entrepreneur Dick Smith was likewise depicted in a deepfake video.
 
Both figures were shown being interviewed to promote an investment platform, leading to actual financial losses.
 
Unfortunately, there is no way to prevent financial fraud using deepfake content other than “not being deceived.”
 
Measures such as labeling deepfake content as such are under discussion, but such labels are useless if criminals simply remove them.
 
Many consider advance detection the most effective way to prevent crime using deepfake videos, but limitations remain, as detection technology has not kept up with the rapid development of generation technology.
 
“A high possibility exists of deepfake videos and voices being used for contactless financial transactions,” Park Ji-hong, a researcher at the Hana Institute of Finance, said to the JoongAng Ilbo. “We must put effort into securing detecting technology that can identify deepfake content beforehand.”
 
Some say policies regarding such new types of fraud need improvement, as it is almost impossible for victims to recover their losses from the offenders.
 
“It is worth discussing measures to recover damages for victims by raising funds from the confiscated assets of criminals,” Kim Gye-hwan, a lawyer at the law firm Gamwoo, said.

BY OH HYO-JEONG [kim.jiye@joongang.co.kr]