Facebook’s moral sand castles


This is going too far. On April 24, a man in Phuket, Thailand, hanged his 11-month-old daughter and broadcast the killing on Facebook Live. Facebook Live is a live broadcasting service that Facebook launched in April 2016; anyone with a smartphone can stream video in real time.

The man took his own life after killing his daughter. The murder-suicide illustrated the darkest side of social media. What’s more frightening is that copycat crimes may follow.

Experts say the murder in Thailand was likely influenced by a murder in Ohio about a week earlier. On April 16, a 37-year-old man posted a video on Facebook announcing his plan to kill, then shot a 74-year-old man, a stranger he happened upon. He recorded the killing and posted the footage on Facebook, where it circulated online for about two hours.

It took an hour and a half for the video to be reported to Facebook, and another 30 minutes for Facebook to close the account. The footage is said to have been viewed more than 1.6 million times. The suspect later killed himself.

The two cases are too similar and happened too close together to be a coincidence. That is why authorities are concerned about copycat crimes. Both cases ended in suicide. Korea, which has the highest suicide rate among OECD members, has particular reason to worry: the world appears closer on social media than it does in reality, and that may make its influence all the more powerful.

Snuff films and recordings of murder and rape have existed before, but now ordinary social media users may stumble onto heinous footage on Facebook by accident while looking for travel photos, restaurant recommendations or news about their friends.

Hate speech, fake news and murder videos have triggered a debate over restricting social media content in the IT industries of the United States and Europe. But there is no effective way to do it. Prior censorship would infringe on freedom of speech, and after-the-fact measures come too late.

Either approach would incur considerable costs, and there is no clear technical alternative. At present, AI cannot reliably distinguish pornography from breastfeeding.

Social media companies should be held more accountable. Screening out violent footage is far easier than building an interface that types what the brain is thinking or developing flying cars.

One can guess why they are not acting more eagerly. Under the Communications Decency Act of 1996, internet service providers are not legally liable for content posted by users. But that legal shield no longer fits a reality in which social media effectively functions as a media outlet. If Facebook doesn't act quickly, users may act first.

JoongAng Ilbo, April 27, Page 34

*The author is a deputy business news editor of the JoongAng Ilbo.

PARK HYUN-YOUNG