YouTube explains its content moderation policies


A YouTube logo at the YouTube Space LA in Playa Del Rey, Los Angeles, California, United States [REUTERS/YONHAP]

 
Google defended its content moderation policies, saying it only removes or restricts videos when it has a valid reason and that it offers creators an opportunity to appeal.
 
What private companies can and should do with posts considered offensive is a major issue globally, with claims of arbitrary and politically driven decisions being tested in courts of law and in the court of public opinion.
 
Twitter's fate hangs in the balance as users respond to the moderation policies of its new owner and as questions continue to be raised about its decision to block links to certain newspaper stories ahead of the 2020 U.S. elections.
 
To address how Google deals with the question, Matt Halprin, vice president and global head of trust and safety at YouTube, spoke with reporters in the Asia Pacific region on Tuesday.
 
“In the horrific series of events in Itaewon, we removed content and age-restricted thousands of pieces of content,” Halprin said, referring to the Oct. 29 crowd crush in Seoul's Itaewon district that killed more than 150 people.
 
YouTube has regularly updated its community guidelines since the service launched in 2005, especially as it grew rapidly through the 2010s. The service has over 2 billion monthly logged-in users globally and an estimated 41 million users in Korea, more than 80 percent of the country's population.
 
Tuesday's session was held to explain to reporters how YouTube makes its rules and under what circumstances it intervenes directly in user activity.
 
Chrysanthemums are laid out at a memorial for the Itaewon tragedy in central Seoul on Nov. 5. [NEWS1]

A citizen mourns the victims of the Itaewon tragedy at a memorial set up in Noksapyeong Station on subway line No. 6 in central Seoul on Nov. 6. [NEWS1]

 
One such example came during the Covid-19 pandemic, when people posted information on YouTube that Google deemed false and potentially harmful.
 
“During Covid-19 and the early days of the pandemic, we began to see claims on the platform that 5G cellular technology was linked to the spread of the virus,” he said. “And we actually saw cases across the United Kingdom where individuals were going out and physically damaging cell towers. So we quickly made these claims violative of our policies and removed them.”
 
Guidelines are formed around the four “R”s — remove harmful content, raise credible content, reduce content with harmful potential and reward trustworthy creators — to prevent egregious real-world harm.
 
Halprin said that for content to be removed, there must be a high probability of the material inflicting real physical harm in the offline world. Something being offensive does not automatically make it harmful, he added; it must have real-world consequences to be removed.
 
Millions of channels and videos are removed every month under the community guidelines, primarily by YouTube's automated flagging system.
 
Detailed reasons can be found online. The five major categories are: spam and deceptive practices; sensitive content; violent or dangerous content; regulated goods; and misinformation.
 
From July to September, 5.8 million channels were removed in total, 91.2 percent of them because they were considered spam, misleading or scams. Three percent were removed for nudity or sexual content and 2 percent over child safety.
 
During the same period, 5.6 million videos were removed, of which 94.6 percent were detected by YouTube's software and 5.3 percent were reported by users and screened by human staff. The rest were flagged by organizations or government agencies.
 
Any creator whose video or channel has been taken down can appeal. Of the 5.6 million videos removed, YouTube received 271,000 appeals and reinstated 29,000. Two-thirds of the removed videos were auto-detected within seconds and taken down before they had reached 10 views.
 
An image of inappropriate videos on YouTube [YONHAP]

 
Machines can detect violations quickly, but they cannot grasp the difference between the context within a piece of content and the cultural context of the real world, according to Jennifer Flannery O'Connor, vice president of product management at YouTube.
 
If someone uploads a video of an Adolf Hitler speech, that video can be taken down for hate speech, she said. But if the same footage is used in a documentary about the war, it is deemed acceptable, a distinction machines cannot easily make.
 
“The enforcement of our policies is a partnership between humans and machines,” she said.
 
“Most of the content that we remove from YouTube is firstly proactively detected by our machines. But we also have the ability for users to flag content to us that they think violates our guidelines. We use those user flags that we receive as one important signal to help us identify potentially violative content.”
 
In principle, the same rules apply everywhere in the world. YouTube can step in to block or age-restrict videos for viewers in certain regions in urgent circumstances like the Itaewon tragedy, but the company holds to consistency as its basic rule.
 
“It’s not going to be exactly perfect for everybody because the balance of freedom of expression versus what someone might call hate speech differs in different countries,” Halprin said. “But we feel that one global policy is a good way to ensure consistency and a way to ensure that YouTube’s balance of freedom of expression applies globally, as opposed to differing by each country.”

BY YOON SO-YEON [yoon.soyeon@joongang.co.kr]