How to fix social media


David Kaye
*The author is a professor at the UC Irvine School of Law and the UN Special Rapporteur on the right to freedom of expression.

The Cambridge Analytica/Facebook scandal may have changed the way millions of people perceive the risks to privacy when they go online. But it could have obscured an equally profound debate: widespread resistance to internet companies’ role as the global speech police of the digital age. The future of free speech depends on getting this debate right.

Social media, messaging and search platforms offer real value. They provide connections, information and security for people who might not otherwise have them, such as sexual minorities in traditional societies, reporters in authoritarian environments or dissenters in repressive regimes. They should be celebrated for this kind of openness.

And yet the most influential corporations in this sphere wield extraordinary power from a distance. They develop rules, standards and guidelines, often in Silicon Valley, to determine for people around the world the appropriate boundaries of expression. In many places, U.S. companies provide the dominant source of news and information, having an enormous impact on public life.

Much as they may try, they are often out of touch with local and national concerns in the places where they operate. Consider, for example, a dystopia in which no regional or local media outlets exist in the United States — with one exception, let’s say The New York Times. Many Americans read the Times for national and global news, but its coverage of state and municipal issues cannot convey the depth or consistency accomplished by local media. This is how communities around the world often perceive the dominance of American internet companies.

In places like Myanmar, where activists note that “Facebook is the internet,” the companies lack the linguistic and cultural expertise to distinguish a racist word from an ordinary one. As a result, they may over-regulate, censoring the good with the bad. Their automated systems, the artificial intelligence (AI) designed to flag inappropriate content and display the “best” results and the information the company thinks you really want to see, are largely invisible to users.

Even democratic societies resent this power over their public space. As one liberal European politician put it, “No one wants a Ministry of Truth, but I am also not reassured when Silicon Valley or Mark Zuckerberg are the de facto designers of our realities or of our truths.”

Regulation is coming, but in the hands of governments, this often means pressure for company censorship. The European Union and some of its member states are moving to constrain the companies’ role as speech regulators, developing new rules on issues ranging from hate speech and extremism to “fake news,” vague concepts subject to abuse in the hands of governments or corporations.

The current situation is untenable for countless societies and individuals worldwide. As governments propose new laws, companies will over-compensate and limit the space for debate, art, politics and other kinds of expression. The companies will hire more people with language, political and cultural expertise in order to moderate content globally, but the sites will still be platforms that are run by well-meaning people who are nonetheless detached from the lived experiences of those whom they are regulating.

What should the companies do? In my role as the UN Special Rapporteur on the right to freedom of expression, I recently completed the first report for the UN Human Rights Council on how governments and social media companies can enable free speech in an era of disinformation and online extremism.

Among my main proposals are the following:

First, internet companies need to involve local communities in governing their platforms. The corporations as they are currently configured cannot rule public space everywhere. They must find ways to devolve authority to local actors — not to governments, but to their users. Hiring teams of experts alone simply doesn’t cut it. Steps like diversifying leadership, enabling greater local content moderation not outsourced to contractors and engaging deeply with the communities where they operate are essential.

If the companies cannot make these kinds of changes, or if they prove superficial, they may need to explore how they might be able to spin off national versions of their platforms.

Second, the companies must disclose radically more information about the nature of their rule making and enforcement concerning expression on their platforms. Greater disclosure means individual empowerment, giving people an opportunity to provide genuine critiques of how those rules apply and how the companies get it wrong, in specific countries.

It’s true that many of the companies, understanding their impact on public space around the world, already have been expanding what they reveal about government and individual requests for the takedown of content, but they can do more. They should develop a kind of public social media case law, such that users and activists can challenge or accept the actual decisions the companies are making. Further, in public regulation of speech, we expect some form of accountability when authorities get it wrong; we should see that happening online, too.

Finally, the companies make claims to global roles, so they should adopt global standards — not the First Amendment, and not terms of service allowing them complete discretion. They should apply human rights law, which provides global standards protecting everyone’s right to “seek, receive and impart information and ideas of all kinds, regardless of frontiers.” A total of 171 countries are parties to the International Covenant on Civil and Political Rights (ICCPR), including the United States. (The Trump administration’s Tuesday announcement that it has withdrawn from the UN Human Rights Council does not affect Washington’s commitment to the ICCPR.)

Those rules would provide better grounding for company operations and allow real capacity to push back against governments seeking to interfere with freedom of expression. A third-party auditor of some sort — something like a social media council — could regularly evaluate company compliance with human rights norms. Indeed, this is how democratic societies in many parts of the world enable media self-regulation, through national “press councils” that evaluate claims of unethical or wrongful media behavior.

Opaque forces, both corporate and governmental, are shaping the ability of individuals across the globe to exercise their freedom of expression. With looming governmental intervention, the companies need to change in order to meet the threats that they pose in this digital age.