
Who should moderate the internet? 

by Ryan Cohn | Feb 26, 2021 | Public Relations

Should any single entity be responsible for moderating the speech of billions of people worldwide—or for policing the internet?

In January, three of America’s most prominent tech companies—Amazon, Apple, and Google—shut down Parler over its apparent lack of content moderation, which allowed the proliferation of conspiracy theories that likely inspired many rioters who stormed the U.S. Capitol on January 6.

Amazon said Parler was “unable or unwilling to promptly identify and remove this content.” Google removed Parler from the Google Play Store because it didn’t “implement robust moderation for egregious content.”

With this decision, these Big Tech companies introduced several questions about the role of content moderation in the social media ecosystem, where billions of users share a mind-boggling volume of content each day.


For starters:

Has Apple reviewed the content moderation plans of every social network in its App Store? Does Google set a specific threshold for when a social network is too large to continue without a sizable content moderation team?

Over the years, I’ve heard many people say, “If you don’t like Facebook, you should go create something better.”

Our digital marketplace allowed that to happen … until now.

So what upfront investment in content moderation will we now expect of someone who wants to build a new social network?

Facebook has spent billions on content moderation, including developing best-in-class tools and employing over 30,000 people to review user content. Granted, Facebook has almost 3 billion monthly active users, so the total amount of content it must moderate is far more substantial than at other social networks.

But if another network adds millions of members in just a few days (e.g., Parler), is it expected to maintain a content moderation process that scales at a similar pace as user growth?

And how do you moderate messages sent through encrypted messaging apps such as Signal, Telegram, or Facebook-owned WhatsApp? What about live audio apps like Clubhouse?

If content moderation becomes a major barrier to entry for social networking, the behemoths that can invest billions in it stand to benefit from a newfound lack of competition.

And ironically, how will that factor into antitrust claims against Facebook?

If lawmakers mandate that Facebook spin off its acquired entities, including Instagram and WhatsApp, as separate businesses, would that force Instagram to spend billions building its own moderation system? As a Facebook subsidiary, Instagram currently benefits from the parent company’s robust moderation infrastructure.

A spun-off Instagram would have to either spend billions on moderation or risk allowing dangerous content on its platform, a downside the U.S. federal government must weigh as it moves forward with its antitrust case against Facebook. Would that risk be enough to counter the perceived dangers of a potential Facebook monopoly?

There are no easy answers to these challenging questions. But we need to start addressing them now—and fast—because the future of truly free speech is at stake.

Ryan Cohn
Ryan Cohn is executive vice president at Sachs Media. He advises Fortune 500 companies, trade associations, and global thought leaders on communications and digital media advocacy.
