Bad Content Moderation Is Bad, And Government Interference Can Make It Even Worse

This week, the House Energy and Commerce Committee held a hearing titled “Preserving Free Speech and Reining in Big Tech Censorship.” Lawmakers at the hearing trotted out the usual misunderstandings of these concepts, and placed the blame on Section 230, the law that actually promotes free speech online.

However, buried in these misunderstandings from Congress and from most of the witnesses called to testify was a genuinely serious problem: government officials keep asking online services to remove or edit users’ speech, raising the specter of unconstitutional interference with private platforms’ moderation decisions. Worse still, the public has no transparency into how often this occurs.

Regardless of ideological preference, we should always worry about government coercion that results in censoring users’ speech online: it violates the First Amendment and threatens human rights globally. The government is free to try to persuade online services to remove speech it believes is harmful, but the choice to remove users’ speech should always remain with the platform.

So Congress is right to investigate the relationship between platforms and the government, and both should be more transparent about official requests to remove users’ content.

The First Amendment And Section 230 Enable Content Moderation

One witness at the hearing, Dr. Jay Bhattacharya, discovered that, at the request of government health officials, Twitter had moderated much of what he posted about COVID-19 mitigation strategies to reduce its visibility. The decisions made about his account were not explained to him at the time; they were disclosed only after Twitter changed ownership. This lack of transparency is a problem, and it’s why, for years, groups such as EFF have pressed Big Tech companies to be clearer with users about their decision-making processes, especially when those decisions begin with a government request.

The claim often made by some Members of Congress, including Committee Chair Cathy McMorris Rodgers (R-WA), is that Section 230 lies at the heart of the censorship. This claim misunderstands the issue. Platforms are private entities and, as we’ve explained many times before, private actors, even Big Tech, have a First Amendment right to take down speech they do not wish to carry. As Spencer Overton told the lawmakers at the hearing, the reality is that the First Amendment gives private companies the “right to exclude content as they see fit.”

What Section 230 does is protect those companies, as well as much smaller companies and users, from other types of civil liability for their decisions to distribute third-party content and to moderate that content.

That’s not to say we shouldn’t worry about the power of Big Tech to shape public discourse. At EFF, we have suggested a number of constitutionally permissible ways to confront that power, such as revising competition and antitrust law and updating our privacy laws. But so long as Congress remains focused on things beyond its power, more time will unfortunately be wasted on these unproductive efforts.

Government Interference In Content Moderation Is A Real Danger

We don’t need to debate every detail to recognize that government involvement in content moderation is deeply concerning whenever it happens, especially when it happens behind the scenes. In general, private actors should be left alone to decide how to manage their online communities. Government officials can make recommendations, but they should do so openly and in public; working behind the scenes to shape what content stays up and what comes down is problematic. A productive discussion in Congress would focus on restraining the government’s shadowy moderation activities and making them transparent, even when officials’ intentions are good.

