Facebook’s Pitch to Congress: Section 230 for Me, But not for Thee

As Mark Zuckerberg tries to sell Congress on Facebook’s preferred method of amending the federal law that serves as a key pillar of the internet, lawmakers must see it for what it really is: a self-serving and cynical effort to cement the company’s dominance.

In prepared testimony submitted to the U.S. House of Representatives Energy and Commerce Committee before a Thursday hearing, Zuckerberg proposes amending 47 U.S.C. § 230 (“Section 230”), the federal law that generally protects online services and users from liability for hosting user-generated content that others believe is unlawful.

The vague and ill-defined proposal calls for lawmakers to condition Section 230’s legal protections on whether services can show “that they have systems in place for identifying unlawful content and removing it.” According to Zuckerberg, this revised law would not create liability if a particular piece of unlawful content fell through the cracks. Instead, the law would impose a duty of care on platforms to have adequate “systems in place” with respect to how they review, moderate, and remove user-generated content.

Zuckerberg’s proposal calls for the creation of a “third party,” whatever that means, which would establish the best practices for identifying and removing user-generated content. He suggests that this entity could create different standards for smaller platforms. The proposal also asks Congress to require that online services be more transparent about their content moderation policies and more accountable to their users.

An Anti-Competitive Wedge

The proposal is an explicit plea to create a legal regime that only Facebook, and perhaps a few other dominant online services, could meet. Zuckerberg is asking Congress to change the law to ensure that Facebook never faces significant competition, and that its billions of users remain locked into its service for the foreseeable future.

It’s galling that even as Zuckerberg praises Section 230 for creating “the conditions for the Internet to thrive, for platforms to empower billions of people to express themselves online,” he calls on Congress to change the law to prevent any innovation or competition that could disrupt Facebook’s market position. Zuckerberg is admitting that now that Facebook has benefited from Section 230, he doesn’t want any competitor to do the same. Rather than take up Facebook’s proposal, Congress should instead advance meaningful competition and antitrust reforms to curtail the platform’s dominance.

Moreover, Zuckerberg’s proposal comes just before a congressional hearing that is ostensibly about the problems Facebook has created. These problems exist precisely because of Facebook’s dominance, anti-competitive behavior, and terrible privacy and content moderation practices. So in response to Facebook’s significant failures, Zuckerberg is telling Congress that Facebook is the solution. Congress should respond: Absolutely not.

A Flawed Proposal

On the merits, Zuckerberg’s proposal — though light on specifics — is problematic for several reasons.

First, the proposal overlooks that the vast majority of online services that host user-generated content lack the technical, legal, or human resources to build systems that could identify and remove unlawful content. As Mike Masnick at TechDirt recently wrote, the internet is made up of far more diverse and less-resourced services than Facebook, and Congress must recognize that the legal rules it sets for online services will apply to all of them. Zuckerberg proposes that the required “adequate systems” be “proportionate to platform size,” but size is only one factor that might correlate with an intermediary’s ability to implement such systems. And by punishing growth, a size-scaled regime would discourage the development of nonprofit intermediary models that might compete with and replace those that profit greatly off of their users’ data. What would actually be necessary is an assessment of whether each individual intermediary, based on its many characteristics, has provided adequate systems. That is essentially a legal negligence standard – asking “Has the intermediary acted reasonably?” – and such standards have long been found insufficiently protective of freedom of speech.

Second, Zuckerberg’s proposal seems to require affirmative pre-screening and filtering of content as part of an “adequate system.” As we have written, filtering requirements are inherently privacy-invasive and almost always rely on faulty, nontransparent, and unaccountable automation. They are also extremely burdensome, even at a small scale.

Third, the standards under Zuckerberg’s proposal would be unworkable in practice and would result in even greater online censorship. Content moderation at scale is impossible to do perfectly and nearly impossible to do well. Automated tools and human reviewers make scores of mistakes that result in the improper removal of users’ content. If services are required by law to have systems that remove users’ content, the result will be a world in which far more user speech is taken down, because services would rather censor their users than risk losing their legal protections.

Fourth, the proposal would not even address the problems Facebook is now being called out for. Zuckerberg calls for Section 230 protections to be conditioned on having systems in place to remove “unlawful content,” but most of the examples he addresses elsewhere in his testimony are not illegal. Hateful and violent content, misinformation, and content that violates community standards for groups are largely protected speech. Platforms like Facebook may, and should want to, actively moderate such content. But that speech is usually not unlawful; unlawful speech is a narrow category of speech unprotected by the First Amendment.

Fifth, Zuckerberg calls for a “third party” to define the “adequate systems” an intermediary must adopt. We saw a similar proposal recently with the original version of the EARN IT Act. We opposed a standards-setting body there because it was going to be dominated by law enforcement officials who desire to break end-to-end encryption. Although Zuckerberg does not identify the membership or composition of his proposed third party, we worry that any entity created to address online content moderation could similarly be captured by special interests who do not represent internet users.

Transparency, Yes Please

We appreciate that Zuckerberg is calling on online services to be more transparent and responsive to user concerns about content moderation. EFF has been actively involved in an effort to push these services to adopt a human rights framing for content moderation that includes adequate notice to users and transparency about platforms’ practices. Yet we do not believe that any requirement to adopt these practices should be linked to Section 230’s protections. That’s why we’ve previously opposed legislation like the PACT Act, an initial version of which compelled transparency reporting. It’s also worth noting that Facebook lags behind its peers on transparency and accountability for censoring its users’ speech, as a 2019 EFF review found.

Zuckerberg’s proposal to rewrite Section 230 joins a long list of efforts to overhaul the law. As we have said, we analyze every fully formed proposal on its merits. Some of the proposed changes start from a place of good faith in trying to address legitimate harms that occur online. But Zuckerberg’s proposal isn’t made in good faith. Congress should reject it and move on to doing the real, detailed work that it has to do before it can change Section 230.

