Facebook’s Social Media Council Leaves Key Questions Unanswered

Facebook took a big step forward this week in its march to create an “oversight board” to help vet its more controversial takedown decisions, publishing more details about how it will work. Both Facebook and its users will be able to refer cases to the Board to request its review. Is this big step a big deal for online speech?

Maybe not, but it’s worth paying attention to. A handful of tech companies govern a vast amount of speech online, including the platforms we use to get our news, form social bonds, and share our perspectives. That governance means, in practice, making choices about what users can say, and to whom. Too often—on their own or under pressure—the speech police make bad choices, frequently at the expense of people who already struggle to make their voices heard and who are underrepresented in the leadership of these companies.

EFF has proposed a few ways to improve the way speech is governed online, ever-vigilant to the fact that your freedoms can be threatened by governments, corporations, or by other private actors like online mobs. We must ensure that any proposed solution to one of those threats does not make the others even worse.

We have six areas of concern when it comes to this kind of social media council, which we laid out earlier this year in response to a broader proposal spearheaded largely by our friends at Article 19. How does Facebook’s version stack up?

  • Independence: A subgroup of board members will be initially selected by Facebook, and they will then work with Facebook to recruit the rest (the goal is to ultimately have 40 members on the Board). Thus, Facebook will have a strong influence on the makeup of the first Board. In addition, the Board is funded through a “Trust,” appointed and paid for by Facebook. This structure provides a layer of formal independence, but as a practical matter Facebook could maintain a great deal of control through its power to appoint trustees.
  • Roles: Some have argued that an oversight board should be able to shape a platform’s community standards. We’ve worried that such a board might have no more legitimacy to govern speech than the company itself, so it should not be given the power to dictate new rules under the guise of independence. We think an advisory role is more appropriate, particularly given that the Board is supposed to adhere to international human rights principles.
  • Subject matter: The Oversight Board is to interpret Facebook’s policies, which will hopefully improve consistency and transparency, and may suggest improvements to the rules governing speech on Facebook. We hope the Board presses Facebook to improve its policies, just as a wide range of advocates do (and should continue to do).
  • Jurisdiction: One of the problems with corporate speech controls is that rules and expectations can vary by region. The Facebook proposal suggests that a panel for a given case will include someone from the relevant “region,” but it is unclear how a group of eleven to forty Board members can adequately represent the diverse viewpoints of Facebook’s global userbase.
  • Personnel: As noted, the composition of the Board remains an unknown. Facebook has said it will strive for a broad diversity of geographic, gender, political, social, and religious representation and perspectives.
  • Transparency: It will certainly be a step forward if the Board’s public opinions give us more insight into the rules that are supposed to govern speech at Facebook (the actual, detailed rules used internally, not the general rules made public on Facebook.com). We would like to see more information about what kinds of cases are being heard and how many requests for review the Board receives. New America has a good set of specific transparency suggestions.

In short, Facebook’s proposal could improve the status quo. The transparency of the Board’s decisions means that we will likely know more than ever about how Facebook is making decisions. It remains to be seen, though, whether Facebook can apply its rules consistently going forward, how independent the Oversight Board can really be, who will ultimately make up the Board, and whether it will take the necessary steps to understand local and subcultural norms. We and other advocates will continue to press Facebook to improve the transparency and consistency of its procedures for policing speech on its platform, as well as the substance of its rules. We hope the Oversight Board will be a mechanism to support those reforms and push Facebook towards better respect for human rights.

What it won’t do, however, is fix the real underlying problem: Content moderation is extremely difficult to get right, and at the scale at which Facebook is operating, it may be impossible for one set of rules to properly govern the many communities that rely on the platform. As with any system of censorship, mistakes are inevitable. And although the ability to appeal is an important measure of harm reduction, it’s not an adequate substitute for having fair policies in place and adhering to them in the first place.
