While an emergency application to block the Texas law is pending before the Supreme Court (we filed a brief urging the court to put it back on hold), we are relieved by the 11th Circuit’s ruling in NetChoice v. Florida. The court recognized two crucial First Amendment principles flouted by Florida’s law: that platforms are private actors making editorial decisions, and that those decisions are inherently expressive. When platforms remove or deprioritize posts, they are engaging in First Amendment-protected speech, the court said.
Tornillo Shows the Way
Florida S.B. 7072, signed into law by Gov. Ron DeSantis a year ago, prohibits large online intermediaries from terminating politicians’ accounts or taking steps to deprioritize posts by or about them or posts by “journalistic enterprises” (defined to include entities that have sufficient publication and viewership numbers). The law would override the sites’ own content policies. Florida passed the law to retaliate against platforms for supposedly censoring conservative voices. Notably, the 11th Circuit observed that the perceived bias in platforms’ content-moderation decisions “is compelling evidence that those decisions are indeed expressive,” and thus First Amendment-protected conduct.
In ruling that the law likely violates the First Amendment, the 11th Circuit pointed to the Supreme Court’s unanimous 1974 ruling in Miami Herald v. Tornillo, which established that the editorial judgments made by private entities about whether and how to disseminate speech are protected under the Constitution. In Tornillo, the court struck down a Florida law requiring newspapers to print candidates’ replies to editorials criticizing them. Subsequent Supreme Court rulings, protecting decisions by parade organizers and cable operators about what third-party-created content they disseminate, further underpinned this free speech principle, the 11th Circuit said.
EFF and Protect Democracy filed an amicus brief with the 11th Circuit arguing that internet users are best served when the First Amendment protects platforms’ rights to curate speech as they see fit, free of government mandates. That right allows for a diverse array of forums for users, with unique editorial views and community norms. The court agreed, recognizing that “by engaging in this content moderation, the platforms develop particular market niches, foster different sorts of online communities, and promote various values and viewpoints.”
We were pleased to see the three-judge panel use examples we cited in our brief to demonstrate the variety of communities platforms seek to appeal to through moderating content: from Facebook, which removes or adds warnings to posts it considers hate speech, and Roblox, which prohibits bullying and sexual content, to Vegan Forum, which allows non-vegans but doesn’t tolerate “members who promote contrary agendas,” and ProAmericaOnly, which promised users “NO BS/NO LIBERALS.”
Platforms Aren’t Mere Hosts or Dumb Pipes
Florida argued that S.B. 7072 doesn’t violate the First Amendment because platforms don’t review most posts before publication and therefore aren’t making expressive decisions as to user content. But the law doesn’t target speech that isn’t reviewed—it is specifically aimed at speech that is removed or deprioritized, the court noted.
The panel also knocked down Florida’s argument that the law doesn’t implicate free speech rights because it only requires platforms to host speech, not to agree with it. The court said that unlike the private entities—shopping centers and law schools—in the cases cited by the state, social media platforms have expression as their core function, a function that S.B. 7072 burdens.
Finally, the court rejected Florida’s argument that large social media services are common carriers—entities that, in the communications context, provide facilities so anyone and everyone can communicate messages of their own design and choosing. While platforms sometimes say they are open to anyone, in practice they have always required users to accept their terms of service and community standards. So, in reality, social media users are not free to speak on the platform in any way they choose—they can’t post comments that violate the platform rules, the court noted.
The 11th Circuit also cited Supreme Court precedent in Reno v. ACLU, where the high court said internet forums have never been subject to the same regulation and supervision as the broadcast industry. Further, Congress excluded computer services like social media companies from the definition of common carrier in the Telecommunications Act of 1996.
Florida can’t just decide to make social media platforms into common carriers either, the 11th Circuit said, declaring “neither law nor logic recognizes government authority to strip an entity of its First Amendment right merely by labeling it a common carrier.”
Borrowing language from our brief, the court said social media platforms have historically exercised editorial judgment by moderating content, and a state can’t force them to be common carriers without showing there’s a compelling reason to strip them of First Amendment protections. This is important because it recognizes that platforms have curated content since day one.
The court let stand, at least for now, S.B. 7072 provisions requiring platforms to inform users before changing moderation rules, give users who request it the number of people who have viewed their posts, and give deplatformed users an opportunity to retrieve their data. EFF had argued, and the lower court had ruled, that while some of these transparency requirements may be acceptable in another context, they were impermissible here as part of S.B. 7072’s overall unconstitutional retaliation.
The 11th Circuit has drawn important lines in the sand. But it won’t be the last court to weigh in. With lawmakers in Georgia, Ohio, Tennessee, and Michigan considering similar bills, it’s likely more courts will be called on to decide whether platforms have a First Amendment right to moderate content on their sites. We hope they follow the 11th Circuit’s lead.
The Electronic Frontier Foundation is the leading nonprofit organization defending civil liberties in the digital world. Founded in 1990, EFF champions user privacy, free expression, and innovation through impact litigation, policy analysis, grassroots activism, and technology development. We work to ensure that rights and freedoms are enhanced and protected as our use of technology grows. Visit https://www.eff.org