Even if you think that online intermediaries should be more proactive in detecting, deprioritizing, or removing certain user speech, requiring intermediaries to review all content before publication (often called “general monitoring” or “upload filtering”) raises serious human rights concerns, both for freedom of expression and for privacy.
General monitoring is problematic both when it is directly required by law and when, though not formally required, it is effectively mandatory because the legal risks of not doing it are so great. These indirect requirements incentivize platforms to proactively monitor user behavior, filter and check user content, and remove or locally filter anything that is controversial, objectionable, or potentially illegal in order to avoid legal responsibility. This inevitably leads to the over-censorship of online content as platforms seek to avoid liability for failing to act “reasonably” or for failing to remove user content they “should have known” was harmful.
Whether directly mandated or strongly incentivized, general monitoring is bad for human rights and for users.
- Because the scale of online content is so vast, general monitoring typically relies on automated decision-making tools, which reflect the biases of the data they were trained on and can lead to harmful profiling.
- These automated upload filters are error-prone and notoriously inaccurate, and they tend to overblock legally protected expression (see the sketch after this list).
- Upload filters also contravene the foundational human rights principles of proportionality and necessity by subjecting users to automated and often arbitrary decision-making.
- The active observation of all files uploaded by users has a chilling effect on freedom of speech and access to information by limiting the content users can post and engage with online.
- A platform reviewing every user post also undermines users’ privacy rights by providing companies, and thus potentially government agencies, with abundant data about users. This is particularly threatening to anonymous speakers.
- Pre-screening can lead to enforcement overreach, fishing expeditions (speculative trawling through user data for evidence of wrongdoing), and excessive data retention.
- General monitoring undermines the freedom to conduct a business and adds significant compliance costs.
- Monitoring technologies are even less effective at small platforms, which lack the resources to develop sophisticated filtering tools. General monitoring thus cements the gatekeeper role of a few powerful platforms and further marginalizes alternative platform governance models.
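To illustrate the overblocking problem flagged above, here is a minimal, purely hypothetical sketch in Python. Every name and term in it is invented for illustration and drawn from no real filtering system; production filters are far more sophisticated, but at the scale general monitoring demands, context-blind matching of this kind is what automated review tends to reduce to.

```python
# A deliberately naive keyword filter. BLOCKED_TERMS, should_block, and the
# sample posts are all hypothetical, used only to illustrate overblocking.
BLOCKED_TERMS = {"terrorism", "extremist"}

def should_block(post: str) -> bool:
    """Flag a post if any word matches the blocklist, ignoring
    context, intent, and newsworthiness entirely."""
    words = {word.strip(".,!?\"'()").lower() for word in post.split()}
    return bool(words & BLOCKED_TERMS)

# Protected expression is swept up alongside the targeted content:
print(should_block("Survivors describe the terrorism attack"))         # True
print(should_block("New research maps extremist recruitment online"))  # True
print(should_block("Photos from our community garden"))                # False
```

Because the filter matches terms rather than meaning, it flags news reporting and academic research just as readily as the content it was built to catch, and it gives the affected user no notice, explanation, or avenue of appeal.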
We have previously expressed concern about governments employing more aggressive and heavy-handed approaches to intermediary regulation, with policymakers across the globe calling on platforms to remove content that is legal but deemed ‘undesirable’ or ‘harmful’ from their sites, while also expecting platforms to detect and remove illegal content. In doing so, states fail to protect fundamental freedom of expression rights and fall short of their obligations to ensure a free online environment with no undue restrictions on legal content, whilst also restricting the rights of users to share and receive impartial and unfiltered information. This has a chilling effect on the individual right to free speech: users change their behavior and abstain from communicating freely when they know they are being actively observed, which fosters a pernicious culture of self-censorship.
In one of the more recent policy developments on intermediary liability, the European Union approved the Digital Services Act (DSA). The DSA rejects takedown deadlines that would have suppressed legal, valuable, and benign speech, and EFF helped to ensure that the final language steered clear of intrusive filter obligations. By contrast, the draft UK Online Safety Bill raises serious concerns about freedom of expression by imposing a duty of care on online platforms to tackle illegal and otherwise harmful content and to minimize the presence of certain content types. If this bill becomes law, intrusive scanning of user content will be unavoidable.
So how do we protect users’ rights to privacy and free speech whilst also ensuring that illegal content can be detected and removed? EFF and other NGOs have developed the Manila Principles, which emphasize that intermediaries shouldn’t be held liable for user speech unless the content in question has been fully adjudicated as illegal and a court has validly ordered its removal. It should be up to independent and impartial judicial authorities to determine that the material at issue is unlawful. Elevating courts to adjudicate content removal means liability is no longer based on the inaccurate and heavy-handed decisions of platforms. It would also ensure that takedown orders are limited to the specific piece of content that a court or similar authority has found illegal.
EFF has also previously urged regulators to ensure that online intermediaries continue to benefit from exemptions from liability for third-party content, and that any additional obligations do not curtail free expression or innovation. Rules restricting content must be provided by law; be precise, clear, and accessible; and follow due process, respecting the principle that independent judicial authorities should assess content and decide on its restriction. Crucially, intermediaries should not be held liable if they choose not to remove content based on a mere notification by users.
Rather than mandating general monitoring, regulators should encourage more effective voluntary action against harmful content and the adoption of moderation frameworks consistent with human rights, keeping the internet free and limiting the power of government agencies to flag and demand the removal of potentially illegal content.
The Electronic Frontier Foundation is the leading nonprofit organization defending civil liberties in the digital world. Founded in 1990, EFF champions user privacy, free expression, and innovation through impact litigation, policy analysis, grassroots activism, and technology development. We work to ensure that rights and freedoms are enhanced and protected as our use of technology grows. Visit https://www.eff.org