The EU’s Copyright Directive Is Still About Filters, But EU’s Top Court Limits Its Use

The Court of Justice of the European Union has issued a long-awaited judgment on the compatibility of the EU Copyright Directive’s filtering requirements with the Charter of Fundamental Rights of the European Union. The ruling recognizes the tension between copyright filters and the right to freedom of expression, but falls short of banning upload filters altogether.

Under Article 17 of the EU’s controversial Copyright Directive, large tech companies must ensure that infringing content is not available on their platforms or they could be held liable for it. Given that legal risk, platforms will inevitably rely on error-prone upload filters that undermine lawful online speech – as Poland pointed out in the legal challenge that led to the judgment.

No Alternatives to Filtering Tools, But Strong User Safeguards

The Court acknowledged that Article 17’s obligation to review content amounts to a de facto requirement to use automatic recognition and filtering tools, and held that such mechanisms do constitute an interference with users’ freedom of expression rights. However, in line with last year’s opinion from the Court of Justice’s Advocate General, the judges concluded that the safeguards provided by Article 17 were adequate. Because those safeguards include an obligation to ensure the availability of lawful uploads, an automated system that cannot “distinguish adequately between unlawful content and lawful content” won’t pass muster under EU law.

The Court also highlighted the responsibility of rightsholders to provide platforms with “undoubtedly relevant and necessary information” about an unlawful use of copyrighted material. Platform providers cannot be forced to “generally monitor” user content to check its legality; that also means they cannot be required to conduct an “independent assessment” of the content. If a platform ends up removing lawful content, users can invoke the Directive’s “complaint and redress” mechanisms.

To Block or Not to Block

The court’s focus on interpreting exceptions and limitations to copyright in a way that preserves fundamental rights is laudable and follows EFF’s own suggestions. Following the court’s criteria, platforms can argue that they are only required to use upload filters in obvious cases. That, in turn, could require several EU Member States to rework their implementations of the EU Copyright Directive (which ignore the fundamental rights perspective). The ruling means that national governments must pay much closer attention to user rights.

However, the Court failed to set out parameters to help platforms decide when and when not to block content. Worse, it side-stepped the core issue: whether automated tools can ever be reasonably implemented. It’s hard to see how the measures implied by this ruling can actually ensure that speech-intrusive measures are “strictly targeted.” In the ruling, the Court explained the limits of content monitoring by referring to the Glawischnig-Piesczek v Facebook case, a speech-intrusive ruling involving the removal of defamatory content. But that reference doesn’t tell us much: the Court in Glawischnig-Piesczek ignored the state of the art and real-world operations of “automated search tools and technologies” and underestimated how easily platforms’ screening efforts could become excessive and undermine users’ fundamental rights.

