During the Article 17 (formerly Article 13) discussions about the availability of copyright-protected works online, we fought hand-in-hand with European civil society to prevent all communications from being subjected to interception and arbitrary censorship by automated upload filters. However, by turning tech companies and online service operators into copyright police, the final version of the EU Copyright Directive failed to live up to the expectations of millions of affected users who fought for an Internet in which their speech is not automatically scanned, filtered, weighed, and measured.
Our Watch Has Not Ended
EU “Directives” are not automatically applicable. EU member states must “transpose” the directives into national law. The Copyright Directive includes some safeguards to prevent the restriction of fundamental free expression rights, ultimately requiring national governments to balance the rights of users and copyright holders alike. At the EU level, the Commission has launched a Stakeholder Dialogue to support the drafting of guidelines for the application of Article 17, which must be implemented in national laws by June 7, 2021. EFF and other digital rights organizations have a seat at the table, alongside rightsholders from the music and film industries and representatives of big tech companies like Google and Facebook.
During the stakeholder meetings, we made a strong case for preserving users' right to free speech and suggested ways to avert a race among service providers to over-block user content. We also asked the EU Commission to share the draft guidelines with rights organizations and the public, and to allow both to comment and suggest improvements, so that the guidelines comply with European Union civil and human rights requirements.
The Targeted Consultation: Don’t Experiment With User Rights
The Commission has partly complied with the request for transparency and participation made by EFF and its partners: it launched a targeted consultation addressed to members of the EU Stakeholder Group on Article 17. Our response aims to mitigate the dangerous consequences of the Article 17 experiment by centering user rights, specifically free speech, and by limiting the use of automated filtering, which is notoriously inaccurate.
Our main recommendations are:
- Produce a non-exhaustive list of service providers that are excluded from the obligations under the Directive. Because the list is non-exhaustive, providers not on it do not automatically fall under the Directive's rules and would have to be evaluated on a case-by-case basis;
- Ensure that platforms’ obligation to show “best efforts” to obtain rightsholders’ authorization and to keep infringing content unavailable is a mere due-diligence duty, to be interpreted in light of the principle of proportionality and of user rights exceptions;
- Recommend that Member States not mandate the use of technology or impose any specific technological solutions on service providers in order to demonstrate “best efforts”;
- Establish a requirement to avoid general monitoring of user content. Spell out that the implementation of Article 17 should never lead to the adoption of upload filters and hence to general monitoring of user content;
- State that the mere fact that some companies use content recognition technology does not mean that it must be used to comply with Article 17. Quite the opposite is true: automated technologies that detect and remove content based on rightsholders’ information may not be in line with the balance sought by Article 17;
- Safeguard the diversity of platforms and avoid placing a disproportionate burden on smaller companies, which play an important role in the EU tech ecosystem;
- Establish that content recognition technology cannot assess whether uploaded content is infringing or covered by a legitimate use. Filter technology may serve as an assistant, but it can never replace a (legal) review by a qualified human;
- Likewise, filter technology cannot assess whether user content is even likely to infringe copyright;
- If you believe that filters work, prove it. The Guidance should recommend creating and maintaining test suites if Member States decide to establish copyright filters. These suites should evaluate a filter’s ability to correctly identify both infringing materials and non-infringing uses, and filters should not be approved for use unless they meet this challenge (a minimal sketch of such an evaluation harness follows this list);
- Complaint and redress procedures are not enough. Fundamental rights must be protected from the start and not only after content has been taken down;
- The Guidance should address the very problematic relationship between the use of automated filter technologies and privacy rights, in particular the right not to be subject to a decision based solely on automated processing under the GDPR.
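To make the test-suite recommendation concrete, here is a minimal sketch, in Python, of what such an evaluation harness could look like. It is illustrative only: the `filter_decision` callable stands in for whichever copyright filter is under test, and the labeled `Sample` corpus is a hypothetical construct, not an existing dataset or any vendor's API.

```python
# Hypothetical evaluation harness for a copyright filter test suite.
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Sample:
    content: bytes    # the uploaded material
    infringing: bool  # ground-truth label assigned by qualified human reviewers

def evaluate(filter_decision: Callable[[bytes], bool],
             suite: Iterable[Sample]) -> dict:
    """Run the filter over a labeled test suite and report both error rates."""
    false_positives = false_negatives = legit = infringing = 0
    for sample in suite:
        blocked = filter_decision(sample.content)
        if sample.infringing:
            infringing += 1
            if not blocked:
                false_negatives += 1  # infringing upload slipped through
        else:
            legit += 1
            if blocked:
                false_positives += 1  # lawful use (quotation, parody) censored
    return {
        "over_blocking_rate": false_positives / max(legit, 1),
        "under_blocking_rate": false_negatives / max(infringing, 1),
    }
```

Reporting over-blocking and under-blocking separately matters: a filter can look accurate in aggregate while still censoring a substantial share of lawful quotation, parody, and criticism, which is exactly the harm Article 17’s safeguards are meant to prevent.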
The Electronic Frontier Foundation is the leading nonprofit organization defending civil liberties in the digital world. Founded in 1990, EFF champions user privacy, free expression, and innovation through impact litigation, policy analysis, grassroots activism, and technology development. We work to ensure that rights and freedoms are enhanced and protected as our use of technology grows. Visit https://www.eff.org