Robots Have No Place Filtering Creative Content, EFF Tells U.S. Copyright Office

Software robots should not be deciding whether your creative content, be it written words, videos, photos, or music, ought to be pulled off the internet.

That’s what we told the U.S. Copyright Office in comments we filed February 8 arguing against requiring service providers to embrace “standard technical measures” to address copyright infringement. While some technologies can be useful for flagging potential infringement, technical measures such as automated filters are dangerous because they cannot reliably sort lawful from unlawful expression. They can’t tell whether someone has a license to use a copyrighted work, recognize when someone is making fair use of a work, tell the difference between two versions of the same work, or know that targeted material is in the public domain.

The Copyright Office is collecting information on the use of standard technical measures ahead of a series of consultations with the public and “industry sectors,” including a February 22 plenary session. Any such consultations should start by understanding the problems with many of the technical measures that already exist—and filters are a prime example. You can find our comments here.

Examples of egregiously erroneous flags abound; our filing offers just a few, including a 12-second loop of a cat purring that YouTube’s Content ID mistakenly matched to content owned by EMI Publishing and PRS; blocking decisions targeting classical music because Facebook’s filters can’t tell the difference between two different performances of the same public domain work; and the automated blocking of a NASA video of the Curiosity rover’s landing on Mars.

These system failures have real economic as well as expressive consequences: internet creators may lose their channels, and people trying to use social media to promote their businesses will be cut off from a crucial venue. For example, an independent club owner in San Francisco tried to run a promotional video given to him for a band he’d booked, only to find Instagram’s filter blocked it. He was able to reinstate the video, but only 17 months after the band had come and gone.

The Copyright Office needs to hear the real problems with technical approaches to infringement, especially automated filters, so that it understands the dangers of allowing robots to shape online expression. As we also point out, it needs to consider how any requirement to implement filtering or other technical measures could distort the market for internet services by giving an advantage to service providers with sufficient resources to develop and/or implement costly filtering systems. This stifles innovation, robs internet creators of new platforms on which to show their work, and ensures that the current tech giants will stay comfortably entrenched.

The Copyright Office must also consider how any mandated technical measures could affect internet security, particularly if they are imposed on providers of internet infrastructure and basic access, such as internet service providers (ISPs). Indeed, given all of these potential consequences, the Copyright Office should consider whether it is the right agency to be convening this consultation in the first place.

The internet is essential to education, family, employment, politics, civics, charity, romance, and so much more (including entertainment). Any mandated standard technical measures will necessarily affect these activities. The desire of some copyright holders for more technical measures to fight infringement must be balanced against the interests of a wide and diverse range of public stakeholders whose internet experiences and services will be affected and whose rights to innovate and create are protected by copyright laws.

