EFF Responds to EU Commission on the Digital Services Act: Put Users Back in Control

The European Union is currently preparing for a significant overhaul of its core platform regulation, the e-Commerce Directive. Earlier this year the European Commission, the EU’s executive, pledged to reshape Europe’s digital future and to propose an entire package of new rules, the Digital Services Act (DSA). The package is supposed to address platforms’ legal responsibilities for user content and to include measures that keep users safe online. The Commission also announced a new standard for large platforms that act as gatekeepers, in an attempt to create a fairer and more competitive market for online platforms in the EU.

Preserve What Works

While the European Commission has not yet published its proposal for the DSA, the current preparatory phase is an important opportunity to expose the Commission to diverse insights on the complex issues the DSA will cover. Alongside our European partners, we have therefore contributed to the Commission’s consultation, which will feed into its assessment of the different regulatory options available. In our response, we remind the Commission of some of the aspects of the e-Commerce Directive that have been crucial for the growth of the online economy and the protection of fundamental rights in the EU: it is essential to retain the Directive’s approach of limiting platforms’ liability for user content and of banning Member States from imposing obligations to track and monitor users’ content.

Fix What Is Broken

But the DSA should not only preserve what was good about the old Directive. It is also a chance to boldly imagine a version of the Internet where users have a right to remain anonymous, enjoy substantial procedural rights in the context of content moderation, and have more control over how they interact with content. That should include measures to make the use of algorithms more transparent, but it must also allow people to choose for themselves whether they want algorithms to curate their feeds at all. Beyond giving users the rights and options they deserve, it is time to rethink the Internet more fundamentally. That’s why we propose interoperability obligations for large platforms. Flanked by strong privacy and security safeguards, a European commitment to interoperability could empower users to shape their online environments according to their needs and preferences, allow people to connect with each other beyond the walled gardens of the largest platforms, and reinvigorate the digital economy.

Have Your Say Too

The consultation remains open until 8 September, and we invite you to join us in our call for an open and safe Internet that empowers users. You can submit your comments to the European Commission’s public consultation here.

Our main demands regarding interoperability are:

  1. Platforms with significant market power must offer competing, non-incumbent platforms non-discriminatory ways to interoperate with their key features;
  2. Platforms with significant market power should make it possible for competing third parties to act on users’ behalf. If users want to, they should be able to delegate elements of their online experience to different competent actors;
  3. Interoperability measures must respect key privacy principles such as data minimization, privacy by design, and privacy by default;
  4. If intermediaries have to suspend interoperability to fix security issues, they should not exploit such situations to break interoperability, but should instead communicate transparently, resolve the problem, and reinstate their interoperability interfaces within a reasonable and clearly defined timeframe.

Our main demands regarding platform liability are:

  1. Online intermediaries should not be held liable for user content and should continue to benefit from the comprehensive liability exemptions contained in the e-Commerce Directive;
  2. It should be clarified that intermediaries obtain actual knowledge of illegality only when they are presented with a court order;
  3. The Member States of the EU should not be permitted to impose obligations on digital service providers to affirmatively monitor their platforms or networks for illegal content that users post, transmit, or store. The ban on general monitoring obligations should include a ban on mandated automated filter systems;
  4. The Internet is global, and takedown orders of global reach are immensely unjust and impair users’ freedom. New rules should ensure that court orders, particularly injunctions, are not used to superimpose the laws of one country on every other state in the world.

Our main demands regarding user controls are:

  1. Users of social media platforms with significant market power should be empowered to choose content they want to interact with in a simple and user-friendly manner, and should have the option to decide against algorithmically-curated recommendations altogether;
  2. Online platforms should provide meaningful information about the algorithmic tools they use in content moderation and content curation. Users need easily accessible explanations to understand when, for which tasks, and to what extent algorithmic tools are used. Online platforms should also allow independent researchers and relevant regulators to audit those tools to make sure they are used as intended;
  3. Users should be notified whenever the rules that govern them change, asked for their consent, and informed of the consequences of their choice. They should also be provided with a meaningful explanation of any substantial changes in a language they understand;
  4. The Digital Services Act should affirm users’ informational self-determination and introduce a European right to anonymity online.

Our main demands for procedural justice are:

  1. The EU should adopt harmonized rules on reporting mechanisms to ensure that reporting potentially illegal content is easy and that any follow-up action by the platform is transparent to its users;
  2. When content has been removed, platforms should provide users with a notice that identifies the removed content, the specific rule it was found to violate, and how the content was detected. The notice should also offer an easily accessible explanation of the process through which the user can appeal the decision;
  3. If platforms use automated decision making to restrict content, they should flag at which step of the process algorithmic tools were used, explain the logic behind the automated decisions taken, and also explain how users can contest the decision;
  4. The Digital Services Act should promote quick and easy reinstatement of wrongfully removed content or wrongly disabled accounts.
