Once Again, Facebook Is Using Privacy As A Sword To Kill Independent Innovation

Facebook claims that its role as guardian of users’ privacy gives it the power to shut down apps that give users more control over their own social media experience. Facebook is wrong. The latest example is its legal bullying of Friendly Social Browser.

Friendly is a web browser with plugins geared towards Facebook, Instagram, and other social media sites. It’s been around since 2010 and has a passionate following. Friendly offers ad and tracker blocking and simplifies downloading photos and videos. It lets users search their news feeds by keyword or reorder them chronologically, and it can display Facebook pages with alternative “skins.”
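
To make concrete what this kind of client-side customization looks like, here is a minimal, hypothetical sketch in the style of a browser user script. It is not Friendly’s actual code; the selectors and the `data-timestamp` attribute are assumptions for illustration only. Everything it does happens in the user’s own browser, on a page the user has already received.

```ts
// Hypothetical user-script sketch: client-side feed customization.
// The selectors below ("[data-sponsored]", "[data-feed]", "[data-feed-item]")
// and the "data-timestamp" attribute are made up for illustration; a real
// tool would have to match whatever markup the site actually serves.

// Hide posts marked as sponsored. The blocking happens entirely locally,
// on the copy of the page already delivered to the user.
document.querySelectorAll<HTMLElement>("[data-sponsored]").forEach((post) => {
  post.style.display = "none";
});

// Reorder the remaining feed items chronologically, newest first.
const feed = document.querySelector<HTMLElement>("[data-feed]");
if (feed) {
  const items = Array.from(
    feed.querySelectorAll<HTMLElement>("[data-feed-item]")
  );
  items
    .sort(
      (a, b) =>
        Number(b.dataset.timestamp ?? "0") - Number(a.dataset.timestamp ?? "0")
    )
    .forEach((item) => feed.appendChild(item)); // re-appending moves nodes in place
}
```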

To Facebook’s servers, Friendly is just a browser like any other. Users run Friendly much as they would Google Chrome, Mozilla Firefox, or any other standard web browser. According to Friendly, its software doesn’t call any Facebook or Instagram developer interfaces (APIs). Friendly has also stated that it doesn’t collect any personal information about users, including posts or uploads. Friendly does collect some anonymous usage data, and it sends the ads that people view to a third-party analytics firm.
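
A hedged way to see why “just a browser” matters: at the network level, loading a page is an ordinary HTTP request, not a call to a developer API such as Facebook’s Graph API. The sketch below, written as a generic fetch of a page, is purely illustrative; the headers are typical of any browser, and nothing in it is Friendly’s actual code.

```ts
// Illustrative only: at the HTTP level, a page request from any browser,
// whether Chrome, Firefox, or an alternative, boils down to a GET for a
// web page. No developer API is involved; the server just serves HTML.
const response = await fetch("https://www.facebook.com/", {
  headers: {
    // Typical browser-style request headers; generic example values,
    // not anything specific to Friendly.
    "Accept": "text/html,application/xhtml+xml",
    "Accept-Language": "en-US,en;q=0.9",
  },
});
const html = await response.text(); // the same HTML any browser would render
console.log(html.length);
```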

Over the summer, Facebook’s outside counsel demanded that Friendly stop offering its browser. Facebook’s lawyer claimed that Friendly violated Facebook’s terms of service by “chang[ing] the way Facebook and Instagram look and function” and “impairing [their] intended operation.” She claimed, incorrectly, that violating Facebook’s terms of service was also a violation of the federal Computer Fraud and Abuse Act (CFAA) and its California counterpart.

Although Friendly explained to Facebook’s lawyers that its browser didn’t access any Facebook developer APIs, Facebook hasn’t budged from its demand that Friendly drop dead.

Today, EFF sent Facebook a letter challenging Facebook’s legal claims. We explained that the CFAA and its California counterpart are concerned with “access” to a protected computer:

California law defines “access” as “to gain entry to, instruct, cause input to, cause output from, cause data processing with, or communicate with” a computer. Friendly is a web browser, so it is our understanding that Friendly does not itself “gain entry to” or “communicate with” Facebook in any way. Like other popular browsers such as Google Chrome or Mozilla Firefox, therefore, Friendly does not “access” Facebook; Facebook users do. But presumably Facebook knows better than to directly accuse its users of being malicious hackers if they change the colors of websites they view.

While EFF is not representing Friendly at this time, we weighed in because Facebook’s claims are dangerous. Facebook is claiming the power to decide which browsers its users can use to access its social media sites, an extremely broad claim. According to the reasoning of Facebook’s demand, accessibility software like screen readers, magnifiers, and tools that change fonts or colors to make pages more readable for visually impaired people all exist by Facebook’s good will, and could be shut down anytime if Facebook decides they “change the way Facebook and Instagram look and function.”

Friendly is far from the only victim of the company’s strong-arming. Just last month, Facebook threatened the NYU Ad Observatory, a research project that recruits Facebook users to install a plugin to collect the ads they’re shown. And in 2016, Facebook convinced a federal court of appeals that the CFAA barred a third-party social media aggregator from interacting with user accounts, even when those users chose to sign up for the aggregator’s service. In sum, Facebook’s playbook—using the CFAA to enforce spurious privacy claims—has made it harder for innovators, security experts, and researchers of all stripes to use Facebook in their work. 

Facebook has claimed that it must bring its legal guns to bear on any software that interoperates with Facebook or Instagram without permission, citing the commitments that Facebook made to the Federal Trade Commission after the Cambridge Analytica scandal. But there are different kinds of privacy threats. Facebook’s understandable desire to protect users (and its own reputation) against privacy abuses by third parties like Cambridge Analytica doesn’t take away users’ right to guard themselves against Facebook’s own collection and mishandling of their personal data by employing ad- and tracker-blocking software like Friendly (or EFF’s Privacy Badger, for that matter).

Nor do Facebook’s privacy responsibilities justify stopping users from changing the way they experience Facebook, and choosing tools to help them do that. Attempts to lock out third-party innovators are not a good look for a company facing antitrust investigations, including a pending lawsuit from the Federal Trade Commission.

The web isn’t television. Website owners might want to control every detail of how their sites look and function, but since the very beginning, users have been in control of their own experience; it’s one of the defining features of the web. Users can choose to rearrange the content they receive from websites, save it, send it along to others, or ignore some of it by blocking advertisements and trackers. The law can’t stop users from choosing how to receive Facebook content, and Facebook shouldn’t be trying to lock out competition under the guise of protecting privacy.

