Podcast Episode: A Better Future Starts with Secret Codes


Podcast Episode 105

Law enforcement wants to force companies to build a backdoor to the software that runs on your phones, tablets, and other devices. This would allow easier access to the information on your device and the information that flows through it, including your private communications with others, the websites you visit, and all the information from your applications. Join EFF’s Cindy Cohn and Danny O’Brien as they talk to Riana Pfefferkorn, a lawyer and research scholar at the Stanford Internet Observatory, about the dangers of law enforcement trying to get these backdoors built and how users’ lives are better without them.

Click below to listen to the episode now, or choose your podcast player:

Listen to the episode: https://player.simplecast.com/0de5ddf8-99a0-4e2c-9855-4cff1fffb1c7


Listen on Google Podcasts | Listen on Apple Podcasts | Listen on Spotify | Subscribe via RSS

More than ever before, users—from everyday people to CEOs to even high-ranking government officials—have troves of personal and work-related information on their devices. With so much data stored by such a wide variety of users, including government officials, why would law enforcement want to create a vulnerability in the devices’ software?

Riana Pfefferkorn guides us toward an internet that prioritizes users over the state, one where individuals can express themselves openly and have safe, private conversations.

Not only could bugs get in through that hole, but it also might send spider cracks out through the rest of the windshield.

In this episode you’ll learn about:

  • The different types of data law enforcement seeks to access, including data “at rest” and data “in transit.”
  • The divide between law enforcement and the national security and intelligence communities over strong encryption and device backdoors.
  • How the First Amendment bears on cryptography and on law enforcement efforts to force companies to build certain code into their software.
  • How strong encryption and device security empower users to voice their thoughts freely.

Riana Pfefferkorn is a Research Scholar at the Stanford Internet Observatory. She focuses on investigating and analyzing the U.S. and other governments’ policies and practices for forcing decryption and/or influencing crypto-related design of online platforms and services via technical means and through courts and legislatures. Riana also researches the benefits and detriments of strong encryption on free expression, political engagement, and more. You can find Riana Pfefferkorn on Twitter @Riana_Crypto.

If you have any feedback on this episode, please email [email protected]. You can find a copy of this episode on the Internet Archive. 

Below, you’ll find legal resources—including links to important cases, books, and briefs discussed in the podcast—as well as a full transcript of the audio.

Resources 

Encryption and Exceptional Access:

Apple and the FBI:

Code as First Amendment Speech:

Transcript:

Riana Pfefferkorn:
The term backdoor is one that the government doesn’t like to use. Sometimes they just want to call it the front door, to just walk right on into your encrypted communications or your devices. But, nevertheless, they tend to prefer phrases like an exceptional access mechanism.
The problem being that when you are building an exceptional access mechanism, it’s a hole.
And so we have likened it to drilling a hole in a windshield, where the windshield is supposed to protect you, but now you have a hole that’s been drilled in the middle of it. Not only could bugs get in through that hole, but it also might send spider cracks out through the rest of the windshield.

Danny O’Brien:
That’s Riana Pfefferkorn. She’s a research scholar at the Stanford Internet Observatory and she’s also a lawyer. We’re going to talk to her today about why backdoors into our devices are a bad idea.

Cindy Cohn:
And we’re also going to talk about a future in which we have privacy while also giving the police the tools they do need and not the ones that they don’t. Welcome to EFF’s How to Fix the Internet.

Cindy Cohn:
Welcome to the show. I’m Cindy Cohn, EFF’s executive director.

Danny O’Brien:
And I’m Danny O’Brien, and I’m a special advisor to EFF.

Cindy Cohn:
Today we’re going to dig into device encryption and backdoors.

Danny O’Brien:
Riana’s been researching forced decryption and the influence the US government and law enforcement have had on technology and platform design. She’ll take us through what is at stake, how we can approach the problem, and what is standing in the way of the solutions. Riana, thanks for joining us.

Riana Pfefferkorn:
Thank you for having me today.

Cindy Cohn:
We’re so excited to have you here, Riana. Of course, as you know, talking about encryption is near and dear to all of our hearts here at EFF. We think most people first encountered the idea of the FBI seeking a backdoor into their devices and information in 2015, when it demanded that Apple build one into the iPhone after a horrible incident in San Bernardino. Now, Apple pushed back with help from a lot of us, both you and EFF, and the government ended up getting the information another way and the case was dropped. Bring us up to speed: what’s happened since then?

Riana Pfefferkorn:
Following the Apple versus FBI dispute in San Bernardino, we saw the near-introduction of a bill by our very own here in California, Senator Dianne Feinstein, together with Senator Richard Burr, that would have imposed penalties on smartphone manufacturers that did not find a way to comply with court orders by unlocking phones for law enforcement.
That was in 2016. That bill was so roundly ridiculed that it never actually even got formally introduced in any committees or anything, much less went anywhere beyond that. Then in the next few years, as law enforcement started being able to, with fair regularity, get into devices the way they had done in the San Bernardino dispute, we saw the debate shift, at least in the United States, from a focus on device encryption to a focus on end-to-end encryption for our communications, for our messages, particularly in end-to-end encrypted chat apps.

Cindy Cohn:
I also remember that there was another incident in Pensacola, Florida a few years ago, where the FBI once again tried to push Apple into building a backdoor. Once again, the FBI was able to get the information without Apple having to hurt the security of the phones. So it seems the FBI can get into our devices by other means. So why do they keep pushing?

Riana Pfefferkorn:
It used to be that the rationale was encryption is wholly precluding us from getting access to evidence. But as it’s become more and more obvious that they can open phones, as in the Pensacola shooting, as in the San Bernardino shooting, the way they speak about it has changed slightly to, “Well, we can’t get into these phones quickly enough, as quickly as we would like.”
Therefore, it seems that now the idea is that it is an impediment to the expeditiousness of an investigation rather than to being able to do the investigation at all. And so, if there were guaranteed access by just being able to make sure that, by design, our devices were made to provide a ready-made backdoor for governments, then they wouldn’t have to go through all of the pesky work of either having to use their own in-house tools in order to crack into phones, as the FBI has done with its own in-house capabilities, or purchase them, or seek out a vendor that has the capability of building exploits to allow them to get into those phones, which is what happened in the San Bernardino situation.

Danny O’Brien:
So I think this leaves people generally confused as to what data on their phones is protected, and from whom. Now, you’ve talked about two things here. One is protecting data that’s on the phone, people’s contacts, stuff like that. Then there’s the content of communications, where you have this end-to-end encryption. But what is the government able to access? Who is the encryption supposed to protect people against? What’s the current state of play?

Riana Pfefferkorn:
We can think about whether data is at rest or if it’s in transit. So when the government seeks to get access to messages as they are live passing over the wire, over the airwaves between two people, that requires them to get a wiretap order that lets them see basically in real time the contents of communications that are going back and forth.
Once those have been received, once they are in the string of chat messages that I have in my chat app on my phone, or among other messages or information that you might have locally on your device, or remotely in the cloud (we could talk about that), then there’s a different mechanism that law enforcement would have to use.
They would need a warrant in order to be able to search your phone and seize data off of it. So we’re looking at two different points in time, potentially, for what might be the same conversation.
In terms of accessibility, I think if your device is encrypted, then that impedes law enforcement from rapidly being able to get into your phone. But once they do, using the third-party or homegrown tools that they have for getting into your phone, then they can see any text messages and conversations that you’ve got, unless you have disappearing messages turned on in the apps that you use, in which case they will have vanished from your particular device, your endpoint.
Whereas if law enforcement wants to get access to end-to-end encrypted communications as they’re in transit on the wire, they’re not going to be able to get anything other than gobbledygook unless they compel the provider to wiretap those conversations for them. And so, we’ve also seen some scattered efforts by US law enforcement to try and force the providers of end-to-end encrypted apps to remove or weaken that encryption in order to enable wiretapping.
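
To make that “gobbledygook” point concrete, here is a minimal sketch using Python’s third-party cryptography package (an illustration of the concept, not anything discussed in the episode): without the key, intercepted ciphertext is just opaque bytes, whether it was captured in transit or pulled off a seized device.

```python
# Minimal sketch: encrypted data is unreadable without the key.
# Assumes: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # held by the endpoints (or stored on the device)
f = Fernet(key)

token = f.encrypt(b"meet me at noon")
print(token)                  # opaque bytes: all an interceptor or thief sees
print(f.decrypt(token))       # b'meet me at noon' -- recoverable only with the key
```

A backdoor, in these terms, is any design change that gives someone other than the endpoints that key, or a way to recover it.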

Danny O’Brien:
So the data that’s stored on the phone, the data that’s encrypted at rest, the idea behind companies like Apple encrypting that is just general protection. So if someone steals my phone, they don’t get what I have, right?

Riana Pfefferkorn:
Yeah. It’s funny, the same heads of police agencies who subsequently got angry at Apple for having stronger encryption used to be mad about the rate at which phones were getting stolen. It wasn’t so much that criminals wanted to steal a several-hundred-dollar hunk of metal and glass. It was what they could get at by easily getting into your phone, before the prevalence of strong default passcodes and stronger encryption on phones.
There was a treasure trove of information you could get. Once you were in somebody’s phone, you could get access to their email, you could get access to anything else they were logged into, or have ways of resetting their logins and getting into those services, all because you’d been able to steal their phone or their iPad or what have you.
And so, the change to making it harder to unlock phones wasn’t undertaken by Apple, or subsequently by Google for Android phones, in order to stick it to law enforcement. It was to cut down on that particular angle of attack, one that criminal rings or hackers or even abusive spouses or family members might use to undermine your interests in the thing that has been called basically an appendage of the human body by none other than our own Supreme Court.

Cindy Cohn:
In some ways it’s the cops versus the cops on this, because the cops who are interested in protecting us from crime in the first place want us to have our doors locked, want us to lock things down so that we’re protected if somebody comes and steals from us. By the way, that’s how most people feel as well.
Whereas the cops who want to solve crimes want to make it as easy as possible for them to get access to information. And so, in some ways, it’s cop versus cop about this. If you’re asking me, I want to side with the cop who wants to make sure I don’t get robbed in the first place. So I think it’s a funny conversation to be in.

Riana Pfefferkorn:
But it’s exactly as you say, Cindy: there are several different components of the government whose interests are frequently at odds when it comes to issues of security and privacy. There is a divide between law enforcement and the national security and intelligence communities when it comes to encryption, where the folks who come out of working at the NSA turn around and say, “We try and push for stronger encryption because we know that one part of our job in the intelligence community is to try and ensure the protection of vital information and state secrets and economic protection and so forth,” as opposed to law enforcement, who have been the more vocal component of government in trying to push for undermining or breaking encryption.
Not only is there this divide between national security and the intelligence community and law enforcement, there’s also a divide between law enforcement and consumer protection agencies, because I think that we find a lot of providers that have sensitive information and incentives to protect it by using strong encryption are in a bind, where on the one hand, they have law enforcement saying, “You need to make it easier for us to investigate people and to conduct surveillance,” and on the other hand, they have state attorneys general, they have the Federal Trade Commission, and other authorities breathing down their necks saying, “You need to be using stronger encryption. You need to be taking other security measures in order to protect your customers’ data and their customers’ data.”

Danny O’Brien:
So the argument seems to be from law enforcement, “Well, okay, stop here. No further. We don’t want this to get any better protected.” What are the arguments on the other side? What are the arguments for not only keeping the protections that we have already, but not stopping and continuing to make this stuff safer and more secure?

Riana Pfefferkorn:
There are a variety of different pain points. We can look at national security interests. There’s the concept of commercial off-the-shelf software and hardware products, where people in the military or people in government are using the same apps and the same phones that you or I use, sometimes with additional modifications to try and make them more secure. But to the extent that everybody is using the same devices, and that includes CEOs and the heads of financial institutions and high-ranking government officials, we want those devices to be as secure right off the line as they can be, given that variety of use cases.
That’s not to say that the interests of average people like you or me aren’t important as well, as we continue to face a growing ransomware pandemic and the other cybersecurity, data breach, and hacking incidents that seem to dominate the headlines.
Encryption isn’t necessarily a cure-all for all of those ills, but nevertheless, the more data we can encrypt in more places, the more difficult it is for attackers to get anything useful in the event that they are able to access information, whether that’s on a device or on a server somewhere.
All of this, of course, has been exacerbated by the COVID-19 pandemic. Now that all of us are at home, we’re doing things over electronic mediums that we previously did face-to-face. We deserve just as much privacy and we deserve just as much security as we ever had when we were having those communications and meetings and doctor appointments and therapist appointments face-to-face.
And so, it’s important, I think, to continue expanding security protections, including encryption, in order to maintain those expectations, now that so much of what we do has had to move online over the past 18 months to protect our own physical health and safety.

Cindy Cohn:
Your answer points out that a focus on just “you have nothing to hide” misses that, for one thing, we’re not all the same. We have very different threats. Some of us are journalists. Some of us are dissidents. Some of us are people who are facing partner abuse.

One way or another, we all have a need to be secure and to be able to have a private conversation these days.

Riana Pfefferkorn:
One of the areas where I think we frequently undermine people’s interests is children: the privacy and speech interests of children, and the idea that children somehow deserve less privacy. We have restrictions, and parents have a lot of control over their children’s lives. But children are tomorrow’s adults. And so, I think there’s also been a lot of concern about not normalizing surveillance of children, whether that’s doing school from home over the laptop, a context, again, that we’ve had over the last 18 months, of surveillance of children who are trying to do schoolwork or attend classes online.
There has been some concern expressed, for example, by Susan Landau, who’s a computer science professor at Tufts, saying we need to not give children the impression, as they become tomorrow’s adults, that we have normalized surveillance and intrusion upon their ability to grow, have private thoughts, and become who they’re going to become, or let them grow up into a world where they think that extremely intrusive level of surveillance is normal or desirable in society.

Danny O’Brien:
“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science, enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

Cindy Cohn:
As long as we’re talking encryption, I want to talk about the First Amendment, because, of course, that is the tool that we used in the 1990s to free up encryption from government regulation, with the observation that cryptography is just applied math and you can’t regulate math without meeting the First Amendment tests. Is the First Amendment playing a role in the conversation today, or how do you think it’ll play as this conversation goes forward?

Riana Pfefferkorn:
It’s a great question, because we have multiple court cases now that recognize that code is speech and is protected speech. To the degree that this comes into play in government efforts to pass a law restricting the ability of providers of devices or communication services to offer strong encryption to their users, or to the degree it comes up in court orders trying to undermine existing strong protections, I think there are First Amendment arguments to be made.
When we were doing the friend-of-the-court briefing in the Apple versus FBI case, that came up multiple times, including in Apple’s own briefing, to say, look, we have a First Amendment right not to be forced to hold out as our own a piece of software that the government’s trying to compel us to write, one that would roll back security protections for this particular phone.
We have a First Amendment right not to be forced to stand by something that we don’t actually stand by. We don’t want to stand by that piece of software. We don’t want to have to pretend like we do.
That’s a little bit of a nuanced point to make, I think. Often when we talk about the First Amendment in the context of the internet, and this goes into current debates around content moderation as well, it becomes really easy to forget that we’re not only talking about the First Amendment rights of end users as people who want to speak and receive information (receiving information also being a First Amendment-protected right). It’s also about the rights of the companies and organizations that build these tools to write and disseminate code as they wish, and not to be forced into cryptographically signing a piece of code that they don’t stand by, such as custom iOS software.
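
As a rough illustration of why compelled signing amounts to compelled vouching, here is a minimal sketch using Ed25519 signatures from Python’s cryptography package (an assumption made for illustration; Apple’s real signing chain is far more involved): a device installs code only if it verifies against the vendor’s public key, so the signature is the vendor saying “we stand by this.”

```python
# Minimal sketch of code signing as vouching. Illustrative only;
# real OS update signing involves certificate chains and more.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

vendor_key = Ed25519PrivateKey.generate()  # private key held only by the vendor
update = b"...compiled OS update..."

signature = vendor_key.sign(update)        # the vendor's endorsement of this code

# What a device does before installing: verify the signature, or refuse to run it.
try:
    vendor_key.public_key().verify(signature, update)
    print("signature valid: install the update")
except InvalidSignature:
    print("unsigned or tampered: refuse to install")
```

That is why the compelled-speech framing has teeth: the signature is the only thing that makes a device treat a piece of code as the vendor’s own.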

Cindy Cohn:
What Apple was saying here is, “Look, we’re selling a secure tool. We are telling people this is a secure tool,” and you’re basically making us into liars.
It’s one thing to think about corporate speech rights in the abstract, but the government forcing a company to lie about the security of its product, even if you’re not a fan of corporations, feels like something the government shouldn’t be able to do. It shouldn’t be able to force me to lie, and it shouldn’t be able to force Apple to lie.

Danny O’Brien:
So one of the things that fascinates me about the idea of compelling backdoors into the software produced by companies like Facebook, with WhatsApp and so forth, is what happens next? We’ve seen this a little bit in the rest of the world, because the UK and Australia have effectively introduced laws like this. But then they’ve had to hold off on actually demanding those backdoors, because the pushback from the companies and from the public has been so huge.
So I guess what I’m asking here is, in the nightmare scenario where somebody does introduce this stuff, what happens? Do people suddenly … does everybody around the world write software that has backdoors in it? Does it really solve the problem that they’re trying to solve here?

Riana Pfefferkorn:
It really does feel like a time machine in some ways, that we would end up maybe going right back to the 1990s when there were two different versions of the Netscape browser. One was domestic-grade crypto and one was export-grade crypto. Do we end up with a world where services have different versions with weaker or stronger encryption, depending on what market they’re being offered in?

It seems like we’re in a place right now where, if you regulate and mandate weaker encryption at one level, the encryption moves to a different level. So, for example, WhatsApp just announced that they are letting users opt in to end-to-end encrypting their message backups. If you don’t trust your cloud service provider not to be somehow scanning for verboten or disfavored content, then you can choose to turn on end-to-end encryption for your WhatsApp backups.

Or will people be scared of having them at all? I think one of the nightmare scenarios is that we go from this place where people finally have means of communicating openly and honestly and sincerely with each other, secure in the knowledge that their communications are protected thanks to end-to-end encryption, or that they can encrypt their devices in such a way that they can’t easily be accessed by others, to a place where instead they get chilled into not saying what they want to say, or not fully realizing and self-actualizing themselves in the way that they would. Which is the grim 1984 version.
But we’ve seen something of that in terms of the research that’s already come out saying that people self-censor their search queries when they think that their search queries are going to somehow be monitored or logged by the government. You could envision a world where if people think they no longer have privacy and security over their communications with each other, or their files that they have on their phone or remotely, that they just stop thinking or saying or doing controversial thoughts or statements. That would be a huge loss.

Cindy Cohn:
So let’s flip it around a little bit because we’re fixing the internet here, not celebrating its brokenness. So, what are the values that we’re going to get if we get this right?

Riana Pfefferkorn:
When we’re talking about data security, I think we often think of it as: does this protect my private thoughts or less popular opinions? But it would protect everything. That’s all the good and bad stuff that people do online.
But I think there will be a side effect of improving the security of everything, from your e-commerce or online banking to the communications that we have. If you are an attorney, I think there’s a lot to be said for having stronger encryption for your communications with your clients and in other privileged contexts, whether that is your online therapist or e-health.
Instead of “move fast and break things,” it’s almost “move fast and fix things,” where encryption has become more and more ubiquitous simply by being turned on by default, choices made by the same large providers that, while they are rightly subject to critique for their privacy practices or antitrust practices or what have you, nevertheless have, because they have such massive user bases, done a lot for security simply by stepping up their game for their users.

Danny O’Brien:
Yeah. We have this phrase at the EFF, which is the tyranny of the defaults, where you get stuck in a particular world, not because you don’t have the freedom to change it, but because everyone gets the default settings which exclude it. It would be great to flip that around in this utopia so that the defaults actually are on the side of the user rather than the people who want to peer into this.

Danny O’Brien:
What discussions would we be having if that was all behind us? What are the subtler debates that you want to get on to and that we would have in our beautiful utopia?

Riana Pfefferkorn:
I mean one thing would be just what does a world look like where we are not privileging and centering the interests of the state above all others? What does it look like to have an internet and devices and the digital environment that centers users and individual dignity? What does that mean when individual dignity means protection from harm or protection from abuse? What does it mean when individual dignity means the ability to express yourself openly, or to have privacy in your communications with other people?
Right now, I think we’re in a place where law enforcement interests always get centered in these discussions. I think also, at least in the United States, there’s been a dawning recognition that the state is not necessarily the one that has a monopoly on public safety, on protection, on justice, and in fact has often been an exponent of injustice and less safety for individuals, particularly people from communities of color and other marginalized groups.
And so, if we’re looking at a world that strikes the right balance of safety, freedom, liberty, and dignity, there are a lot of different directions you could go, I think, in exploring what that means, directions that do not play into old assumptions that it means total access by police or other surveilling authorities to everything that we do.

Cindy Cohn:
Oh, what a good world. We’re still safe. We have safety. As I’ve said for a long time, you can’t surveil yourself to safety. We’ve recognized that, and we’ve shifted toward how we give people the privacy and dignity they need to have their conversations. I think I’d be remiss if I didn’t point out that police solved crimes before the internet. They solve crimes now without breaking encryption. And I think we said this at the beginning: it’s not like this is blocking police. It might be making things just slightly slower, but at the cost of, again, our security and our dignity.
So I think in our future, we still solve crimes. I would love to have a future without crimes, but I think we’re going to have them. But we still solve crimes. But our focus is on how do we empower users?

Riana Pfefferkorn:
Right. I think it’s also easy in these discussions to fall into a technological-solutionism mindset, where it’s not only about more police powers for punitive and investigative purposes, or more data being collected or more surveillance being conducted by the companies and other entities that provide these media and devices to us, but also about the much harder societal questions: how do we fix misogyny and child abuse?

How do we have economic safety and environmental justice and all of these other things? Those are much harder questions, and we can’t just expect a handful of people in Silicon Valley or a handful of people in DC to solve our way out of them.
I think it almost makes the encryption debate look like the simpler avenue by comparison. Looking solely toward technological and surveillance-based answers allows the illusion of solving those harder questions about how to build a better society.

Cindy Cohn:
I think that’s so right. We circle back to why encryption has long been interesting to those of us who care about making the world better: if you can’t have a private conversation, you can’t take the first step toward making the change you need to make in the world.

Well, thank you so much for coming on our podcast, Riana. It’s just wonderful unpacking all of this with you. We’re huge fans over here at EFF.

Riana Pfefferkorn:
Oh, the feeling is mutual. It’s been such a pleasure. This has been a great conversation. Thanks for having me.

Danny O’Brien:
Well, as ever, I really enjoyed that conversation with one of the key experts in this area, Riana Pfefferkorn. One of the things I liked is that we touched on some of the basics of this discussion about government access to communications and devices, which is really this idea of the backdoor.

Cindy Cohn:
Yeah, but it’s just inevitable, right? I love the image of a crack in the windshield. I mean, once you have the crack in there, you really can’t control what’s going to come through. You just can’t build a door that only good guys can get through and bad guys can’t. I think that came through really clearly.
The other thing that became really clear in listening to Riana is how law enforcement is a bit alone in this. As she pointed out, the national security folks want strong security; they want it for themselves and for the devices they rely on when they buy them off the shelf. And consumer protection agencies want strong security in our devices and our systems, because we’ve got this cybersecurity nightmare going on right now with data breaches and other kinds of attacks.
And so, all of us who want strong security are standing on the one side, with law enforcement really the lone voice on the other side, wanting us to have weaker security. It really does make you wonder why we keep having this conversation, given how lopsided the two sides are.

Danny O’Brien:
What did you think of the First Amendment issues here? Because I mean you pioneered this analysis and this protection for encryption, that code is speech and that trying to compel people to weaken encryption is like compelling them to lie, or at least compelling them to say what the government wants. How does that fit in now, do you think, based on what Riana was saying?

Cindy Cohn:
Well, I think it’s not central to the policy debates, and a lot of this is policy debates. It becomes very central when you start writing these things down into law, because then you’re starting to tell people you can code like this but you can’t code like that, or you need government permission to be able to code in a particular way.
Then that’s where we started in the Bernstein case. I think, once again, the First Amendment will end up being a backstop to some of the things that law enforcement is pushing for here that end up really trying to control how people speak to each other in this rarefied language of computer code.

Danny O’Brien:
We always like talking about the better future that we can get to on the show. I liked Riana’s couching of that in terms of, first of all, the better future happens when we finally realize this conversation is going around in circles and there are more important things to discuss, like actually solving those problems, problems that are really deep and embedded in society that law enforcement is really chasing after.
I like the way that she conveyed that the role of technology is to really help us communicate and work together to fix those problems. It can’t be a solution in its own right.
It’s not often that people manage to successfully convey that, because to people outside, I think, it all looks like tech solutions, and there are just some things it works for and some it doesn’t.

Cindy Cohn:
Yeah. I really appreciated the vision she gave us of what it looks like if we get this all right. That’s the world I want to live in. Thank you so much to Riana Pfefferkorn for coming on the show and giving us her vision of the utopia we could all have.

Danny O’Brien:
And thanks to Nat Keefe and Reed Mathis of Beat Mower for making the music for this podcast. Additional music is used under a Creative Commons license from ccMixter. You can find the credits for each of the musicians and links to the music in our episode notes.

Please visit eff.org/podcasts, where you’ll find more episodes, learn about these issues, and can donate to become a member, as well as lots more. Members are the only reason we can do this work. Plus, you can get cool stuff like an EFF hat, an EFF hoodie, or even an EFF camera cover for your laptop.

“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology.

I’m Danny O’Brien.

Cindy Cohn:
And I’m Cindy Cohn.

 

