Pride Resources for Activism in Digital and Physical Spaces


In June, people honor one of the key events that ushered in the era of LGBTQIA+ Pride—Stonewall—during which Black and Brown trans and queer people led a riot in direct response to police brutality. This year, Pride occurs during national and global protests over the continued murder of Black people, protests that highlight disparities around race, gender, ability, and identity, with people at these intersections experiencing particular stress, such as the unprecedented dangers facing Black trans women. In this moment of solidarity with Black, trans, and disabled activists demanding justice for the killings of Tony McDade, Layleen Cubilette-Polanco, George Floyd, Breonna Taylor, and others from historically targeted communities, we are sharing resources to help activists and others protect their digital security. While there are known and established practices for protecting people against physical threats and harassment during strictly in-person gatherings, digital gatherings involve a different set of security considerations. This guide offers an overview and further reading that, we hope, will help activists think about how to adapt their work as digital considerations increasingly touch physical spaces.

This guide is divided into three sections: Identification and Harassment; Security for Physical Spaces, such as Protests and Rallies; and Considerations for Digital Spaces, such as Posting on Social Media and Digital Gatherings.

Preface:

Please note that the authors of this post write primarily from U.S.-based experiences and language. This blog post is by no means comprehensive, and it should be noted that digital security risks (such as the surveillance equipment available), and their mitigation, can vary depending on your location and other contexts. COVID-19 responses will also be relevant; we do not cover physical security or health considerations, such as caring for one another, wearing masks, and social distancing. Please consult health organizations like the CDC, as well as Black disability-centered and trans-centered resources, for tips on being conscientious. Additionally, your individual assessment and approach may be highly dependent on your unique risks. For example, for some people, being identified during their activism is less of a risk, while for others, being identified would cause significant stress and/or put them at risk of reprisal from employers, co-workers, family, or strangers.

Identification and Harassment

One of the consistent themes that we discuss in our digital security work is how to protect your identity—whether that’s using a different name, preventing association while engaging in protests, deliberately complicating the data used to track you, or taking and sharing photos safely. Unfortunately, folks in targeted communities are painfully aware of the need to protect their identities: from threats like doxxing (publishing private or identifying information about you on the Internet); surveillance by law enforcement; corporate surveillance; and/or digital, financial and physical harassment. 

Let’s imagine one journey: you attend a protest with friends and take photos, and then you post the photos to a platform like Twitter, using a hashtag to promote their visibility. What security issues might you consider? 

Security for Physical Spaces, such as Protests and Rallies

In order to assess how best to defend against surveillance at public gatherings, it’s important to know what surveillance tools are being used. At EFF, we spend a lot of time writing and researching law enforcement surveillance. For information on what these surveillance technologies look like, check out our post on Identifying Visible (and Invisible) Surveillance at Protests.

Here’s a short overview of problems presented by relevant technologies during a protest, and some ways people mitigate against these problems:

ALPR atop a police vehicle. Photo by Mike Katz-Lacabe (CC BY)

Automated License Plate Readers, or ALPRs—ALPRs are computer-controlled cameras, frequently mounted on police cars and street poles, that scan and track the license plates of vehicles in a given area. They record the plate, time, date, and location for every vehicle that passes the camera. This is why many people opt not to drive a car to attend large in-person gatherings. You can see our Street-Level Surveillance resource for a primer on how ALPRs work.

Image recognition technologies—Facial recognition, iris recognition, and tattoo recognition technologies are used by law enforcement to analyze large image sets of people in public spaces (such as footage from networked camera systems), as well as images on social media accounts. This is a reason why some people prefer to cover their faces and tattoos or otherwise limit identifiers, and one reason why people may bring a change of clothing. Biased algorithms in image recognition technologies further exacerbate the experience of already-surveilled communities, such as through the use of gender-identification facial recognition. For more information on image recognition technologies, check out our educational resource on Street-Level Surveillance.

Two faces being compared side-by-side on a computer. Facial recognition image source: Arizona Department of Transportation

Social media scraping and image recognition technologies are also a reason why citizen photographers ask for explicit consent when taking photos of protesters, and why they photograph people in less identifying ways (such as from behind, or in a wide crowd shot).

For specific scenarios, like ethical documentation of police brutality, read EFF’s post on First Amendment protections for filming the police and tips for journalists, as well as these tips from WITNESS.

Credit and debit cards—Credit cards serve as persistent, long-term identifiers, which is why some people prefer to carry cash when attending a nearby gathering. This information can also be connected to other identifiers, like tickets for public transit.

Phones—One big consideration is the little computers we carry with us: our mobile devices. For protests, phones can be an important tool for documentation, communication, and broadcasting; however, without proper protections, they can also be used by different groups—ranging from third parties to law enforcement to app companies—to track people. Mobile phone surveillance has a few overlapping areas of concern, relating to the hardware of the physical device, whether features like Bluetooth, Wi-Fi, and Location Services are on, what types of encryption are used, and so on. People make different choices based on who they believe may be surveilling them.

International Mobile Subscriber Identifier (IMSI) numbers are unique identifiers that point to the subscriber themselves, and they are shared with your cell provider every time you connect to a cell tower. As you move, your phone sends out pings to nearby towers to request information about the state of the network. Your phone carrier can use this information to track your location (to varying degrees of accuracy). There’s also an added risk—devices called cell-site simulators pretend to be network towers, and are employed by law enforcement to identify IMSIs in a given geographic area. 

For a primer on mobile device identifiers, check out this section of our illustrated third-party tracking white paper. For more information on how cell-site simulators work, check out EFF’s post on Cell Phone Surveillance at Protests or check out our illustrated white paper.

It’s for these reasons that many people opt to leave their phones at home, bring a burner device, or set their phones to airplane mode. Yet for many people, a phone may be their primary or only technology. Your choices depend on assessing your risks and needs, and making the choice that’s right for you.

While we’re on the topic of mobile devices, there are a few other risks to be aware of:

Unfortunately, there are significant examples of phones being seized. Many people choose to simply assume that their phone will be taken and unlocked. They prepare for that event by, for example, logging out of accounts, uninstalling apps, disabling app notifications, taking photos without unlocking their device, and setting their photos to back up automatically. These considerations are especially important for folks who may be concerned about their app activity in precarious legal contexts. Law enforcement has been known to use the presence of an app to target people (e.g., gay dating apps), and any app content seen through notifications or within the app can be used as grounds for greater scrutiny or as evidence of illegal activity (e.g., any client screening or accounting information a sex worker stores on their phone). You can read more in our Surveillance Self-Defense guide, The Problem with Mobile Phones.

For specific considerations on digital security during a protest, please read our comprehensive Surveillance Self-Defense guide on Attending a Protest for in-depth tips, as well as our companion post to this guide on protests during COVID-19.

Your device, your network

With social movements, people may encounter censorship through passive or intentional network disruption. In various countries, governments have cracked down on people’s use of mobile phones by slowing or shutting down the internet during large gatherings, though it can be difficult to figure out how a network is being disrupted while it’s happening. In the US, internet shutdowns are less of a possibility: network overload is more probable, due to the unusually large number of people in a concentrated area connecting to nearby cell service at once. Regardless of the cause, the effect is the same: network disruption inhibits quick information sharing. Given this risk, it may be helpful to create a plan for backing up images and videos and sharing them later, on a faster connection.
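One low-tech way to act on that plan is a small script that copies new photos and videos into a staging folder so they can be uploaded later over a reliable connection. Below is a minimal sketch in Python, using only the standard library; the folder names are hypothetical placeholders, not part of any particular tool or workflow.

```python
# Minimal sketch: stage new media files for uploading later, over a faster connection.
# Uses only the Python standard library. Folder names are hypothetical placeholders.
import shutil
from pathlib import Path

MEDIA_DIR = Path.home() / "protest-media"    # hypothetical: where your camera app saves files
STAGING_DIR = Path.home() / "upload-later"   # hypothetical: files to share once you have a good connection

STAGING_DIR.mkdir(parents=True, exist_ok=True)

for src in MEDIA_DIR.rglob("*"):
    if src.is_file():
        dest = STAGING_DIR / src.relative_to(MEDIA_DIR)
        dest.parent.mkdir(parents=True, exist_ok=True)
        if not dest.exists():
            shutil.copy2(src, dest)  # copy only files not already staged
            print(f"staged {src.name}")
```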

For more information on how censorship and network connectivity work, check out our Surveillance Self-Defense guide on Understanding and Circumventing Network Censorship, and my colleague’s post on cell phone surveillance at protests.

In other cases, especially for LGBTQIA+ communities of color and sex worker communities, censorship may occur on the platforms they use. This can take the form of censored social media posts and hashtags, as documented in our Online Censorship project. Additionally, some communities worry that platforms may “shadowban” content that’s deemed illicit or inappropriate. Though there is little evidence of deliberate platform shadowbanning, a Facebook patent has heightened these concerns. In any event, there is ample evidence of private platform censorship as a general matter.

In the next section, we’ll explore considerations for digital spaces, like sharing images from a protest, or creating digital spaces to come together.

Considerations for Digital Spaces, such as Posting on Social Media and Digital Gatherings

If you are concerned about protecting your digital information while participating in digital spaces and online activism, consider the following issues.

The Data You Include (or Don’t Mean to Include)—Posting content can be risky. For example, you might accidentally out someone’s participation in a protest or online gathering by mere association. Another thing to consider is that LGBTQIA+-specific digital and physical gatherings provide space not only for those who are out, but also for those who are not yet out to their families, workplaces, and so on. That is why many people are careful about their digital associations, such as being mindful of which account they’re logged into when RSVPing to events, or being careful not to be tagged in photos and posts. In many contexts (and particularly for immigrants and people who have to deal with punitive laws in their countries), it could be quite dangerous for someone to be recognized in photos with LGBTQIA+ symbols.

Be mindful of the data you share, particularly as it relates to other people; it’s good practice to ask for consent when including other people in your posts. This is one reason citizen photographers obscure people’s faces when posting pictures of protests online (for example, using Signal’s in-app camera blur feature).
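For readers curious about doing this outside of an app, here is a minimal sketch of obscuring a region of a photo before sharing it. It assumes the third-party Pillow imaging library, and the file name and coordinates are hypothetical; an opaque box is generally safer than a light blur, since weak blurring can sometimes be partially undone.

```python
# Minimal sketch: obscure a region of a photo before sharing it.
# Assumes the third-party Pillow imaging library (pip install Pillow).
from PIL import Image, ImageDraw, ImageFilter

img = Image.open("protest.jpg")      # hypothetical input file
box = (420, 150, 620, 380)           # hypothetical (left, upper, right, lower) region containing a face

# Heavy blur of the region...
blurred = img.crop(box).filter(ImageFilter.GaussianBlur(radius=25))
img.paste(blurred, box)

# ...or, more robustly, paint an opaque rectangle over it instead:
# ImageDraw.Draw(img).rectangle(box, fill="black")

img.save("protest_obscured.jpg")
```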

When posting, be mindful of metadata—additional information included along with your data that provides more details about your situation. The metadata of the videos and photos you post, as well as virtual “check-ins,” can include sensitive information such as the location, the time the photo or video was taken, the equipment used, and so on. Make sure that location sharing is not enabled and that you are not including sensitive metadata when posting videos from scenes like protests. For a good example of the EXIF data that accompanies photos and videos, check out Freedom of the Press Foundation’s primer on media metadata and their piece on redacting photos.
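To see what actually rides along with a photo, and to publish a copy without it, here is a minimal sketch that again assumes the third-party Pillow library (dedicated tools like ExifTool, or the workflows in the linked primers, do the same job); the file names are hypothetical.

```python
# Minimal sketch: inspect a photo's EXIF metadata, then save a copy without it.
# Assumes the third-party Pillow library (pip install Pillow).
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("protest.jpg")  # hypothetical input file

# Print whatever EXIF tags the file carries (timestamps, GPS data, camera model, ...)
for tag_id, value in img.getexif().items():
    print(TAGS.get(tag_id, tag_id), value)

# Re-create the image from raw pixels only, so no metadata is carried over
clean = Image.new(img.mode, img.size)
clean.putdata(list(img.getdata()))
clean.save("protest_no_metadata.jpg")
```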

Public, Private—Social media settings on your accounts can be helpful for controlling an account’s visibility. Switching an account to a private mode, limiting comments, or using platform-provided tools such as blocking and reporting mechanisms can provide some protection. You can follow Surveillance Self-Defense’s tips for protecting yourself on social media networks.

A tricky consideration is that your account’s visibility may increase as you engage in activism. Keep this in mind when sharing content that features other people—they may have security considerations that aren’t immediately apparent, and what might be a safe activity for you may open someone else up to new and unnecessary targeting. For example, asking for consent before mentioning or tagging a friend in a public post is a considerate practice.

Being Sensitive to Misinformation—Unfortunately, it’s a pressing and difficult social challenge to recognize misinformation tactics, such as videos or sock puppet accounts that use AI-generated faces to seem legitimate, or massive harassment campaigns targeting people based on their identities. A number of social media services have content moderation systems; however, as marginalized communities are particularly aware, these systems can be weaponized by those with ill intent. Misinformation is a shifting space—Data & Society has research on identifying and mitigating against misinformation, which can be helpful for folks looking to recognize patterns in how misinformation tactics are used.

The Names You Use and the Spaces You’re In—Many LGBTQIA+ creators are already skilled at managing identities in physical spaces. To mitigate against harassment and doxxing, it’s helpful to compartmentalize digital identities as well. For an excellent primer on LGBTQIA+ considerations for managing these identities, read Norman Shamas’s post on the topic.

This might mean taking stock of the different services you use, which usernames and passwords are used for those services, and where your names are mentioned. A task that’s particularly tedious is cutting the tie between a legal birth name and a chosen name. For trans folks, removing all associations between chosen names and dead names is especially fraught. Legal name changes and their difficulties—such as the legal requirement to publish a name change—are one component; data brokers are another that makes this task incredibly daunting. If you’re doing this yourself or with friends, or are considering paid options for opting out of data broker websites, journalist Yael Grauer has an updated document of resources.

It’s incredibly stressful to be on the receiving end of harassment, which is why some people choose to get the support of friends who can help them moderate comments and make appropriate choices. Where possible, consider taking steps to mitigate against online harassment as a pre-emptive measure (for example, as part of your plan before going to a protest), rather than as a reactive step. For more information on mitigating against online harassment, check out Access Now’s guide on Self-Doxxing, or Feminist Frequency’s guide, Speak Up and Stay Safe(r).

One opportunistic way people gain access to accounts is to look for a password leaked in a major breach and try it on other services to see whether it’s been reused—this is why it’s important to use unique, random, and long passwords for every service. For tips on how to make stronger passwords, as well as how to keep track of all these unique passwords and accounts, check out our Surveillance Self-Defense guide on password managers.
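A password manager will generate and remember these passwords for you, but as a rough sketch of what “unique, random, and long” means in practice, the snippet below uses Python’s standard secrets module; the word list file is a hypothetical placeholder (for example, one of EFF’s published Diceware word lists).

```python
# Minimal sketch: generate long random passwords and Diceware-style passphrases
# with Python's standard library. In practice, a password manager does this for you.
import secrets
import string

def random_password(length=24):
    # A long, random password for a single service; never reuse it anywhere else.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def random_passphrase(wordlist, words=6):
    # A passphrase that's easier to memorize, e.g. for a password manager's master password.
    return " ".join(secrets.choice(wordlist) for _ in range(words))

if __name__ == "__main__":
    print(random_password())
    # "wordlist.txt" is a hypothetical file with one word per line.
    with open("wordlist.txt") as f:
        words = [line.strip() for line in f if line.strip()]
    print(random_passphrase(words))
```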

The Tools You Use—As part of the work of compartmentalizing digital identities to mitigate against harassment, you may want to consider the specific use cases for your devices, online accounts, and the browsers you use to access those accounts. For example, say you have a performance persona that’s geared toward a wide audience, and a private profile you use among friends. You might be careful to avoid cross-contamination of data between these profiles, like not reusing photos. Someone looking to compartmentalize their social media identities might only access their public performance account using a VPN and a specific browser, clearing the session afterwards, and use a different browser on a different device on their regular network for their personal, private account.

Adding Barriers—Mitigating against someone who might be intent on targeting you with harassment is exhausting, and can feel like trying to outrun a bear. You can add barriers to make it harder to disrupt your online life, such as by adding passwords to ordinarily password-less services. For example, if you are running a large video call, consider enabling security settings and creating a process to remove opportunistic people disrupting the call. We have a guide on hardening security settings in Zoom, which might be useful for folks holding large virtual celebrations or actions on a video conferencing platform.

Additionally, if two-factor authentication is available, use it, as it provides extra protection for your accounts. Two-factor authentication combines something you know with something you have: this means an adversary doesn’t just need to guess or obtain your password; they’d also have to physically obtain something else to get access to your account. Where possible, use app-based or hardware-based two-factor authentication rather than SMS-based. Learn more about how to enable two-factor authentication at Surveillance Self-Defense, check out twofactorauth.org for an overview of which services offer two-factor authentication, or follow our Security Education Companion one-page handout on the topic.
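For the curious, here is a minimal sketch of the time-based one-time password (TOTP) algorithm (RFC 6238) that most authenticator apps implement, written against Python’s standard library; the shared secret shown is a placeholder example, not one to use.

```python
# Minimal sketch of TOTP (RFC 6238), the algorithm behind most authenticator apps:
# the service and your app share a secret, and both derive a short code from the
# current time, so a stolen password alone isn't enough to log in.
import base64
import hashlib
import hmac
import struct
import time

def totp(base32_secret, digits=6, period=30):
    key = base64.b32decode(base32_secret, casefold=True)
    counter = int(time.time()) // period                 # 30-second time step
    msg = struct.pack(">Q", counter)                     # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Placeholder shared secret, normally delivered as a QR code when you enroll.
print(totp("JBSWY3DPEHPK3PXP"))
```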

Getting Help—For activists, journalists, bloggers, and civil society members around the world, Access Now has a 24/7 Helpline that provides digital security advice and recommendations. If you’re based in the US and are participating in protests, EFF has legal referral services for protesters.

We’ll be putting together more posts in acknowledgment of Pride this month. Join us for our Pride edition of our At Home with EFF livestream on Thursday, June 25th, at 12pm PDT.

We’d like to thank Slammer Musuta of Pumzi Code, Norman “grumpy pants” Shamas, Sarah Aoun at the Open Technology Fund, Sage Cheng at Access Now, Eriol Fox, and Martin Shelton at Freedom of the Press Foundation for their contributions and edits.

