For more than a decade, both amateurs and professionals shared their sometimes sweet, sometimes weird, and often graphic sexual activity on Pornhub. Launched in 2007 not long after YouTube and with a similar free-for-all spirit, the site represented a new wave of “adult entertainment” in which anyone with an internet connection could partake and anyone with a digital camera could become a star.
Dubbed “tube sites,” Pornhub and its various peers began to dominate web traffic generally and porn consumption specifically. These sites trod on porn’s established business model, but for savvy sex workers the tube site network could provide a way to break into the business or reach audiences directly, without the porn industry’s usual middlemen. To monetize one’s presence in the early days took some creativity, but tube sites would eventually offer content partnerships that allowed people to get paid directly for their videos. Their competitors, such as cam sites and clip stores, made the process of charging money and getting paid even smoother.
The result? For the first time, people with a truly diverse array of body types, looks, races, ethnicities, sexualities, gender identities, and kinks had direct access to the tools of porn production and distribution. In the past, porn had catered to a much narrower range of tastes, with predictable results. Now audiences could access all sorts of content that defied conventional notions of who and what was deserving of lust. On sites like Pornhub and the microblogging platform Tumblr, outside-the-mainstream content thrived.
And then, one day, it was gone.
In December 2020, without warning, Pornhub removed all videos posted by unverified users—a massive cache of content encompassing anything not posted by formal content partners or members of the platform’s official model program. More than 10 million videos were suspended, and unverified users were banned from uploading or downloading new videos.
It was more than a disruption to the site. The unannounced disappearing of so many videos was “a huge cultural loss,” says Ashley, a transgender sex worker and civil rights activist with a robust presence on social media and in offline organizing. (At Ashley’s request, we’re identifying her by first name only.) Ashley volunteers with the Sex Workers Outreach Project (SWOP) Behind Bars, a group dedicated to helping incarcerated sex workers. She recently helped spearhead a campaign protesting financial discrimination against sex workers and LGBTQ content creators. Unverified videos, Ashley says, are “inclusive, just by definition, of all the queer content that people felt unsafe with being directly affiliated with.”
The Pornhub purge came about two years after Tumblr’s ban on any content depicting sex acts, and preceded a similar announcement in summer 2021 from OnlyFans, a subscription content site popularized by sex workers. OnlyFans would later reverse this edict, but the fate of adult content on the site remains uncertain.
Then, in September 2021, the first user-uploaded porn site—Xtube, founded in 2006 and now owned by the same parent company as Pornhub—shut down entirely.
Demand for online porn hasn’t weakened, at least not according to web traffic numbers. Nor do there seem to be fewer people willing to create and post it; it’s not uncommon to hear sex workers complain about the glut of adult content creators these days.
Nonetheless, it’s a financially precarious, and perhaps even dangerous, time to be in the business of online porn. And one of the biggest reasons is that a constellation of activist groups, rooted in deeply conservative opposition to virtually any depiction of sexuality in the public sphere, has put considerable pressure on the middlemen who keep online porn in business. In some cases, that pressure has led to the creation of onerous new laws; in others, it has been aided by support from powerful figures in business and government. These groups have repeatedly sought to conflate the existence of consensual commercial sex and porn production with the prospect of forced sexual exploitation, often with lurid statistics about exploited minors that don’t stand up to scrutiny.
Although these groups say their aim is merely to rid the web of abuse, it’s clear that their true goal is to eliminate the vast majority of adult sexual content from the web through a combination of legal pressure tactics, lobbying for new laws, and political intimidation. It’s a campaign for a sex-free web. Rather than help vulnerable women, these efforts threaten to make life worse for the very people they claim to want to help—while simultaneously stifling internet expression more broadly.
From ‘Morality’ to ‘Exploitation’
Few organizations have done as much to try to squelch online porn as the group that for most of its life was known as Morality in Media. The group was founded in 1962 to fight countercultural influences, especially those with sexually explicit material. In 1969, for example, it went after underground newspapers for “obscenities” and “push[ing] drug usage as the ‘in’ thing.” In 1971, its target was “titillating ads in the U.S. mails,” along with “smut in media”—including “nudie, homosexual, sado-masochistic and teen-age sex books”—that might be “inciting our nation’s youth to violence, perversion, promiscuity, drug experimentation, hatred and tastelessness.”
By the early ’80s, the group was bemoaning adult bookstores, soap operas, and MTV. “Really and truly, soap operas are destroying the family’s moral base,” its president said in 1984. In the ’90s, it railed against daytime talk shows and sitcoms depicting sex outside marriage. The specific nature of the threat was always shifting, but the core crusade was always about mass media portrayals of sexual activity that didn’t align with traditional values.
In 2015, the group rebranded as the National Center on Sexual Exploitation (NCOSE). Since then, the internet and tech companies have become its primary targets. Search engines, social media, online classified ads, digital marketplaces, and streaming video services have all found themselves under fire, along with online pornography platforms like Pornhub and OnlyFans.
Today, the group tends to trade the language of “decency” and morality for feminist-tinged talk of consent, objectification, violence against women, and sex trafficking. Pornhub “normalizes themes of racism, incest, and violence against women,” NCOSE said in a 2019 press release. HBO profits “from sexual objectification, exploitation, and violence,” it declared in 2016. NCOSE describes its work broadly as “exposing the links between all forms of sexual exploitation such as child sex abuse, prostitution, sex trafficking, and the public health harms of pornography.”
Underneath it all, though, NCOSE is still the same old musty conservative values group aimed at eradicating sexuality in the public sphere. It cloaks that under a mantle of saving the children, and it uses intimidation and legal pressure to get what it wants.
In recent years, the group’s annual “dirty dozen” list has condemned the Sports Illustrated Swimsuit issue for “sending a message that women’s bodies are for public consumption,” Cosmopolitan magazine for “hypersexualized cover models,” Seattle coffee stands for having scantily clad baristas, Amazon Prime Video for showing “simulated sex scenes,” and Netflix for featuring “gratuitous amounts of nudity.”
“When Netflix, a highly influential platform with over 200 million users across the globe, hosts sexually explicit content like ‘Cuties,’ ‘Big Mouth,’ and ‘Sex Education,’ it deserves to be called out for profiting from sexually exploitative content,” says NCOSE CEO Dawn Hawkins. “Sexual exploitation is not entertainment.”
NCOSE is one of a handful of influential groups intent on recasting a wide range of sexual content and activities as “exploitation.” It’s joined by groups such as Exodus Cry, which was born out of an evangelical Christian church in Kansas City and bills itself as a foe of “commercial sexual exploitation”; the Justice Defense Fund, a lobbying and litigation group founded by the anti-porn activist Laila Mickelwait; and Demand Abolition, an anti–sex work group founded by the oil heiress and Clinton-era ambassador Swanee Hunt.
Though they speak the language of feminism, these groups are steeped in the spirit of conservative purity culture—an evangelical ethos popularized in the 1980s and ’90s. Purity culture hinges on abstinence rituals like virginity pledges, chastity rings, and father-daughter “purity balls.” It’s predicated on the notion that sexual activity should be relegated to monogamous and heterosexual married couples, and it preaches strict gender roles, female modesty, and total abstinence from premarital sex. It often rests on the idea that promiscuity destroys not only a woman’s value as a partner but also her emotional stability and self-worth.
Some prominent anti-porn activists spring directly from this world. Exodus Cry founder Benjamin Nolot has distanced himself and his organization from the group’s evangelical roots, but he became known for giving talks like “contending for purity in a pornified world,” in which he defines sexual immorality as “all sexual activity outside of the marriage covenant between one man and one woman.” Others come from a radical feminist background that eschews gender norms and embraces queerness yet sounds strangely like its religious right counterpart when it comes to sex work. In both frameworks, women who participate in porn are ruined. Men who watch porn are damaged. Porn “kills love” and threatens the well-being of American women and families.
A shared goal of these groups is to remake the internet as a sex-free zone by casting a vast swath of nontraditional sexual activity as “sexual exploitation” or “human trafficking,” especially if it involves the transfer of money, even indirectly. “Any content that turns people into public sexual commodities has no place on the Internet or in society,” Hawkins says.
This strategy has had remarkable success, earning an audience and acclaim among reporters, politicians, and prominent feminists unlikely to be so kind to a band of moralistic Bible-thumpers denouncing promiscuity and calling sex outside marriage a sin. The purity culture ethos of shame, abstinence, and fallen women still permeates these groups’ activism. But it’s been repackaged as a bid to protect women and kids from trauma and sexual harm rather than to uphold the sanctity of marriage and biblical womanhood.
A central plank of this strategy is litigation.
In January 2021, NCOSE helped bring a lawsuit accusing Twitter of sex trafficking. The basis for this claim is that the social media site temporarily hosted a link to a video, hosted on a separate site, featuring two teenagers engaged in sex acts. The minors had taken the video themselves and shared it with a third party via Snapchat. In August, a judge ruled against Twitter’s motion to dismiss the case.
In February 2021, NCOSE helped bring a lawsuit against MindGeek, the parent company behind a number of porn sites, including Pornhub. In the suit, which is also ongoing, two Jane Does accuse Pornhub of hosting videos without their consent. And in March, NCOSE helped bring a lawsuit against WebGroup Czech Republic, the company behind one of the world’s most visited porn platforms, XVideos.
In all of these cases, an underlying kernel of harm is alleged, such as a teen being blackmailed into sending a stranger sex videos or women being duped into appearing in online porn. But rather than target the perpetrators of that harm directly, the NCOSE strategy is to go after platforms that—however briefly or unknowingly—hosted evidence of it taking place.
None of these suits would have a chance at success without the Allow States and Victims to Fight Online Sex Trafficking Act of 2018 (FOSTA), a law that NCOSE backed. In addition to making it a federal crime to host content that facilitates prostitution, FOSTA amended the federal statute known as Section 230—which says that individuals and intermediaries online aren’t always legally liable for content, interactions, and transactions by clients or users—to make it easier for private citizens and state attorneys general to sue digital intermediaries.
Digital intermediaries include everything from Facebook and Twitter to Pornhub and XVideos to search engines, Substack, cloud hosting companies, dating apps, video chat platforms, web payment processors such as PayPal and Stripe, and any other website or app that serves as a conduit for content, communication, or trade.
The goal of both FOSTA and the NCOSE lawsuits is to change the Section 230 paradigm when it comes to sex. The strategy involves first recasting sex trafficking. Legally, this is prostitution that involves minors and/or force, fraud, or coercion; in the popular imagination, it necessarily involves violence, abduction, and rape. The crusaders want to make it mean essentially any activity that involves sex work, even between consenting adults, or any sexual activity involving minors, even if there is no commercialization and even if the intermediaries facilitating its exposure have no reasonable way of knowing about it.
At its core is the idea that sex work can never just be work; it’s always exploitation. Hawkins says as much: “That sex buyers must pay to sexually access the bodies of others demonstrates that the sex in prostitution is unwanted by those being paid. Payment, whether in cash or by other things of value, is the leverage used to abrogate the lack of authentic sexual desire of those in the sex trade.”
Additionally, any third party profiting from sex—no matter how indirect or inconsequential—counts as exploitation. That’s the crux of the Twitter lawsuit: NCOSE’s argument is that because Twitter runs ads alongside all content, it profited from the tweet sharing footage of teens engaged in sex acts, and therefore it violated federal law against child sex trafficking.
Under this logic, it’s incredibly risky—reputationally, legally, and financially—for online intermediaries to allow any sort of sexualized business or content. No company wants a reputation for supporting exploitation, sex trafficking, and child abuse. And hosting sex-business transactions risks FOSTA-enabled lawsuits and abandonment by credit card companies and banks.
In other words, these groups have gone after online sex work and pornography by making it difficult, if not impossible, for sexually oriented businesses to process payments and collect money for services rendered—if they can create accounts at all. These tactics threaten the entire porn industry and the livelihoods of thousands of sex workers. Online sex work is, after all, work: If you can’t collect a paycheck or bill your clients, you can’t do your job.
Creating Chokepoints
To that end, activists have been pressuring financial institutions—credit card companies, banks, etc.—not to do business with sex workers, sexually oriented businesses, or any intermediary that won’t discriminate against these groups.
This method was tested with the classified advertising platform Backpage. In 2015, activists used the press and public relations campaigns to pressure credit card companies to stop processing Backpage transactions.
But it wasn’t simply an activist pressure campaign: Cook County, Illinois, Sheriff Tom Dart, who has staked out one of the nation’s most aggressive stances against sex work, threatened action against these companies if they didn’t stop. After Dart’s threats, Mastercard and Visa both quickly ditched Backpage. A federal judge would later rule Dart’s actions unconstitutional, because they violated the First Amendment, but the damage was done.
The Backpage situation proved that popular pressure and the mere threat of sex trafficking lawsuits could work as well as, if not better than, government mandates. It’s a playbook activists are now repeating with companies like Pornhub and OnlyFans.
Private campaigns to change business practices are a vital freedom. And private businesses can “censor” or choose not to associate with whomever they want. But that doesn’t mean these actions are always a social good, or that they’re beyond criticism. More importantly, porn’s enemies aren’t simply speaking out privately. They are also calling for, and in some cases successfully generating, legal and political sanctions.
It’s true that NCOSE is not the Department of Justice (DOJ). An Exodus Cry petition isn’t an executive order. But neither are these groups simply calling on people to boycott Pornhub or delete their Twitter accounts. They’re calling on the DOJ and members of Congress to act against them, and they’re filing lawsuits that threaten serious court-ordered consequences for these companies. These demands for state action have proven influential.
Take FOSTA. NCOSE backed the law and has taken credit for its passage. The group has alternated between appeals to women’s liberation (calling it a “test of the strength of our national resolve to deliver on the promise of #MeToo”) and appeals to saving the children (“today, ordering a child or adult online for sex is as easy as ordering a pizza”). NCOSE is now pushing another measure to weaken Section 230, the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act, which would eliminate those protections for material involving minors.
And sometimes they are clearly backed by state actors with real power, as Sheriff Dart’s campaign against Backpage shows. Dart and Demand Abolition, notably, used to partner up for a series of prostitution stings called the “National Johns Suppression Initiative.”
The threat to tech companies now is legal trouble not for failing to uphold current criminal justice norms but for failing to proactively define sex trafficking, exploitation, and obscenity as broadly as these groups would like them to be defined—and as these groups may eventually succeed at convincing lawmakers and courts to define them.
“What it comes down to really,” tweeted Gustavo Turner, an editor at the adult industry publication XBIZ, “is that there’s a well-funded, well-organized group of people working 24/7 to align the state’s definition of ‘crime’ with their own notion of ‘sin.'”
After FOSTA passed with the promise of taking down online classified ad venues, activists started focusing on other user-generated content platforms. Mickelwait’s “TraffickingHub” campaign took aim specifically at Pornhub. And as with the crusades against classified ads, then–New York Times columnist Nicholas Kristof amplified the campaign. (Kristof has since left the paper to run for governor of Oregon, but he was ruled ineligible because of the state’s residency requirements.)
In a highly sensationalistic December 2020 column, Kristof accused Pornhub of being complicit in rape and child abuse. To make this argument, Kristof relied heavily on Pornhub keyword searches and faulty assumptions. Ambiguous words and phrases like teens and young are taken to mean minors, even though these words often refer to young adults or are used to tag role-playing videos featuring adults who are actually much older. Scenes featuring “nonconsensual” encounters—another popular role-playing category—are likewise taken as indications of literal rape.
Kristof fleshes out these keyword insinuations with anecdotes from young women like Serena K. Fleites, who as a young teen took naked videos of herself and shared them with a boy; the footage eventually wound up on Pornhub. She’s now at the center of a class-action lawsuit against Pornhub’s parent company, MindGeek. Fleites’ is one of several tales Kristof relays in which videos were removed by Pornhub when it was notified, only to be reposted on Pornhub or other websites. Their stories showcase the perils of modern digital adolescence, when intimate images shared with other teens or with exploitative adults can wind up living forever and recirculating endlessly online. What they don’t suggest is a problem unique to Pornhub, since the videos often circulated around the internet. Nor do they reveal a company indifferent to underage or nonconsensual pornography.
“Any assertion that we allow CSAM”—that stands for child sexual abuse material, the new officialese term for sexualized content featuring anyone under age 18—”is irresponsible and flagrantly untrue,” protested Pornhub in a statement. It went on to point out that an Internet Watch Foundation analysis has found only “118 incidents of CSAM on Pornhub in a three year period.” This is out of millions of videos—around 13.5 million before the purge, according to Vice.
Data from the National Center for Missing and Exploited Children shows MindGeek reported 13,229 instances of potential underage content to the group’s tipline in 2020—far more than many tech companies, but far fewer than such mainstream platforms as Google (which submitted 546,704), Imgur (31,571), Facebook (20,307,216), Microsoft (96,776), Snapchat (144,095), TikTok (22,692), and Twitter (65,062).
None of those numbers offer definitive proof of anything, since they’re a function of how much a service is used and by how many people as well as the company’s proactiveness and internal definitions. But to the extent that online exploitation is a problem, they suggest that porn sites aren’t the chief vectors. Indeed, Kristof’s op-ed even admitted that these mainstream sites may be trafficking in far higher volumes of illegal imagery. Nonetheless, he closed his column by calling on credit card companies to stop doing business with Pornhub.
Kristof’s cry was echoed by an influential hedge fund manager, Bill Ackman, who reportedly convinced Mastercard’s then-CEO Ajay Banga to comply. (Ackman’s crusade has since expanded; he has recently accused Google, Bing, Microsoft, Yahoo, and Twitter of “facilitat[ing] and profit[ing] from the distribution of child rape porn” because they allow links to or search results from porn sites.) Before long, Mastercard, Visa, and Discover suspended business with Pornhub and its parent company, MindGeek. (Visa later resumed business with some MindGeek properties.)
Last summer, Visa and Colbeck Capital were added to a lawsuit filed against MindGeek. “It is believed to be the first Racketeer Influenced and Corrupt Organizations Act (RICO) case that attempts to hold financial institutions accountable for the role they may play in sexual exploitation by processing payments,” the Financial Post reported. If successful, it could pave the way for taking credit card companies to court any time they unwittingly aid in harm.
Many cheer these developments when they affect a business or cause they don’t like. But once the floodgates open, the same tactics create new avenues for legal pressure against any industry, company, or individual who can plausibly be portrayed as dangerous, including political causes and movements.
It’s part of a trend of using “banks as a proxy for state censorship,” Porn Panic! author Jerry Barnett suggested in a September Quillette article. And this trend coincides with other disturbing developments, including “increasingly muscular attempts by democratic governments to censor the internet…the successful linkage of a largely baseless ‘sex trafficking’ narrative with sex work and pornography; and a zero tolerance approach to content platforms that holds them responsible for even a single illegal item of content.”
“Pushing for more aggressive content moderation, especially from infrastructure-like entities like payment processors, web hosts, [content delivery networks], etc, is a terrible idea that will always backfire against marginalized people and social movements,” tweeted Evan Greer, director of the digital rights group Fight for the Future. And “that ship has maybe sailed. There is practically an entire industry now around pushing narratives like ‘Why is XYZ web service hosting ABC terrible thing? This is an outrage!’ (and well-intentioned but misguided journalists happy to uncritically amplify).”
Mastercard Speeds the Erasure
Not long after Kristof’s article came out, Pornhub announced new policies, including the takedown of millions of videos posted by unverified users. Some treated this as a win against exploitation, while others accused Pornhub of simply trying to erase evidence of its wrongdoing. But Pornhub users and creators may see it differently.
Unverified content doesn’t translate to illegal or harmful content. Anyone posting in the early days, anyone wishing to remain anonymous, amateurs with no wish to monetize their videos—all were unverified.
“Unverified on Pornhub just meant that they didn’t want to give their ID to MindGeek,” says Ashley, the sex worker activist. These videos were removed “as kind of a sacrificial altar in the name of keeping payments.” But it wasn’t enough for the credit card companies and Pornhub “got defunded anyway.”
Meanwhile, a huge archive of diverse content was just gone. “Most of the retro movies were washed away,” lamented Steven Underwood at LGBT news site NewNowNext recently. “We lost many scenes, including content starring models who have become synonymous with queer dalliance and exploration.”
When Tumblr ditched sexual content in 2018, people realized that a lot of artistic and archival material was lost, says Ashley. “It’s only stigma against porn—the word porn, the idea of porn being central to a site—that prevents people from realizing that a similar loss of culture just happened on Pornhub.”
Last year, Ashley was one of the organizers of a day of sex worker action dubbed Acceptance Matters, which included protests, online testimonials, and a petition that got more than 2,000 signatures. While targeted at discrimination from the banking and financial services industry in general, the campaign took particular aim at Mastercard, which uses the slogan “Acceptance Matters” in its LGBTQ marketing.
In April 2021, Mastercard announced new rules for all adult businesses and content. The rules, which took effect October 15, state that “banks that connect merchants to our network will need to certify that the seller of adult content has effective controls in place to monitor, block and, where necessary, take down all illegal content.” Putting banks in charge of gathering information on and evaluating such policies is no small task, and many will likely decide that doing business with adult content businesses isn’t worth it.
Among the controls Mastercard requires: adult businesses must review all content prior to publication, a costly and time-consuming mandate that goes far beyond current practice for mainstream social media and user-generated content platforms. In addition, they must have “documented age and identity verification for all people depicted and those uploading the content,” a rule that goes beyond what federal law requires: porn creators must keep such records, but the web platforms that host them need not.
Mastercard’s policies “will result in a major chilling effect and destruction of many ways of working for sex workers and other impacted parties,” including all queer content creators, argues the Acceptance Matters website. In addition, “all of society suffers from restrictions on consensual sexuality and speech, increases in surveillance, and misdirection of resources that should help the most vulnerable.”
With the Acceptance Matters campaign, “we’re asking Mastercard to live up to their publicly stated goals and promises to [the LGBTQ] community,” Ashley tells me. In LGBTQ outreach efforts, Mastercard is “trying to get us to spend through them, but they’re not doing anything to make sure the card is accepted at our businesses,” she points out. “Like, it’s not that we need rainbow branded cards. It’s that we need basic access to the same basic tools everyone else has, and an end to policies that discriminate against us by targeting a job that we’re more likely to do than anyone else.”
“They’re destabilizing the entire community—even people who are not sex workers—because when your community is defined by sexual orientation, it’s seen as sexual content,” Ashley says. Rules that may make sense for professional porn producers and performers, such as mandatory IDs, “would really suck for fine artists and historians and educators and just average everyday people who deserve a right to be able to post nudity to other adults without being tracked by the state.”
OnlyFans Under Fire
Many blamed Mastercard’s new policy for a July 2021 announcement from OnlyFans that it would stop allowing sexually explicit content. The announcement came as a huge blow to adult content creators. It’s “like Taco Bell deciding not to sell tacos anymore,” commented sex worker and content creator Kimmy Kalani in an August 27 video about the announcement. “We helped build that platform, and they’re just going to kick us to the curb.”
But Mastercard’s new policy had no bearing on the decision, nor was it investor-driven, according to OnlyFans founder and CEO Tim Stokely.
“The change in policy, we had no choice—the short answer is banks,” Stokely told the Financial Times in August. Institutions such as the Bank of New York Mellon Corporation and the U.K.’s Metro Bank would “cite reputational risk and refuse our business,” said Stokely. “JPMorgan Chase is particularly aggressive in closing accounts of sex workers or…any business that supports sex workers.”
OnlyFans reversed course about the policy on August 25, stating that it had “secured assurances necessary to support our diverse creator community” and would “continue to provide a home for all creators.” But the situation highlights how precarious things can be for platforms that want to allow adult creators—and for the creators who rely on them for income.
It’s not just traditional banks and credit card companies aggressively policing adult business. Many online payment processors, such as Square, PayPal, and Google Pay, explicitly reject transactions for adult-oriented businesses and performers, or have been known to close sex worker accounts without warning.
The False Promise of Crypto
When OnlyFans first announced it was banning adult content, Edward Snowden tweeted, “Bitcoin fixes this.” This isn’t a rare notion. For several years, various folks have suggested that cryptocurrency can solve sex workers’ issues with banks and credit card companies. The idea really picked up in 2015, when Backpage, backed into a corner by Sheriff Dart’s pressure on credit card companies, began accepting bitcoin, litecoin, and dogecoin for paid ads. Suddenly, sex worker guides to bitcoin started popping up everywhere. Headlines have declared that “sex work is moving to blockchain payments” and “sex workers are finding freedom in cryptocurrency.” Filmmaker and performer Whitney Moore tweeted last year that “Bitcoin will be the answer when Venmo, PayPal and the like continue to shoot themselves in the foot by cracking down on [sex worker] payments.”
But while bitcoin and other cryptocurrencies might help mitigate issues with traditional banking, they’re far from a panacea.
“Over the last 4 years I have tried in vain to get my customers to pay me in crypto and let me tell you, it’s like pulling teeth,” says adult performer and content creator Allie Awesome. “Tons of sex workers are able to accept crypto, and we would love to, but that doesn’t mean our customers will adopt it.”
Besides, sex workers still need a way to convert crypto payments to cash. “My landlord does not accept bitcoin. The grocery store does not accept bitcoin. We still rely on exchanges and banks,” says Awesome. And “not all [cryptocurrency] exchanges are sex worker friendly….You also need to link your bank to an exchange in order to cash out, and banks aren’t always sex worker friendly either.”
For instance, Coinbase explicitly prohibits businesses engaged in “adult content” from using its services. On top of all that, the rules around cryptocurrency are constantly changing, making its use “somewhat of a gray area legally,” notes Awesome. “It seems like every week there is a new law being passed or the [Securities and Exchange Commission] launches a new investigation.”
And with Democrats pressing to treat cryptocurrency brokers more like traditional financial players, exchanges and other platforms that deal in cryptocurrency may wind up pressured to exclude sex workers, too.
A War on Intermediaries—and Sex Workers
What all of these tactics share is a focus on intermediaries. Payment processors. Social platforms. Even hotels. One NCOSE-backed lawsuit accuses Wyndham Hotels of sex trafficking for failing to put a stop to prostitution involving a 16-year-old that was taking place in one of its rooms; NCOSE alleges that hotel staff should’ve been suspicious of things like “large quantities of used condoms left in the room” and “excessive requests for sheets and room cleaning services.” Another suit is against the state of Nevada, where some counties allow legal (and highly regulated) brothels.
These suits give you a sense of the NCOSE mission’s scope. The group might claim its only goal is to stop extreme exploitation, not consensual encounters. But this sounds awfully close to wanting constant surveillance of people having sex outside traditional bounds.
Activists have found that they don’t need to directly ban pornography, LGBTQ content creators, sex workers, etc. They just need to portray the commingling of sex and money as “risky” and increase the threat of legal and criminal justice penalties for ignoring those risks.
Shutting down websites that largely traffic in legal and expressive content—and are keen to intervene when this isn’t the case—can raise the profile of a group like NCOSE, which fund-raises off the idea that it’s fighting “human trafficking” rather than images of consensual nudity. But threatening their livelihoods doesn’t always prompt people to quit porn. Sometimes it just makes their working conditions more dangerous. And shutting down centralized platforms doesn’t stop predators from posting illegal or exploitative content. But it does make that content and the platforms hosting it harder for investigators to reach.
“Companies like Mastercard are now accomplices in the disenfranchisement of millions of sex workers, complicit in pushing workers away from independence into potentially more dangerous and exploitative conditions,” says the Free Speech Coalition, an adult industry trade group. The grim irony is that NCOSE may be facilitating real exploitation in the name of stamping it out.
Many sex workers, it’s fair to say, don’t feel like any of these moves actually protect them. “Taking away our platforms does not help sex workers or trafficking survivors,” says Awesome. Sex work advocacy groups, she says, have offered real help in actual instances of trafficking. “Sex workers are the experts on our lives and experiences,” she says. “The [anti-porn activists] aren’t. They rely on fabrications, half-truths, and sensationalized narratives.”
“You know who actually cares the most about trafficking?” Awesome asks. “Sex workers.”