“Extremist content is flourishing on TikTok,” read a Politico headline today. The USA Today headline was even more breathless, calling the popular video app a “‘hatescape’ for racism and white supremacy.” Both articles concerned a new report from researchers at the U.K.’s Institute for Strategic Dialogue (ISD) think tank.
A bulleted list atop the USA Today piece trumpets a dramatic finding: “Of the 1,030 TikTok videos researchers analyzed, nearly a third amplified white supremacy.” These videos “included support for genocide conspiracy theories that claim white people’s existence is under threat and music from white power bands,” the article goes on to state. “Three of the 10 most popular videos, viewed a combined 3.5 million times, were clips originally produced by Paul Miller, an extremist known as ‘Gypsy Crusader’ who spreads racist and antisemitic rhetoric on social media.”
“White supremacist videos were by far the largest category of content the study uncovered,” USA Today adds.
All of this gives the impression that researchers simply opened TikTok and were bombarded with an overwhelming amount of white supremacist content. But click through to the ISD report and you'll notice an important and overlooked fact: The researchers were analyzing only videos that had been singled out precisely because they were likely to contain extremist content.
That is: the 1,030 TikTok videos studied were not some random sample. They were videos purposefully chosen because they were associated with at least one of “157 keywords associated with extremist individuals, groups, ideologies and related incidents or events.”
“ISD examined TikTok accounts identified by these keyword searches to assess the presence of videos, comments or profiles that featured support for extremist individuals, groups or ideologies. Through this method we identified 177 TikTok accounts,” the report states. “ISD found that such accounts typically follow or are followed by other accounts that share their ideological interests. ISD therefore used a snowball methodology to expand the sample of accounts featuring relevant hateful and extremist content, yielding 1,030 videos from 491 TikTok accounts during our data collection period (4-30 June 2021).”
When USA Today refers to “nearly a third” of TikTok videos that “amplified white supremacy,” it’s not one-third of a representative selection of 1,000-plus TikTok videos but one-third of a cache of videos specifically chosen because they contained hateful content.
“This was no random sample at all. Rather, the supposed sample was *designed* to find extremist stuff,” pointed out Mark Pitcavage, a senior research fellow with the Anti-Defamation League’s Center on Extremism, on Twitter. “So, in other words, the authors of this report created a methodology designed to find extremist content and then revealed that the results of doing so produced–you guessed it–extremist content.”
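The report's exact collection pipeline isn't published beyond the description quoted above, but the statistical point Pitcavage is making is easy to demonstrate. Below is a minimal simulation sketch; the platform size, base rate, follower graph, and every other number are invented for illustration, and only the keyword-seeding-then-snowball structure is taken from the report's description. Even when extremist accounts make up a tiny fraction of a platform, a sample built by searching for them and then expanding through their follow networks will be saturated with them.

```python
import random

random.seed(0)

# Hypothetical toy platform: 20,000 accounts, 0.5 percent of which post extremist content.
N = 20_000
EXTREMIST_RATE = 0.005
accounts = [{"id": i, "extremist": random.random() < EXTREMIST_RATE} for i in range(N)]
extremist_pool = [a for a in accounts if a["extremist"]]
mainstream_pool = [a for a in accounts if not a["extremist"]]

# Step 1: keyword seeding. Only accounts using extremist-associated keywords turn up
# in the search, so the seed set is drawn almost entirely from the extremist pool.
seeds = [a for a in extremist_pool if random.random() < 0.3]

def followed_accounts(account, k=5, homophily=0.8):
    """Assume accounts mostly follow ideologically similar accounts (80 percent here)."""
    like_minded = extremist_pool if account["extremist"] else mainstream_pool
    other = mainstream_pool if account["extremist"] else extremist_pool
    return [random.choice(like_minded if random.random() < homophily else other)
            for _ in range(k)]

# Step 2: snowball expansion through the follow graph; the two-step structure comes
# from the report's description, but every parameter here is made up.
sample = {a["id"]: a for a in seeds}
frontier = list(seeds)
for _ in range(2):  # two rounds of snowballing
    next_frontier = []
    for acct in frontier:
        for neighbor in followed_accounts(acct):
            if neighbor["id"] not in sample:
                sample[neighbor["id"]] = neighbor
                next_frontier.append(neighbor)
    frontier = next_frontier

share_in_sample = sum(a["extremist"] for a in sample.values()) / len(sample)
share_platform_wide = len(extremist_pool) / N

print(f"Extremist share in snowball sample: {share_in_sample:.0%}")
print(f"Extremist share platform-wide:      {share_platform_wide:.1%}")
```

The absolute numbers in that sketch mean nothing; the point is that the share of extremist content inside a sample built this way reflects the sampling procedure, not the platform.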
The report tells us nothing about the prevalence of extremist and/or white supremacist content in the overall TikTok landscape, nor how frequently the average person using the app might see such content. This is important context both the USA Today and Politico pieces overlook in their rush to paint TikTok as a “hatescape.”