Is It TikTok’s Fault When Children Die Making Dangerous Videos?


Earlier this month, the respective parents of two girls filed suit against social media platform TikTok and its parent company, ByteDance. According to the suit, the girls died while imitating behavior from videos that TikTok knew were dangerous and should have taken down. It asks for a jury trial and unspecified monetary damages.

One of the largest social media outlets on the planet, TikTok features short videos created and submitted by users. As with many platforms, it uses an algorithm to determine what users see. Based on individual users’ demographics and the videos they engage with, the site curates content to display on their “For You Page” (FYP).

Unique among platforms, TikTok features “challenges,” in which users film themselves doing certain tasks or activities and then encourage others to do the same. Typically, participants all tag their videos with a shared hashtag so the entries are easy to find and catalog.

In this case, the parents allege that their children, an 8-year-old girl and a 9-year-old girl, each died while taking part in the “blackout challenge,” in which participants film themselves holding their breath or asphyxiating until they pass out. In fact, in just over 18 months, at least seven children have died after apparently attempting the challenge.

The story is truly tragic. But it’s not clear that TikTok is uniquely responsible, nor that it should be held legally liable.

The parents are being represented in part by attorneys from the Social Media Victims Law Center (SMVLC), which bills itself as “a legal resource for parents of children harmed by social media addiction and abuse.” The lawsuit accuses TikTok of providing inadequate parental controls and of doing too little to prevent the proliferation of dangerous content. It alleges, “TikTok actively tries to conceal the dangerous and addictive nature of its product, lulling users and parents into a false sense of security.”

The parents refer to the platform’s algorithm as “dangerously defective,” “direct[ing users] to harmful content” in a “manipulative and coercive manner.” Specifically, they say the algorithm “directed exceedingly and unacceptably dangerous challenges and videos” to each child’s FYP.

Of course, children imitating dangerous behavior did not originate with TikTok. In the early 2000s, after the premiere of MTV’s Jackass, multiple teenagers died reenacting stunts from the show. Even the activity at issue is not new: In 2008, the Centers for Disease Control and Prevention (CDC) warned that at least 82 children had died in a little over a decade from what it called “the choking game.”

For its part, TikTok claims that the blackout challenge “long predates our platform and has never been a TikTok trend.” Searching the “blackout challenge” hashtag is now effectively blocked; it instead redirects users to a page about the potential risks of online challenges. The platform has also removed challenge videos featuring objectionable content in the past, including some depicting activities that weren’t dangerous.

It could be that TikTok did, in fact, know about dangerous content and intentionally allow it to spread. Or it could be that, as TechDirt‘s Mike Masnick says, “Content moderation at scale is impossible to do well.” TikTok has over one billion monthly active users, and more than half have uploaded their own videos; it is not feasible for the company to screen all of that content. A study released last month on “the lifespans of TikTok challenges” concluded, “TikTok removes challenges reported as dangerous and has increased safety controls. However, considering the huge number of users and challenges created every day on this social platform, as well as the usage of some tricks exploited by the authors of dangerous challenges to bypass controls, the risk that dangerous challenges are accessible is real.”

Even if TikTok were directing dangerous content to people’s FYPs, it’s not clear that the platform could be found legally liable at all. Section 230 of the Communications Decency Act protects online services from legal liability for content posted by their users. While TikTok may host the videos, the law states that it cannot “be treated as the publisher or speaker” of any content provided by a user.

The lawsuit tries to circumvent this restriction by claiming that, by offering images, memes, and licensed music that users can incorporate into their videos, “TikTok becomes a co-publisher of such content.” This is a common misreading of Section 230: In the eyes of the law, there is no distinction between a “publisher” and a “platform.” Besides, supplying users with tools to make videos does not transform the platform into the creator of the content those users produce.

Every child who is injured or killed imitating a viral video is a tragic and senseless loss. Their parents are certainly suffering unimaginable grief. But it’s not clear that TikTok could, or should, be held individually liable.


