Every year, Congress must follow through on an enormous and complicated task: agreeing on how to fund the government for the following year. The wrangling over spending often comes down to the wire, and this year, some Senators are considering shoehorning a controversial and unconstitutional bill, the Kids Online Safety Act (KOSA), into the must-pass legislation. Make no mistake: KOSA is bad enough on its own, but putting KOSA into the “omnibus package” is a terrible idea.
Amendments Aren’t Enough
The bill’s sponsors have made last-minute changes to the bill in an attempt to assuage concerns, but these edits don’t resolve its fundamental problems. We’ve spoken about the harms KOSA will cause at length, and they remain in the current version.
To recap: KOSA’s main provision contains the vague requirement that online services act “in the best interests of a user that the platform knows or should know is a minor,” by taking “reasonable measures” to prevent and mitigate various enumerated harms. These harms include the promotion or exacerbation of mental health disorders such as (to name a few) suicidal behaviors, eating disorders, and substance use disorders; physical violence, online bullying, and harassment of the minor; and sexual exploitation and abuse.
There is no doubt that this content exists on the internet and that it can be harmful. But as we’ve written, there is no way a platform can make case-by-case decisions about which content exacerbates, for example, an eating disorder, compared to content which provides necessary health information and advice about the topic. As a result, services will be forced to overcensor to ensure young people—and possibly, all users, if they aren’t sure which users are minors—don’t encounter any content on these topics at all.
KOSA’s latest text still contains this glaring and unconstitutional flaw at its core. That’s why we continue to urge Congress not to pass it. And Senators should not consider the largest change—the addition of a new “limitations” section—a solution to any of the bill’s problems. The new language reads:
Nothing in subsection (a) shall be construed to require a covered platform to prevent or preclude any minor from deliberately and independently searching for, or specifically requesting, content.
The new “limitation” section is intended to wave away KOSA’s core problem by giving online services an out. But instead, it creates a legal trap. The bill still creates liability for any service that delivers the content to the user, because if the service knows the user is a minor—or, in the language of the bill, should know—it is still on the hook for any content presented that is not in the user’s “best interest.” This new language raises the question: How can a site provide general information that a minor “deliberately and independently” searches for or finds on its services without then acquiring some knowledge that minors are looking at the information? Simply put: It cannot. This language just puts a fig leaf over the problem.
One interpretation of the new “limitation” could be that KOSA creates liability only when platforms deliver the content to minor users who aren’t seeking it, rather than to those who are. This may be a subtle way to go after “algorithmically presented” content, such as videos recommended by YouTube. But that’s not what the law actually says. What does “independently” mean here? Independent of what, or of whom? If a minor finds a site through a search engine, or if a covered platform provides a list of URLs for minors, was that an “independent” search? A law that claims it doesn’t censor content because people can still search for it is fundamentally flawed if it also says that services cannot deliver that content.
In its latest form, this bill still lets Congress—and through the bill’s enforcement mechanism, state Attorneys General—decide what’s appropriate for children to view online. The result will be an internet that is less vibrant, less diverse, and contains less accurate information than the one we have now.
Removal of this content is not an abstract harm. At this moment, hospital websites are removing truthful information about gender-affirming healthcare due to both political pressure and legislation. Libraries are removing books about LGBTQ topics. Given that certain state Attorneys General have already shown animus toward online services and toward children seeking gender-affirming care, this is a terrible time to hand them this additional power. Forcing online services to heavily moderate content that could be construed as contributing to physical violence or a mental health disorder, and leaving that enforcement up to either the Federal Trade Commission or state Attorneys General, will disparately impact the most vulnerable, and will harm children who lack the familial, social, financial, or other means to access health information elsewhere.
First Amendment Concerns
A fundamental principle in First Amendment law is that the government can’t indirectly regulate speech that it can’t directly regulate. Congress can’t pass a law prohibiting children from accessing information about eating disorders, and similarly, Congress can’t impose liability on services that host that speech as a means of limiting its visibility to children. In its latest form, KOSA still creates liability for hosting legal content that the bill defines as one of the enumerated harms (suicide, eating disorders, etc.), and it permits enforcement by the FTC and state Attorneys General against services that don’t take whatever “reasonable measures” (a vague standard) are necessary to prevent and mitigate children’s exposure to that information. This is unconstitutional.
Not only will KOSA endanger the ability of young people to find true and helpful information online, but it will also interfere with the broader public’s First Amendment right to receive information. The steps that services will take to limit this information for minors are likely to limit access for all users, because many services will not be in a position to know whether their users are children or adults.
This is not a simple problem to solve: Many services that deliver content do not know the age of their users, especially those that are not social media platforms (and even some of those, like Reddit, do not ask for personal information such as age upon signup, which benefits user privacy). Assuming that they do know, or that they should know, fundamentally misunderstands how these services operate. KOSA would require services either to learn the age of their users and be able to defend that knowledge in court, or to remove any potentially offending content. Again, no platform can reasonably be expected to make intelligent and sweeping decisions about which content promotes disordered behavior and which provides necessary health information or advice to those suffering from such behaviors. Instead, most platforms will err on the side of caution by removing all of this content entirely.
Additionally, there is a significant benefit to anonymity and privacy online. This is especially true for members of vulnerable minorities, and for anyone looking up sensitive information, such as about reproductive rights or LGBTQ topics. Young people in particular may require anonymity in some of their online activity, which is one reason why the “parental supervision” elements of KOSA are troubling.
Better Options Exist
The latest amendments to KOSA should not give Congress cover to include it in the omnibus spending bill. It is a massive overreach to include—without full discussion and within must-pass legislation—a law that requires web browsers, email applications, and VPN software, as well as platforms like Reddit and Facebook, to censor an enormous amount of truthful content online.
While we understand the desire for Congress to do something impactful to protect children online before the year’s end, KOSA is the wrong choice. Instead of jamming sweeping restrictions for online services into a must-pass spending package, Congress should take the time to focus on creating strict privacy safeguards for everyone—not just minors—by passing legislation that creates a strong, comprehensive privacy floor with robust enforcement tools.
The Electronic Frontier Foundation is the leading nonprofit organization defending civil liberties in the digital world. Founded in 1990, EFF champions user privacy, free expression, and innovation through impact litigation, policy analysis, grassroots activism, and technology development. We work to ensure that rights and freedoms are enhanced and protected as our use of technology grows. Visit https://www.eff.org