On May 18, two US senators introduced the Digital Platform Commission Act of 2023, a bill that would empower a new federal agency to set up a council regulating AI in the context of social platforms.
More precisely, the new body – the Federal Digital Platform Commission – would “rule” on what are termed enforceable “behavioral codes,” and among those staffing it would be “disinformation experts.”
The move by the two Democratic senators – Michael Bennet and Peter Welch – appears coordinated with congressional testimony delivered by OpenAI CEO Sam Altman: the bill was presented shortly after he spoke, and it backs his call to form a new federal agency of this kind.
Altman had more thoughts on how all this should work – the new agency, according to him, might be given the power to restrict AI development via licenses or credentialing.
The speed with which the two senators picked up on this may owe to the fact that Bennet “only” had to go back and update a bill he had already introduced in 2022. This time around, the proposed legislation has been changed in a number of ways, most notably by redefining what a digital platform is.
The bill wants this definition to also cover companies that provide content “primarily” generated by algorithmic processes. It does this by proposing that the future Commission be given authority over how personal information is used in decision-making or content generation, which is thought to refer specifically to tech like ChatGPT.
A statement issued by Bennet justified the push for such regulation by accusing “technology” of corrupting democracy and harming children, all while operating without proper oversight.
And he doesn’t seem to have much faith in the institution he is a part of, the US Congress. Technology, according to him, is advancing and changing in ways, and at a pace, that Congress is unable to keep up with.
He then accuses Congress of essentially reacting only once there is a problem, instead of coming up with focused, “narrow” solutions beforehand.
Nor does he think the Federal Trade Commission or the Department of Justice can do the job, because they lack both the expert staff and the resources to provide “robust and sustained” regulation of the digital platforms sector.
And so, he thinks, a separate federal agency is now needed. According to Bennet, the Commission’s job would be to regulate companies in the interest of consumers and competition, but also to “defend the public interest.”
If passed, the Digital Platform Commission Act would introduce several key provisions into US legislation regulating the field.
The federal commission would have five members, who would be able to hold hearings, conduct investigations, assess fines, and establish rules through public rule-making.
One of the targets quoted in the press release is protecting consumers from “addicting design features or harmful algorithmic processes.”
And in Bennet and Welch’s proposal, all animals are equal, but some, as it were, more so. The Commission would be allowed to designate certain digital platforms as systemically important and subject them to extra oversight and regulation, such as audits and “explainability” requirements for their algorithms.
As part of the Commission, the bill proposes establishing a Code Council whose job would be to come up with voluntary or enforceable behavioral codes, technical standards, or other policies – such as transparency and accountability for algorithmic processes.
It is this Council that would have the “disinformation experts” in its ranks.
The Council would have 18 members in all, representing digital platforms or their associations (three of whom must come from platforms designated as systemically important), as well as non-profits, academics, and experts whose focus is on technology policy, law, consumer protection, privacy, competition, and disinformation.
The hodgepodge of sometimes conflicting concepts and issues in the description of the Council also appears in the bill’s introduction, which attempts to justify the need to establish the Commission.
Some of the problems it is meant to solve: undercutting small businesses, helping destroy “trusted” local journalism, enabling addiction, disseminating disinformation and hate speech, undermining privacy and monetizing personal data, radicalizing individuals, and perpetuating racism.