In a win for transparency, a state court judge ordered the California Department of Corrections and Rehabilitation (CDCR) to disclose records regarding the race and ethnicity of parole candidates. This is also a win for innovation, because the plaintiffs will use this data to build new technology in service of criminal justice reform and racial justice.
In Voss v. CDCR, EFF represented a team of researchers (known as Project Recon) from Stanford University and the University of Oregon who are attempting to study California parole suitability determinations using machine-learning models. This involves using automation to review over 50,000 parole hearing transcripts and identify the factors that influence parole determinations. Project Recon’s ultimate goal is to develop an AI tool that can flag parole denials that may have been influenced by improper factors as potential candidates for reconsideration. To do this, Project Recon’s models must account for many variables, including the race and ethnicity of the individuals who appeared before the parole board.
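To give a sense of how such a pipeline might work, here is a minimal sketch in Python, assuming a simple interpretable model (TF-IDF features plus logistic regression). The toy transcripts, labels, and phrasing are invented placeholders; this is not Project Recon’s actual data or methodology.

```python
# Hypothetical sketch of the kind of analysis described above: fit an
# interpretable model over parole hearing transcripts and inspect which
# factors correlate with denial. All data below is invented.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder corpus: each entry stands in for one hearing transcript,
# labeled 1 if parole was denied and 0 if granted.
transcripts = [
    "the panel noted completion of anger management programming",
    "the commissioner cited the gravity of the commitment offense",
    "the panel found credible remorse and strong parole plans",
    "the commissioner cited lack of insight into the offense",
]
denied = [0, 1, 0, 1]

# Bag-of-words features keep the model interpretable: each learned
# coefficient maps back to a phrase in the transcripts.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),
    LogisticRegression(),
)
model.fit(transcripts, denied)

# Rank terms by their weight toward denial; reviewing this list is one
# way to surface factors (proper or improper) driving outcomes.
vec = model.named_steps["tfidfvectorizer"]
clf = model.named_steps["logisticregression"]
terms = vec.get_feature_names_out()
weights = clf.coef_[0]
for term, w in sorted(zip(terms, weights), key=lambda p: -abs(p[1]))[:10]:
    print(f"{term:35s} {w:+.3f}")
```

An interpretable model like this is one plausible design choice: every coefficient maps back to language in the transcripts, which matters when the goal is to explain why a denial looks anomalous rather than simply to predict outcomes.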
Project Recon is a promising example of how AI might be used to identify and correct racial bias in our criminal justice system.
In September 2018, Project Recon requested race and ethnicity information for parole candidates from CDCR. CDCR denied the request, claiming that the information was not subject to the California Public Records Act (CPRA). Instead, CDCR shuttled the researchers through its discretionary research review process, where they remained in limbo for nearly a year. Ultimately, the head of the parole board declined to support the team’s request because one of its members had previously published research critical of California’s parole process.
In June 2020, EFF filed a lawsuit on behalf of Project Recon alleging that CDCR violated the CPRA and the First Amendment. Soon after, our case was consolidated with a similar case, Brodheim v. CDCR. We moved for a writ of mandate ordering CDCR to disclose the race data.
In its opposition, CDCR claimed that it was protecting the privacy of incarcerated people and that race data constituted “criminal offender record information” exempt from disclosure. EFF countered that the public interest in disclosure was high, especially given the national conversation about racial disparities in the criminal justice system, and thus was not outweighed by any public interest in nondisclosure. EFF also argued that race data cannot constitute “criminal offender record information” because race is demographic information that has nothing to do with someone’s criminal record.
The court agreed. It reasoned that the public has a strong interest in disclosure of the race and ethnicity data of parole candidates:
[T]his case unquestionably involves a weighty public interest in disclosure, i.e., to shed light on whether the parole process is infected by racial or ethnic bias. The importance of that public interest is vividly highlighted by the current national focus on the role of race in the criminal justice system and in American society generally . . . . Disclosure insures that government activity is open to the sharp eye of public scrutiny.
Accordingly, the court ordered CDCR to produce the requested records. Last week, CDCR declined to appeal the court’s decision and produced the records.
Apart from being a win for transparency and open government, this case is also important for racial justice. As we explained in our briefing, CDCR has a history of racial bias, which the U.S. Supreme Court and California appellate courts alike have recognized. That makes it all the more important for information about potential racial disparities in parole determinations to be open for the public to analyze and debate, as sketched below.
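To make that concrete, here is a hypothetical sketch of the simplest analysis the disclosed records enable: comparing denial rates across racial and ethnic groups. The column names and toy records are invented for illustration; the real data’s layout, and any rigorous methodology, would differ.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Placeholder records standing in for the disclosed data: one row per
# hearing, with the candidate's reported race/ethnicity and the
# outcome. Column names and values are invented for illustration.
hearings = pd.DataFrame({
    "race_ethnicity": ["Black", "Black", "White", "White",
                       "Hispanic", "Hispanic"],
    "outcome": ["denied", "denied", "granted", "denied",
                "granted", "denied"],
})

# Cross-tabulate outcomes by group and ask whether the observed
# differences in denial rates could plausibly be chance. (A real
# analysis would need far more data and careful controls.)
table = pd.crosstab(hearings["race_ethnicity"], hearings["outcome"])
chi2, p_value, dof, _ = chi2_contingency(table)

print(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```

A real study would also need to control for the many other variables, such as offense type and programming, that Project Recon’s models account for.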
Moreover, this case is a win for beneficial AI innovation. In a world where AI is often put to harmful and biased uses, Project Recon is an example of AI for good. Rather than substitute for human decision-making, the AI tool Project Recon is attempting to build would shed light on human decision-making by reviewing past decisions and identifying where bias may have played a role. This innovative use of technology to identify systemic biases, including racial disparities, is the kind of AI use we should support and encourage.
The Electronic Frontier Foundation is the leading nonprofit organization defending civil liberties in the digital world. Founded in 1990, EFF champions user privacy, free expression, and innovation through impact litigation, policy analysis, grassroots activism, and technology development. We work to ensure that rights and freedoms are enhanced and protected as our use of technology grows. Visit https://www.eff.org