As the use of risk assessment to make pre-trial decisions like bail increases across the country, concerns over the tools used and their potential to hinder judicial discretion and perpetuate racial biases have also grown among criminal justice researchers, leaders and organizations.
Jeffrey Clayton, executive director of the American Bail Coalition, points out that computer algorithms have been used during various steps of the trial process, such as sentencing and probation, for decades, but they largely flew under the radar.
“Now that we think we could accumulate more data and we’re in a computer era, people think they are more predictive than maybe they did a few years ago,” Clayton said.
One pre-trial risk assessment tool that is gaining momentum is the Public Safety Assessment, launched by the Laura and John Arnold Foundation in 2013 to aid courts in making release or detention decisions. According to the foundation, the PSA is currently used in about 40 cities, counties and states, and more than 600 jurisdictions have expressed an interest in implementing the tool.
The PSA examines nine factors – age at current arrest, current violent offense, pending charge at the time of the offense, prior misdemeanor conviction, prior felony conviction, prior violent conviction, prior failure to appear in the past two years, prior failure to appear older than two years and prior sentence to incarceration – to calculate two risk scores. One predicts the risk of a defendant failing to appear for future court appearances and the other predicts the risk of the defendant committing a new crime if released before trial.
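To make the structure of such a tool concrete, the nine factors and two scores described above can be sketched in code. This is an illustrative sketch only: the factor names come from the paragraph above, but the weights, thresholds and score ranges here are invented for the example — the foundation defines the PSA's actual weighting, and nothing below should be read as its real scoring.

```python
# Illustrative sketch of a PSA-style assessment. Factor names follow the
# article; all weights and cutoffs are hypothetical, not the real PSA's.
from dataclasses import dataclass

@dataclass
class PSAFactors:
    age_at_current_arrest: int
    current_violent_offense: bool
    pending_charge_at_offense: bool
    prior_misdemeanor_conviction: bool
    prior_felony_conviction: bool
    prior_violent_conviction: bool
    fta_past_two_years: int          # prior failures to appear, last two years
    fta_older_than_two_years: bool
    prior_sentence_to_incarceration: bool

def failure_to_appear_score(f: PSAFactors) -> int:
    """Hypothetical score for risk of failing to appear in court."""
    score = 0
    if f.pending_charge_at_offense:
        score += 1
    score += min(f.fta_past_two_years, 2) * 2   # recent FTAs weigh more
    if f.fta_older_than_two_years:
        score += 1
    if f.prior_felony_conviction or f.prior_misdemeanor_conviction:
        score += 1
    return score

def new_criminal_activity_score(f: PSAFactors) -> int:
    """Hypothetical score for risk of a new offense if released pre-trial."""
    score = 0
    if f.age_at_current_arrest <= 22:           # youth treated as a risk factor
        score += 2
    if f.current_violent_offense:
        score += 1
    if f.pending_charge_at_offense:
        score += 1
    if f.prior_misdemeanor_conviction:
        score += 1
    if f.prior_felony_conviction:
        score += 1
    if f.prior_violent_conviction:
        score += 2
    if f.prior_sentence_to_incarceration:
        score += 1
    return score
```

Note that every input is drawn from arrest and conviction records — the point critics raise later in this article, since those records can carry the bias of the policing that produced them.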
A spokesperson for the Laura and John Arnold Foundation was unable to provide comments for this article.
Clayton contends that tools like the PSA have been posited as a replacement for bail, and in states like New Mexico and New Jersey, that’s what has happened – their systems are built on recommendations from algorithms.
Earlier this year, New Mexico Gov. Susana Martinez voiced concerns in an online video over her state’s implementation of the PSA and urged neighbor-state Utah to consider the “devastating results” before using the tool.
“The whole theory is that we can figure out who is dangerous and who’s not,” Clayton said. “We can keep the dangerous ones in, and let the non-dangerous ones out.”
“And they’re not challenged,” he added. “I just sat in the first appearance center in Albuquerque a couple of months ago, and the judge followed the recommendation of all felonies charged except for one case. They are ruling the day, in my view, and intruding on judicial discretion.”
Eric Siddall, vice president of the Association of Deputy District Attorneys in Los Angeles, agreed that while tools like the PSA should be viewed as informative, many in his organization fear that judges will use them as their only source of information.
“We really, more than anything else, have concerns that it is used as a crutch for judges rather than them utilizing their individual judgment and experience,” Siddall said.
Christopher Griffin, a visiting legal professor and research scholar at the University of Arizona, oversees PSA evaluation studies as part of his role with The Access to Justice Lab at Harvard Law School. In the past year-and-a-half, he has visited several counties in Wisconsin, Iowa and Utah to evaluate the PSA.
In his experience, he says, the PSA has been seamlessly implemented in a number of study sites. When the tool is used to its full extent, the judge has more information at his or her disposal, as do the prosecutors and defense attorneys when making their arguments.
“It is now heavily embedded in their criminal procedure, it is something they expect to hear about, how some of the risk factors are scored for the arrestee,” Griffin said. “It’s just part of the way these sites are now doing business.”
Bernard Harcourt, a professor of law and political science at Columbia University, has done extensive research on the history and operation of risk assessment tools in the criminal justice system. He says the largest problem he has seen is that they, without fail, have a racial skew.
Originally, Harcourt explains, race itself was an explicit risk factor in these tools, up until the 1970s. Even after using race was deemed unacceptable, he says, other risk factors, such as prior arrests and convictions, continued to correlate heavily with race.
“Given that our policing in this country tends to use racial profiling, those indicators tend to skew on race,” he said. “So it’s not surprising that a lot of studies that have looked at these questions have found what’s called a machine bias.”
In July, more than 100 civil rights and community-based organizations released a shared statement highlighting concerns with the adoption of pre-trial risk assessment tools. The groups, which include the National Association for the Advancement of Colored People Legal Defense and Educational Fund and American Civil Liberties Union, contend that risk assessment tools are not a “panacea” to reforming unjust bail systems, and instead, could increase racial disparities.
“Pretrial detention reform that addresses the injustice of people being jailed because of their poverty is urgently needed, but substituting risk assessment instruments for money bail is not the answer,” Monique Dixon, deputy director of policy and senior counsel for the NAACP Legal Defense and Educational Fund, said in the statement. “Biased policing practices in communities of color result in racial disparities in the data risk assessment tools rely on, making Black and Brown people look riskier than White people.”
Harcourt adds that another problem with risk assessment tools is that they “don’t really predict what we think they predict.” Rather than predicting who is committing a crime, he explains, they are predicting who will be arrested or convicted of criminal activity.
“Tragically, most people in society and even most people in the criminal justice system don’t understand the difference,” he said.
In addition to judicial discretion and racial bias issues, there are concerns over the validation and regulation of pre-trial risk assessment tools.
Siddall refers to a July 2017 case in San Francisco involving 19-year-old Lamonte Mims, who was arrested for gun possession and parole violations but released when the judge followed the recommendation of his PSA score. Mims, who was already on felony probation for an earlier crime, was allegedly involved in the murder of a 71-year-old man five days later.
When asked how something like that could happen, Siddall says he isn’t sure, because “the other problem is that there is not a lot of transparency.”
Clayton contends that best practices call for revalidation within 18-24 months, but in his experience, once a tool is implemented, it is not evaluated to make sure it is working or producing accurate results.
Or, he adds, even if tools are evaluated, results have not often been shared publicly.
Griffin explains the difficulty of providing results from his own PSA studies – even a year and a half into the experiment, he is still enrolling cases in the dataset, which can only be analyzed once all of those cases come to an end.
“We won’t have a final data report until 2022,” Griffin said. “When it comes to any conclusions about the effectiveness of the tool, we have so little data at our disposal and don’t make any statements about that. It is far too premature.”
Harcourt expects the use of risk assessment tools to continue to increase in the next few years, largely because they make it easier to redirect the responsibility for making tough decisions.
“No one wants to be responsible for the errors, particularly, parole boards or criminal justice decision makers who could be responsible for the subsequent crime,” he said. “So using these tools is a nice, convenient way for humans to insulate themselves from the consequences.”