Artificial Intelligence in Judicial Decision Making
When Chief Justice John Roberts visited Rensselaer Polytechnic Institute in April, the Institute’s president asked him what seemed like a question straight out of science fiction: could he foresee a day when machines driven by artificial intelligence would assist with judicial decision making? The Chief Justice replied that he did not need to imagine a time when artificial intelligence would be used in judicial decision making; it’s already here.
Sentenced to Prison Based on a Private Company’s Software
The Chief Justice may have been thinking about the case of Eric Loomis, a Wisconsin man who was sentenced to six years in prison based in part on a private company’s software program. Mr. Loomis now argues that his due process rights were violated because the judge relied on a report generated by the software’s proprietary algorithm, which Mr. Loomis was unable to inspect or challenge. In a sign that the justices were interested in the issues presented by Mr. Loomis’ case, they asked the federal government to file a friend-of-the-court brief offering its view on whether the court should hear his appeal.
The report was generated by a product called COMPAS, and it included a series of charts assessing the risk of recidivism, that is, how likely Mr. Loomis was to go on to commit additional crimes. Both the report and the judge concluded that Mr. Loomis posed a high risk of recidivism. During sentencing the judge stated, “you’re identified, through the COMPAS assessment, as an individual who is a high risk to the community.” The Wisconsin Supreme Court ruled against Mr. Loomis, holding that he would have received the same sentence based on the usual sentencing factors alone. However, the Court did express concern about the use of a secret algorithm in sentencing an individual to prison.
In a dissenting opinion, one justice expressed concerns about how the algorithm may treat race as a factor in sentencing. She cited a ProPublica study of the software’s use in Broward County, Florida, which found that black defendants “were far more likely than white defendants to be incorrectly judged to be at a higher risk of recidivism.” While the use of race in sentencing is not a new concern, the software’s secrecy raises a new one: the proprietary software does not explain how race is used as a factor or how much weight the algorithm gives it.