Failure No. 1: Risk Assessments Were Rushed to Use Without Proper Understanding

A risk assessment uses an algorithm to make predictions about certain groups. In the pretrial context, these algorithms were touted as the solution to making release determinations. The claim was that a risk assessment could accurately predict whether an individual is at low, medium, or high risk of failing to appear or of re-offending. The promise was that these tools would let courts classify a defendant automatically and thereby make release decisions more quickly and more accurately.
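As a rough illustration of how such a tool works, the sketch below shows a toy scoring function that adds up weighted factors and buckets the total into low, medium, or high. Every factor, weight, and cutoff here is invented for illustration only and does not come from any actual pretrial tool.

```python
# Hypothetical illustration only: a toy pretrial "risk score" that sums
# weighted factors and buckets the total into low / medium / high.
# The factors, weights, and cutoffs are invented for this example.

def risk_category(prior_arrests: int, prior_failures_to_appear: int, age: int) -> str:
    score = 0
    score += 2 * prior_arrests              # assumed weight
    score += 3 * prior_failures_to_appear   # assumed weight
    score += 1 if age < 25 else 0           # assumed age factor
    if score <= 3:
        return "low"
    elif score <= 7:
        return "medium"
    return "high"

print(risk_category(prior_arrests=1, prior_failures_to_appear=0, age=30))  # -> "low"
```

The court then receives only the category, not the reasoning behind it, which is part of why critics argue the tools were adopted without proper understanding.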

However, the promise of pretrial risk assessments never matched their real-world performance, and they have not stood the test of time. The algorithm underlying these tools was built for a completely different purpose: predicting what "groups" would do, not what individuals would do. In addition, some of the largest technology companies in the world reviewed the tools and issued a joint statement strongly opposing their use, concluding that they should not be part of any criminal justice reform. As a result, the initial strong support for risk assessments has eroded, and nearly all of the groups that once backed them now oppose the use of pretrial risk assessment tools.

In the last two years, multiple peer-reviewed studies have analyzed the use of pretrial risk assessment tools and have consistently concluded that they should not be part of criminal justice reform. These studies culminated in a July 2019 statement of concern, signed by 27 leading academics from some of the most prestigious universities in the country, declaring that the identified problems with risk assessments could not be remedied and that the tools should not be used as part of criminal justice reform.

In 2019, a number of the largest technology companies in the world, including Google, IBM, Apple, and Amazon, jointly announced their opposition to the use of pretrial risk assessment tools in the criminal justice system. Their statement concluded that the tools were pushed into use without sufficient testing or study. It noted that the underlying algorithm is very good at predicting what groups will do, which is what it was designed to do, but that it does not translate well to predicting what an individual will do. The reason is that, in predicting group behavior, the algorithm takes into account the racial makeup of the group. When the same algorithm is applied to an individual, it therefore factors the defendant's racial group into its prediction of the defendant's future actions. Because the algorithm relies on the defendant's race, it draws on the same historical data that documents heavier incarceration rates for one race over others, and as a result it perpetuates the very racial bias that proponents were seeking to end.
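To make the group-versus-individual point concrete, the sketch below uses invented numbers to show how a rate computed from historically skewed arrest data carries straight through when the same group-level estimate is applied to each individual member of that group. Nothing here reproduces any real tool or dataset; it only illustrates the mechanism described above.

```python
# Hypothetical illustration only: how a group-level base rate computed from
# skewed historical arrest data carries directly into individual "risk" scores.
# All numbers are invented to show the mechanism, not measured from any tool.

historical_arrests = {          # arrests per 1,000 people, reflecting past
    "group_A": 120,             # enforcement patterns rather than behavior
    "group_B": 60,
}
population = {"group_A": 1000, "group_B": 1000}

# A "group" model estimates each group's rate from that history...
group_rate = {g: historical_arrests[g] / population[g] for g in population}

# ...and when the same estimate is applied to an individual, every member of
# the more heavily policed group starts with a higher predicted risk.
def individual_risk(group: str) -> float:
    return group_rate[group]

print(individual_risk("group_A"))  # 0.12
print(individual_risk("group_B"))  # 0.06
```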

With the evidence against the algorithms mounting, virtually all of the advocates who had supported risk assessments began to withdraw their support. In 2019, the Pretrial Justice Institute said it had been wrong and apologized for its advocacy of the discredited tools. It also withdrew its support of the Arnold Foundation, parting ways with the organization behind the creation of one of the most widely used tools in the country.

The MacArthur Foundation and the Vera Institute also reversed course, with the latter stating that it now believed New Jersey was an example of "failed" bail reform because it had employed risk assessments.
