103 Iowa L. Rev. 303 (2017)

Abstract

Criminal law scholars devote substantial research to sociological and behavioral studies that identify characteristics common among reoffenders. This research aligns with a massive effort to reform the criminal justice system by reducing recidivism as a means of addressing high crime rates and overcrowded prisons. Many scholars believe that focusing resources on the segment of the offender population most likely to commit future crimes will decrease overall crime rates. The effort to reduce recidivism has led to the creation of objective risk assessment tools: essentially algorithms that purport to predict the likelihood that an individual will reoffend. While these predictive algorithms were first implemented to determine parole conditions, they have become increasingly popular among courts and are now routinely used in all phases of a criminal proceeding. As the demand for predictive risk assessment formulas increases, many state governments now look to private companies to develop them. The move toward privatization, however, raises issues of transparency, because companies can keep their algorithms secret by claiming trade secret protection. As a result, defendants are unable to verify the accuracy of their risk scores. This Note argues that private companies that benefit from providing a public service should be held to the same transparency requirements as public agencies, and that freedom-of-information disclosure requirements should be extended to cover proprietary predictive algorithms in order to achieve this result.
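
A minimal sketch of how such a risk assessment tool might compute a score appears below. The risk factors, weights, and logistic form are hypothetical assumptions made for illustration only; they are not drawn from any actual proprietary instrument discussed in the Note. In commercial tools, details of this kind (which factors are used and how heavily each is weighted) are precisely what trade secret claims keep from defendants.

import math

# Hypothetical weights for illustrative risk factors (assumptions, not real values).
WEIGHTS = {
    "prior_convictions": 0.45,
    "age_at_first_offense": -0.03,
    "employment_status": -0.60,        # 1 if employed, 0 otherwise
    "substance_abuse_history": 0.80,   # 1 if present, 0 otherwise
}
INTERCEPT = -1.2  # hypothetical baseline term

def recidivism_risk(features: dict) -> float:
    """Return a probability-like risk score between 0 and 1,
    computed as a logistic function of the weighted factors."""
    linear = INTERCEPT + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-linear))

# A defendant's inputs yield a single numeric "risk score." Without access to
# WEIGHTS and INTERCEPT, the defendant cannot check how any one factor drove it.
score = recidivism_risk({
    "prior_convictions": 3,
    "age_at_first_offense": 19,
    "employment_status": 0,
    "substance_abuse_history": 1,
})
print(f"Predicted recidivism risk: {score:.2f}")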

Published:
Wednesday, November 15, 2017