risk, algorithm, equal protection, machine learning
Civil Rights and Discrimination | Constitutional Law | Law | Law and Society
States have increasingly resorted to statistically derived risk algorithms to determine when diversion from prison should occur, whether sentences should be enhanced, and the level of security and treatment a prisoner requires. The federal government has joined the trend in a big way with the First Step Act, which mandated the development of a risk assessment instrument to determine which prisoners can be released early. Policymakers are turning to these algorithms because they are thought to be more accurate and less biased than judges and correctional officials, making them useful tools for reducing prison populations by identifying low-risk individuals.
These assumptions about the benefits of risk assessment tools are all contested. But critics also argue that, even if these instruments improve overall accuracy, they are constitutionally suspect. While no instrument explicitly uses race as a "risk factor" (a practice that is in any event probably barred by the Supreme Court's decision in Buck v. Davis), several do incorporate sex (with maleness increasing the risk score), and many rely on factors that are highly correlated with race or socio-economic status, which is said to violate equal protection principles.
In Sex, Causation and Algorithms, Deborah Hellman, a philosopher and constitutional law scholar, provides some provocative food for thought on this issue. The article focuses on the Supreme Court’s Fourteenth Amendment caselaw on sex as a classification. But the approach to equal protection that Hellman develops could also provide a response to many of the other discrimination and disparate impact challenges aimed at risk assessment instruments.
Reconciling Risk and Equality, Jotwell.
Available at: https://scholarship.law.vanderbilt.edu/faculty-publications/1172