
Distribution shift in predictive AI for legal decision-making

Develop methods to detect, quantify, and mitigate distribution shift when deploying predictive machine-learning systems for legal decision-making (e.g., pretrial risk assessment and recidivism prediction), so that models trained on national datasets remain valid in specific local jurisdictions with differing base rates.
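The "detect and quantify" part of this question can be illustrated with a minimal sketch: comparing a jurisdiction's observed base rate against the rate assumed in the national training data using a two-proportion z-test. The function name, signature, and all numbers below are hypothetical illustrations, not figures from the paper.

```python
import math

def base_rate_shift_z(national_rate, n_national, local_successes, n_local):
    """Two-proportion z-statistic for whether a local jurisdiction's
    observed base rate differs from the national training-data rate.

    national_rate   -- outcome rate in the national training data
    n_national      -- size of the national training sample
    local_successes -- observed outcome count in the local jurisdiction
    n_local         -- size of the local sample
    """
    local_rate = local_successes / n_local
    # Pooled rate under the null hypothesis of no shift.
    pooled = (national_rate * n_national + local_successes) / (n_national + n_local)
    se = math.sqrt(pooled * (1.0 - pooled) * (1.0 / n_national + 1.0 / n_local))
    return (local_rate - national_rate) / se

# Hypothetical example: a local rate far below the national one
# yields a large negative z, flagging likely base-rate shift.
z = base_rate_shift_z(national_rate=0.09, n_national=100_000,
                      local_successes=6, n_local=1_000)
```

A large |z| only flags a shift in marginal base rates; it says nothing about shifts in the feature-outcome relationship, which is part of why the problem remains open.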


Background

The paper discusses failures of predictive AI tools in criminal justice caused by differences between training populations and deployment contexts, citing the Public Safety Assessment (PSA) in Cook County, where the local violent recidivism rate differs greatly from that in the national training data.

It frames distribution shift as a core, open research challenge that affects most predictive applications where the deployment population differs from training data, undermining accuracy and reliability.

References

Distribution shift is an open research problem in machine learning, and affects most predictive AI applications where the population of interest differs from training data.

Promises and pitfalls of artificial intelligence for legal applications (2402.01656 - Kapoor et al., 10 Jan 2024) in Section “AI for making predictions about the future,” subsection “Predictive AI for making decisions”