Alex Chohlas-Wood

Photo credit: Simon Luethi

Hi! I'm the executive director of the Stanford Computational Policy Lab (SCPL). Previously, I was a computational social science PhD student at Stanford, and served as the director of analytics at the New York City Police Department. My work focuses on using technology and data science to support criminal justice reform.

Publications, projects, and press

Disparate impact in policing

In a paper published in the University of Chicago Law Review, my colleagues and I showed how data analysis can identify and quantify racially disparate impacts in police stop practices. I also summarized the paper in a Twitter thread.

March 2022

Blind charging

With colleagues at SCPL, I helped design and implement a blind charging algorithm, now in use by the San Francisco District Attorney. Our tool automatically masks race-related information in incident narratives to reduce the influence of race on charging decisions. Our paper on the project appeared in the Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society.
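As a rough illustration only (not the actual SCPL implementation, which handles names, neighborhoods, and other contextual cues far more carefully), masking race-related terms in a narrative might look like this; the term list and function name here are hypothetical:

```python
import re

# Hypothetical, deliberately minimal list of race-related terms.
# A real system would need a much richer set of cues and context handling.
RACE_TERMS = ["white", "black", "hispanic", "asian"]

def mask_narrative(text):
    """Replace race-related terms in an incident narrative with a neutral placeholder."""
    pattern = re.compile(r"\b(" + "|".join(RACE_TERMS) + r")\b", re.IGNORECASE)
    return pattern.sub("[REDACTED]", text)

print(mask_narrative("The suspect was described as a white male."))
# The suspect was described as a [REDACTED] male.
```

Note that a naive keyword approach like this would also redact innocuous uses (e.g., "a black sedan"), which is part of why the real tool is considerably more sophisticated.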

June 2019 / Updated July 2021

COVID-19 oriented reforms

In a Washington Post piece, I argued (with colleagues from SCPL) that the dramatic—but temporary, and patchwork—criminal justice reforms enacted in response to COVID-19 should be made permanent and expanded across the country.

July 2020

Risk assessment instruments

I wrote a briefer on the potential advantages and drawbacks of risk assessment instruments in criminal justice settings for the Brookings Institution's "AI and Bias" series on fairness in AI.

June 2020

Nashville police department

Our team at SCPL worked with the city of Nashville to demonstrate that traffic stops were an ineffective tool for fighting crime. Since the release of our report, the department has reduced traffic stops by 70%, which amounts to an almost 90% reduction from their peak.

November 2018


Patternizr

I helped build Patternizr, a tool used by NYPD detectives to discover groups of related crimes. We describe our approach, including our efforts to demonstrate that the tool is fair, in a published paper. Patternizr was featured in several articles.



Misclassified crimes

I designed an algorithm for the NYPD that looked for crimes that were misclassified as felonies or misdemeanors. Likely misclassifications were sent to an internal team for auditing and correction. I presented my approach at NYU's Tyranny of the Algorithm? Predictive Analytics & Human Rights conference.

April 2016

Contact me

I'm on Twitter. Or, reach out over email.