March 29, 2024

San Francisco has announced a bias mitigation tool that uses basic AI techniques to automatically redact information from police reports that could identify a suspect’s race.

It’s designed to keep prosecutors from being influenced by racial bias when deciding whether to charge someone with a crime. The tool is scheduled to go into use on July 1st.

The tool will not only strip out descriptions of race, but also descriptors like eye color and hair color, according to the SF district attorney’s office. The names of people, locations, and neighborhoods that might all consciously or unconsciously tip off a prosecutor that a suspect is of a certain racial background are also removed.
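The district attorney’s office hasn’t published how the tool matches these descriptors, but physical traits like eye and hair color don’t look like named entities to a statistical model, so a tool of this kind would plausibly pair a model-based pass with a simpler pattern-based one. Below is a minimal sketch of that pattern-based idea; the term lists and the replacement tag are illustrative assumptions, not the lab’s actual rules.

```python
import re

# Illustrative patterns only; the real tool's descriptor lists are not public.
DESCRIPTOR_PATTERNS = [
    r"\b(?:black|brown|blue|green|hazel)\s+eyes\b",
    r"\b(?:black|brown|blond(?:e)?|red|gray)\s+hair\b",
    r"\b(?:white|black|hispanic|latino|asian)\s+(?:male|female)\b",
]

def redact_descriptors(text: str) -> str:
    """Replace matched physical descriptors with a generic tag."""
    for pattern in DESCRIPTOR_PATTERNS:
        text = re.sub(pattern, "[descriptor removed]", text, flags=re.IGNORECASE)
    return text

print(redact_descriptors("The suspect, a Hispanic male with brown eyes and black hair, fled."))
# -> "The suspect, a [descriptor removed] with [descriptor removed] and
#     [descriptor removed], fled."
```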

SF District Attorney George Gascón said in a media briefing: “When you look at the people incarcerated in this country, they’re going to be disproportionately men and women of color.”

He also pointed out that seeing a name like Hernandez can immediately tell prosecutors that a person is of Latino descent, potentially biasing the outcome.

“We had to create machine learning around this process,” Gascón said. The district attorney’s office is calling this a first-in-the-nation use of this tech, saying it’s unaware of any agency using AI to do this before.

The tool was developed by Alex Chohlas-Wood and his team at the Stanford Computational Policy Lab, which also helped develop the NYPD’s Patternizr system for automatically searching case files to find patterns of crime.

Chohlas-Wood says the new tool is essentially a lightweight web app that uses several algorithms to automatically redact a police report, recognizing words in the report using computer vision and replacing them with generic versions like “Location,” “Officer #1,” and so on.
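To make the substitution step concrete, here is a minimal sketch of how numbered generic placeholders might be applied once redaction spans have been identified. The span format (start, end, label) and the numbering scheme are illustrative assumptions about how such a tool could work, not the lab’s actual interface.

```python
from collections import defaultdict

def substitute(text, spans):
    """Replace each (start, end, label) span with a numbered generic tag,
    reusing the same tag when the identical string reappears (e.g. the
    same officer mentioned twice)."""
    counters = defaultdict(int)   # per-label numbering: Officer #1, #2, ...
    assigned = {}                 # maps (label, surface string) -> tag
    out, last = [], 0
    for start, end, label in sorted(spans):
        key = (label, text[start:end].lower())
        if key not in assigned:
            counters[label] += 1
            assigned[key] = f"{label} #{counters[label]}"
        out.append(text[last:start])
        out.append(assigned[key])
        last = end
    out.append(text[last:])
    return "".join(out)

report = "Officer Smith spoke to Officer Smith's partner on Mission Street."
spans = [(0, 13, "Officer"), (23, 36, "Officer"), (50, 64, "Location")]
print(substitute(report, spans))
# -> "Officer #1 spoke to Officer #1's partner on Location #1."
```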

Chohlas-Wood says the tool is in its final stages, was developed at no cost to San Francisco, and will be open-sourced within weeks so that other agencies can adopt it. He says it uses a technique called named-entity recognition, among other components, to identify what to redact.
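Named-entity recognition is a standard NLP technique for tagging spans of text as people, places, and so on. A minimal sketch of that identification step is below, using spaCy’s small English model purely as a stand-in, since the lab’s actual model and label set aren’t described in the announcement. It emits spans in the same format the substitution sketch above consumes.

```python
import spacy

# Stand-in model; requires: pip install spacy
# and: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

# Illustrative mapping from spaCy entity labels to redaction categories.
REDACT_LABELS = {"PERSON": "Person", "GPE": "Location", "LOC": "Location"}

def find_redaction_spans(text):
    """Return (start, end, label) spans for entities to redact."""
    doc = nlp(text)
    return [
        (ent.start_char, ent.end_char, REDACT_LABELS[ent.label_])
        for ent in doc.ents
        if ent.label_ in REDACT_LABELS
    ]

print(find_redaction_spans(
    "Maria Hernandez was seen near the Mission District in San Francisco."
))
```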

Without seeing the system work on real police reports (for legal reasons, the DA’s office said it could only show a mock-up), it’s unclear how well the tool might work in practice. When a journalist asked whether it would redact other descriptions, such as cross-dressing, Gascón said only that today is a starting point and that the tool will evolve. The tool is also used only for the first charging decision in a given arrest.

Prosecutors’ final decisions will be based on the full unredacted report. And if the initial charging decision is based on video evidence, that may obviously reveal a suspect’s race as well.
