Thursday, June 13, 2019

Is Race "Baked" into the Criminal Justice System?

Today, San Francisco D.A. George Gascón announced a new program: race-blind prosecutions, aided by machine learning. The San Francisco Chronicle reports:
“When I first became district attorney, one concern was to understand how the criminal justice system impacts people of color disproportionately,” Gascón said. “I wanted to see if there was anything in our practice that we could improve.”
The district attorney decided to reach out to the Stanford Computational Policy Lab, which already had many of the tools available to help create the artificial intelligence.
Racial disparities in San Francisco’s criminal justice system are driven by upstream factors like arrests, Gascón said, and his office tries not to exacerbate the disparities. Even so, he wanted to remove any possibility for implicit bias in his office to ensure “the purity of the decision isn’t questionable.”
The system, Gascón said, will create a model that other prosecuting agencies around the country can use, and Stanford has agreed to publicly release the technology at no cost.
The technology organizes a police report and automatically redacts the race of the parties involved in the incident. It also scrubs the names of officers, witnesses and suspects, along with locations and neighborhoods that could suggest a person’s race.
In the complicated world of artificial intelligence, the technology is relatively simple, said Alex Chohlas-Wood, deputy director of the Stanford Computational Policy Lab. It uses pattern recognition and natural language processing to identify which words in a police report should be redacted and fills them in with a general description.
The digital tool uses machine learning, so it can make decisions without human intervention.
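For readers curious what "relatively simple" looks like in practice, here is a minimal sketch of the general approach the Chronicle describes--named-entity recognition followed by placeholder substitution--using the open-source spaCy library. This is my own illustration, not the Stanford lab's code, and the sample report is entirely invented:

```python
# Minimal sketch of automated redaction via named-entity recognition.
# This illustrates the general approach only, not the Stanford lab's tool.
import spacy

nlp = spacy.load("en_core_web_sm")

# Entity labels whose mentions could identify a person or suggest their race.
REDACT_LABELS = {
    "PERSON": "[PERSON]",
    "GPE": "[LOCATION]",   # cities, neighborhoods
    "LOC": "[LOCATION]",
    "FAC": "[LOCATION]",   # named facilities, streets
    "NORP": "[GROUP]",     # nationalities, religious or political groups
}

def redact(report_text: str) -> str:
    """Replace identity- and race-suggestive entities with generic placeholders."""
    doc = nlp(report_text)
    redacted = report_text
    # Replace from the end of the text so earlier character offsets stay valid.
    for ent in sorted(doc.ents, key=lambda e: e.start_char, reverse=True):
        if ent.label_ in REDACT_LABELS:
            redacted = (
                redacted[: ent.start_char]
                + REDACT_LABELS[ent.label_]
                + redacted[ent.end_char :]
            )
    return redacted

# Hypothetical incident report (invented for illustration):
print(redact("Officer Smith stopped Jamal Washington near the Bayview on Tuesday."))
```

The names and places come out as generic placeholders, but--as I argue below--the surrounding facts of the incident survive the substitution untouched.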
The district attorney’s office will start using the tool in the 80% of cases that come in through general intake. Cases handled by specialized units, such as homicides and domestic violence, will not immediately go through the technology.
During the first review process of general intake cases, prosecutors do not look at evidence like videos or pictures that would reveal a person’s race. The case then goes to a second review where a prosecutor makes a decision on whether the evidence is strong enough to move forward with charges.
If a prosecutor reverses a charging decision between the first and second review, by which point he or she will likely have learned the race of the parties, the prosecutor will have to document in a report why the reversal is justified, Gascón said.
The tool, he said, will help streamline charging decisions by expediting the ability to review police reports and quickly analyze the information.
The New York Times adds:
The only information prosecutors will initially have access to is an officer’s incident report, which generally includes the reason someone was stopped before an arrest, evidence that a crime was committed, witness statements and anything a suspect might say.
Only after assistant district attorneys make a preliminary decision about charges would they be permitted to access other information, including race and other demographic details, body camera footage and photos. In each case, regardless of the initial charging determination, all of the evidence will ultimately be reviewed, prosecutors said. If a prosecutor comes to a different conclusion between the first and second steps, that will be recorded and compared to historical data. Prosecutors will also be required to explain what changed their minds, and those patterns will be studied, the office said.
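If it helps to picture that second prong--logging reversals and comparing them with historical patterns--here is a minimal sketch of what such a record might look like. The field names and structure are my own assumptions; neither paper describes the office's actual system:

```python
# Sketch of a record for the two-pass charging review described above.
# Field names are illustrative assumptions, not the D.A.'s office's schema.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ChargingReview:
    case_id: str
    blind_decision: str                    # first pass, on the redacted report only
    full_decision: Optional[str] = None    # second pass, with all evidence available
    reversal_reason: Optional[str] = None  # required whenever the two passes differ

    @property
    def reversed(self) -> bool:
        """True when the prosecutor changed course after seeing the full file."""
        return self.full_decision is not None and self.full_decision != self.blind_decision

def reversal_rate(reviews: List[ChargingReview]) -> float:
    """Share of completed reviews overturned at the second pass--the figure the
    office says it will study against historical data."""
    completed = [r for r in reviews if r.full_decision is not None]
    return sum(r.reversed for r in completed) / len(completed) if completed else 0.0
```

The catch, as I explain below, is that the interesting bias may already be present at the blind first pass, where no reversal ever gets logged.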
The decision to try to examine whether bias is at the root of differential charging rates across races is laudable, but I suspect it will do far better at catching explicit bias than implicit bias. Here's why.

It's true that, in some cases, race works in isolation to create a mental picture of the situation. This is demonstrated by the classic audit study, in which identical résumés are submitted for a job with only the applicant's name changed ("Lakisha" or "Jamal" versus "Emily" or "Greg"). There are now enough studies of this ilk to show that the name alone affects employability. But note that the application does not reveal the applicant's race; the name implies it. Similarly, in a race-blind prosecution, even with names and locations removed, prosecutors are likely to implicitly (or explicitly) deduce the race of the suspect from the circumstances of the offense.

The reason for this is simple. As research has shown since the 1960s, part of acquiring professional expertise as a prosecutor or a defense attorney consists of developing sociological "scripts" of the typical ways in which crimes are committed. This means, for example, that particular types of burglaries might suggest to a criminal lawyer that the suspect is probably addicted to drugs, and that particular scenarios of sexual assault might suggest to a prosecutor that the victim is probably a sex worker.

It is not difficult to imagine race playing out as one of the factors an experienced prosecutor or public defender will deduce from ostensibly race-blind facts. To illustrate this, think about the 100:1 crack/powder cocaine sentencing disparity (now reduced to 18:1 through the Fair Sentencing Act). One of the main arguments against the disparity was that the seemingly neutral rule, which simply targeted the type of drug used, had the effect of disadvantaging African American defendants. Why? Because people of different races had different patterns of drug use. The association of African Americans with crack cocaine and of white Americans with powder cocaine is not just stereotypical--it is factually true often enough for a stereotype to build confidently: in an extensive study of four cohorts (2009–2012) of the National Survey on Drug Use and Health (NSDUH) across all 50 states and the District of Columbia, "[w]hile blacks were at particularly low odds for powder cocaine use (AOR=0.33), before controlling for other factors, blacks were actually at increased risk for crack use." Use fancy Stanford computers to remove the race and location of drug-using suspects, and experienced San Francisco prosecutors will still assume that the crack user is black and the powder user is white--and, what's more, most of the time they will be correct.
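To see why "true often enough" does so much work here, consider a back-of-the-envelope Bayesian sketch. Every number below is a hypothetical placeholder of my own (not a figure from the NSDUH study); the point is only that a modest difference in group-level rates is enough for an inference from offense type to race to be right most of the time:

```python
# Bayes'-rule sketch of how offense type alone can shift beliefs about a
# suspect's race. All numbers are hypothetical placeholders, not data from
# the NSDUH study quoted above.
def posterior(prior: dict, likelihood: dict) -> dict:
    """Compute P(group | offense) from P(group) and P(offense | group)."""
    joint = {g: prior[g] * likelihood[g] for g in prior}
    total = sum(joint.values())
    return {g: joint[g] / total for g in joint}

# Hypothetical local arrestee composition (prior) and hypothetical rates at
# which each group appears in crack-related reports (likelihood).
prior = {"black": 0.40, "white": 0.35, "other": 0.25}
likelihood_crack = {"black": 0.30, "white": 0.10, "other": 0.10}

print(posterior(prior, likelihood_crack))
# With these made-up inputs the posterior puts roughly two-thirds of the
# probability on a single group: the "usually correct" hunch that redaction
# cannot erase.
```

No prosecutor is doing arithmetic like this consciously, of course; the heuristic runs in the background, which is exactly the problem.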


This ability to infer a person's race from the circumstances surrounding the crime goes beyond cocaine, though it does not always reflect reality. People often assume that serial killers are predominantly white (they're wrong in the sense that African Americans are overrepresented among serial killers by roughly a factor of two; but if you run into a serial killer in an alley, he's still more likely to be white, simply because most Americans are white). People often assume that child molesters are predominantly white and that rapists are predominantly black (the realities are much more complicated). Marijuana arrests tend to target black neighborhoods and populations, though the realities of who uses marijuana are, again, more complicated. A prosecutor is unlikely to assume that a gang shooting over a drug dispute involves white suspects, and she will often enough (but not always) be right. In other words, racialized perceptions are baked into the sociological narratives of crime that the culture feeds us, and prosecutors and defense attorneys are no exception. Assumptions about the race of crime perpetrators (and, for that matter, victims) are not always borne out by empirical evidence, but they are true often enough that prosecutors will make generalizations, and redacted names are not going to make those generalizations go away.

Moreover, redacting neighborhoods is not going to make much of a difference, because county prosecutors practice law in an area they already know well from previous cases. Go to a D.A.'s office in any town and ask where street crimes are predominantly committed. Odds are the prosecutors will be able to pinpoint the particular neighborhoods where they happen--that's how the police address "hot spots," and that's how street-based sex workers know where they might find clients and where police raids are likely to occur. Criminal procedure students know that a "high-crime area," which triggers special rules about "reasonable suspicion," is often a high-arrest area, a high-poverty area, and a high-people-of-color area (this is partly why underenforcement and overenforcement often go together). In the context of San Francisco, the fancy Stanford machine can remove the location of a gang shooting from the facts of the case, and the D.A.'s office is still unlikely to assume that it happened in Noe Valley or Pacific Heights.

In short, race, racialized behavior, and racialized assumptions about behavior are so deeply embedded into the American fabric that it is hard to imagine any process that strips race and location from a scenario without eliminating the basic facts of the scenario. The very facts and circumstances of a crime form a picture in the prosecutor's mind, and because prosecutors live in our very racialized society and are, like all of us, a product of our very racialized culture, the picture is likely to include race. Not because anyone is racist--or at least, no more or less racist than the rest of us--but because that is how heuristics and biases work. Whether this is an interesting aspect of cultural diversity or an unfortunate byproduct of differential opportunity structures depends on the context. But what it means is that this well-intended measure will not capture, or remedy, the natural tendency to make racial assumptions.
