In 2012, the Chicago Police Department (CPD) collected data on everyone arrested over the previous four years and used it to create a surveillance system that would supposedly predict violent crime. The system targeted people perceived as having a “high propensity toward violent, gang-related crime” for aggressive enforcement and enhanced prosecution. The Department’s series of risk models and algorithms not only failed to accurately predict violent crime, but endangered Chicagoans’ civil liberties and exacerbated racial disparities.
The program’s failure should serve as an alarm bell in an era where police agencies nationwide increasingly deploy controversial surveillance technology.
A report published on January 23 by the City of Chicago’s Office of the Inspector General (OIG) analyzed the performance of the risk models—known as the Party to Violence (PTV) Program—which the Department officially discontinued on November 1, 2019. Of the 398,684 individuals recorded in one version of the model, only 16.3 percent were confirmed to be members of gangs. One district commander who used the system, cited in the report, believed—for reasons that are unclear—that the number was 95 percent.
Incorrect data and weak oversight helped to make models unreliable and unethical. People who were arrested but never convicted of a crime remained in the database. Those who were arrested for low-level, nonviolent offenses were also included. Some people who were supposed to have been removed from the program because they hadn’t committed another crime were left in anyway. The training provided to officers on the proper use of the database was minimal. And any Department personnel could access the information—even if their jobs had nothing to do with the program’s stated purpose.
After the fatal shooting of Nykea Aldridge, a cousin of Chicago Bulls and NBA star Dwyane Wade, in August 2016, then-Chicago Police Chief Eddie Johnson said: “The frustrating part is…we have 1,400 individuals that drive this gun violence in this city. This isn’t a mystery.”
If that was the Department’s belief, then why were hundreds of thousands of people assigned a status that subjected them to intensive policing, and to enhanced punishment if they were convicted of an unrelated crime?
How It Was Supposed to Work
The OIG details how the Department built a Strategic Subject List (SSL) and Crime and Victimization Risk Model (CVRM). Their purpose was to assign risk scores ranging from 10 to 500 to people entered in the system—and then sort them into tiers to determine who would be most likely to be either the victim or offender in a shooting in the next 18 months.
The Department has said in the past that people with risk scores above 250 were subject to scrutiny. Those involved in a shooting—including victims—were referred to as “parties to violence.”
Until this report, the Department had never released a complete list of the factors used to assign risk scores. Relevant factors, it turns out, included age at latest arrest, arrests for unauthorized use of a weapon, violent incident arrests, trends in involvement in crime incidents, drug arrests, and gang affiliation. We still don’t know how much weight was given to each element of the model.
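To make the mechanics concrete, here is a minimal sketch of the kind of linear risk-scoring model the OIG describes. The factor names come from the report; every weight below is a made-up placeholder, since CPD never disclosed how the factors were weighted, and the real model’s form may have differed entirely.

```python
# Illustrative only: a hypothetical linear scoring model using the factors the
# OIG report lists. The weights are invented placeholders, NOT CPD's actual
# (undisclosed) coefficients.

from dataclasses import dataclass

@dataclass
class ArrestRecord:
    age_at_latest_arrest: int
    weapon_arrests: int            # arrests for unauthorized use of a weapon
    violent_incident_arrests: int
    drug_arrests: int
    crime_trend: float             # trend in involvement in crime incidents
    gang_affiliated: bool

# Hypothetical weights -- chosen arbitrarily for illustration.
WEIGHTS = {
    "youth": 4.0,
    "weapon": 30.0,
    "violent": 40.0,
    "drug": 10.0,
    "trend": 20.0,
    "gang": 50.0,
}

def risk_score(r: ArrestRecord) -> int:
    """Map a record into the 10-500 range the SSL/CVRM reportedly used."""
    raw = (
        WEIGHTS["youth"] * max(0, 30 - r.age_at_latest_arrest)  # younger = riskier
        + WEIGHTS["weapon"] * r.weapon_arrests
        + WEIGHTS["violent"] * r.violent_incident_arrests
        + WEIGHTS["drug"] * r.drug_arrests
        + WEIGHTS["trend"] * r.crime_trend
        + WEIGHTS["gang"] * r.gang_affiliated
    )
    return max(10, min(500, int(raw)))  # clamp to the reported score range

def tier(score: int) -> str:
    # The Department reportedly subjected scores above 250 to extra scrutiny.
    return "high" if score > 250 else "lower"
```

Note that even this toy version shows why the inputs matter so much: because arrests (not convictions) drive every term, a person repeatedly arrested but never convicted would still accumulate a high score.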
The goal was to track and monitor the behavior of people in the system over time and assess how they responded to different crime control strategies, providing data to inform future approaches. Beginning in March 2015, CPD included information about whether people had risk scores or not in the narrative section of arrest reports.
The theory behind the program was very simple: Those with higher risk scores were more likely to become “parties to violence.” Several researchers of urban gun violence have argued that a small and highly concentrated population is responsible for most shootings in a given city, and that identifying and managing those at risk of offending is the key to reducing violence. This “focus on the violence” strategy was popularized by Thomas Abt’s 2019 book Bleeding Out. Chicago’s former Chief Johnson said he hoped that the risk models would allow the Department to better target their policing of violence.
Researchers from the Illinois Institute of Technology built all six versions of the model and generated risk scores and party-to-violence candidates using de-identified data from CPD’s Information Services Division. CPD then reidentified and matched the data to individuals in the system and created a single “dashboard” of all risk profiles, available Department-wide.
Risk scores and tiers were also plugged into Caboodle, the geospatial tool that the Department uses to map criminal activity—which has itself attracted scrutiny because of its use as a visualization tool for the city’s controversial anti-gang databases.
Racial Disparities in Chicago Policing
Racial disparities in the Chicago Police Department’s practices are stark. The Vera Institute of Justice’s Arrest Trends tool demonstrates that in a population that is 30.1 percent Black, 72.5 percent of all arrests made by the CPD in 2016 were of African Americans. That disparity is higher for violent offenses.
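A quick back-of-envelope calculation makes the scale of that disparity plain. Using only the two figures cited above, Black Chicagoans’ share of arrests can be compared to their share of the population:

```python
# Arithmetic behind the disparity cited above (Vera Institute figures):
# Black residents were 30.1% of Chicago's population but 72.5% of 2016 arrests.
black_pop_share = 0.301
black_arrest_share = 0.725

# Over-representation in arrests relative to population share.
disparity_ratio = black_arrest_share / black_pop_share
print(f"{disparity_ratio:.1f}x")  # prints "2.4x"
```

In other words, Black residents were arrested at roughly 2.4 times the rate their population share alone would predict—and, as the report notes, the gap is wider still for violent offenses.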
CPD has stated that race and gender were not determinative factors in the risk models. However, a 2017 Chicago Sun-Times investigation found that 85 percent of those with the highest risk score in the model were African American.
As a result, communities of color in the city felt the brunt of the program’s ill effects. This is true for people whose names should not have been on the Strategic Subject List, and for those who had inflated risk scores and were subject to intensive policing or enhanced enforcement. It’s also true of people who had their privacy violated by having arrest and other personal information exposed to law enforcement personnel who should have never had access to it.
These harms have yet to be quantified. But while we don’t have data to demonstrate specifically how the city’s risk models drove racially biased patterns of enforcement or prosecution, we do know that the models fed into Chicago justice system programs that have been criticized for heightening racial disparities in the past. This is especially true for the role PTV may have played in CPD’s controversial gang policing efforts.
The PTV program could be used as one of 10 potential criteria for admission into the Department’s initiative to identify repeat offenders in gang-involved violent crime: Targeted Repeat-Offender Apprehension and Prosecution (TRAP). The PTV program was also one of four “technical components” of the Department’s Gang Violence Reduction Strategy (GVRS).
Previous OIG reports have documented how Chicago’s gang databases suffered from the same issues as the PTV program, including erroneous admissions, poor data privacy protocols, and racially disparate outcomes—95 percent of people entered in the gang database after an arrest were identified as Black or Hispanic.
A Critical Conversation
A range of violence reduction strategies, including the formation of community outreach groups from impacted neighborhoods, have paid off for Chicago in recent years: In 2019 the city reported 492 murders, a 35 percent decline from a horrific 2016, when the murder total was higher than it had been in more than two decades. More needs to be done: The homicide rate in Chicago is still higher than that of cities like New York and Los Angeles.
But in addition to highlighting the harms of Chicago’s predictive policing program, the OIG’s findings cast serious doubt on whether it had any impact on the recent decline in violence. The federal government gave the CPD a $3.8 million grant to implement its predictive policing program in 2012. This money would surely have been better spent on community investment, or violence reduction strategies that are proven to reduce gun violence.
Given the New York Times’s recent bombshell report that exposed the CPD as one of the agencies making use of invasive new facial recognition software run by the firm Clearview AI, a broader conversation about the CPD’s technology and data practices is critical.
The city council and the community must lead this charge. To protect Chicagoans, particularly in the city’s most marginalized communities, transparency and democratic accountability are long overdue.