Abolitionist Scientists Challenge Crime Prediction Software Research

    One thousand scientists have denounced new software designed to predict criminality based on a single face photo. They’re now demanding that a major publisher halt publication of the research behind the software.

    On June 23, the Coalition for Critical Technologies, a multidisciplinary group of natural scientists, social scientists, and technicians who oppose the carceral state, published an open letter to Springer, a major academic publisher. The letter called on Springer to rescind the forthcoming article, “A Deep Neural Network Model to Predict Criminality Using Image Processing,” by Harrisburg University researchers.

    A press release issued by the University on behalf of the researchers claims that the software can “predict if someone is a criminal based solely on a picture of their face” with “80 percent accuracy” and “no racial bias.” One of the researchers, Professor Nathaniel Ashby, stated that the team aimed to create a tool for law enforcement and the military that’s “less impacted by biases and emotional responses.”

    The exact design that makes this possible has yet to be made public. But for the Coalition, no amount of methodological tweaking can undo the racial bias embedded in such systems. No matter how the researchers cut it, the data they draw on capture the anti-Black racism baked into “who police choose to arrest, how judges choose to rule, and which people are granted longer or more lenient sentences,” the letter states.

    That’s not to mention that facial recognition technology has been shown, even by the US government’s own studies, to misidentify Black faces. Just one day after the letter’s publication, Robert Williams, a Black man from Michigan, filed an administrative complaint alleging that the Detroit Police Department unjustly arrested him in January, after its software mismatched his driver’s license photo with a surveillance camera still of a person stealing watches from a store in October 2018. “This is the first known case of someone being wrongfully arrested in the United States because of this technology,” wrote the American Civil Liberties Union, Williams’s legal representation, “though there are likely many more cases like Robert’s that remain unknown.”

    Even if sincere, the Harrisburg researchers’ hope to remove racism from crime prediction is inherently flawed, the Coalition writes. As the criminal legal system stands, the scientists imply, it is nearly impossible to decouple crime prediction from racism.

    The stakes, for the Coalition, are high. “The circulation of this work by a major publisher like Springer would represent a significant step towards the legitimation and application of repeatedly debunked, socially harmful research in the real world,” they wrote. In the current moment of unprecedented national scrutiny of law enforcement, they note, police may lean on research like this, which distances officers’ behaviors and beliefs from the terrorization of Black and Brown communities.

    “The uncritical acceptance of default assumptions inevitably leads to discriminatory design in algorithmic systems, reproducing ideas which normalize social hierarchies and legitimize violence against marginalized groups.”

    The demand to halt the publication of research based on faulty data is one move, for the Coalition signatories, within the broader fight to “Abolish the #TechtoPrison Pipeline,” as the letter was titled.

    Graphic of the Coalition’s logo by the Coalition for Critical Technologies via Medium
