Models of Black Motion: On the Predictive Policing of Race

By Dr Georgiana Banita

Ever since my first engagement with the traumatic relationship between race and policing (the conference Black America and the Police, in 2016), I have sought to trace what I consider to be the guiding principle of repressive state action against communities of color: the anticipation of black violence. This period has also overlapped with technological advances in digitally driven predictive policing, a patrol strategy widely perceived to amplify already baked-in statistical biases. From Los Angeles to Chicago and New York, algorithms have increasingly superseded conventional policing in ways that arguably help develop, maintain, and expand a digital underclass of docile black subjects. Informed by convergent data sets, software like PredPol, Beware, and HunchLab builds on routine activity theory to map repeat behavior, identifying a crime event before it occurs or nanotargeting potential criminals. Troubled by this development in law enforcement, criminal justice, and other areas, scholars like Ruha Benjamin, Simone Browne, and Safiya Noble have exposed and denounced the ways in which putatively objective predictive technologies in fact risk perpetuating racial prejudice.[1]
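To make that mechanism concrete, the following is a deliberately minimal sketch, in Python, of the repeat-behavior logic on which place-based tools of this kind rest. Every function name and data point here is invented for illustration; no vendor’s actual model is this simple. But even the sketch exposes the feedback problem critics identify: the “forecast” merely re-ranks the places police have already patrolled.

```python
# Hypothetical sketch of repeat-behavior hotspot mapping; all names and
# sample data are invented and stand in for no vendor's actual model.
from collections import Counter

def predict_hotspots(incidents, top_k=3):
    """Rank grid cells by historical incident counts and return the
    top_k cells toward which patrols would be directed."""
    counts = Counter(cell for cell, *_ in incidents)
    return [cell for cell, _ in counts.most_common(top_k)]

# The "historical data" records where police already patrolled, not
# where crime actually occurred, so over-policed neighborhoods dominate
# the forecast -- and the fresh patrols it dispatches generate fresh
# incident records that appear to confirm it.
past_incidents = [("cell_12", "2019-01-03"), ("cell_12", "2019-02-11"),
                  ("cell_12", "2019-03-08"), ("cell_07", "2019-01-19"),
                  ("cell_03", "2019-02-27")]
print(predict_hotspots(past_incidents))  # ['cell_12', 'cell_07', 'cell_03']
```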

This research on the unfairness of machine learning and AI technologies involved in predictive methods of social control has been nothing short of a revelation. Yet little attention has been paid to what strikes me as a potentially central dynamic in the criminal analytics of race: the specter of the moving racial subject. As Great Migration scholar Wallace Best put it in 2015, “a black body in motion is never without consequence.”[2] Its presence has historically signified stealth, danger, and predation to white power structures. To the extent that black motion codifies emancipation through unpredictability, it puts pressure on the expectation that, for racially marked subjects, mobility – whether socially upward or lateral in space – will remain elusive. Black Lives Matter, a powerfully disruptive millennial movement committed to undoing the history of state-sanctioned violence against black bodies, has already provoked intense scrutiny of the links between emergent and century-old forms of resistance, from slave uprisings in the antebellum South to the civil rights era. I believe we should also be mindful of one particular dynamic that has triggered disproportionately ferocious responses from police forces and their precursors: the refusal of a black person to stand still and be quiet or, in modern police cant, to “freeze.”

In thinking about the policing of black bodies in motion, perhaps the first thing that comes to mind is the fraught condition known as “driving while black.” Vehicular Terry stops aren’t necessarily prompted by a clear traffic violation or by suspected criminal activity. As legal scholar Richard Posner phrased it in 2005, “Whether you stand still or move, drive above, below, or at the speed limit, you will be described by the police as acting suspiciously should they wish to stop or arrest you.”[3] Any kind of furtive gesture can and will count against what often turns out to be a young black male. So tenaciously entrenched is the image of the “frozen” black body that almost anything can draw the attention of an officer or even startle them into shooting. Here, as in the other instances I want to discuss, the perceived threat emanating from a black subject often triggers efforts to proactively anticipate (rather than react to or punish) a criminal act. In other words, there may always have been a “predictive” element in the MO of American law enforcement, resulting in a long-standing institutional restriction of black rights and freedoms.

An effective way to process these repressive strategies historically is to devise a set of behavioral “models” – a term that I derive from algorithmic language but want to weaponize here as a tool to push against the digital standardization of race. What I seek to “model” instead are scenarios that capture different patterns of conspicuous black motion: a fugitive slave, a vagrant worker, former slaves relocating from the rural South to the urban North, a black person in the wrong part of town, black American crowds marching for civil rights, the black driver who suddenly reaches for his jacket on the back seat, or unsuspecting black populations leading traceable lives that ostensibly lend themselves to mathematical prediction. It’s a reading that aims to be both “distant” in trying to spot large-scale, historical patterns of black mobility and “close” in that it brings into focus details we might have overlooked in the narratives of Nat Turner and Bras-Coupé, Jacob Lawrence’s “Migration Series,” or the writings of Ann Petry, James Baldwin, Ta-Nehisi Coates, and Edwidge Danticat; in the videotaped police beating of Rodney King, body cam videos of fatal police shootings, the cinema of Spike Lee, the rap lyrics of Jay-Z, or Ava DuVernay’s limited series When They See Us.

The final goal is to add software to this archive by demonstrating that the current use of crime prediction technologies expresses a similar anxiety about black bodies in motion. Predictive policing, I suggest, ultimately targets systemically disadvantaged groups marked as poor, black, and delinquent, reinforcing racial prejudices by crafting self-fulfilling prophecies and breathing new life into widely discredited links between race and crime. Taking my cue from the Chicago Police Department’s Strategic Subject List (SSL), I want to show that predictive policing recycles archaic models of suspicion around blackness on the move. This time around, attention falls on the supposed propensity for violence of hundreds or even thousands of digitally coded black subjects. The deterrence tactics that the SSL can ease or expedite include intensified stop and search, warnings, and preventive arrests. Yet the so-called strategic list is not restricted to hardened criminals; it also includes perpetrators or victims who have verifiably interacted with police over minor offenses (such as drugs or gambling).
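By way of illustration only: a toy, person-based risk list in the spirit of the SSL might look like the Python sketch below. The fields and weights are my own inventions, not the Chicago Police Department’s actual variables, but they show how prior police contact of any kind, including victimization and minor offenses, inflates a score that is then read as a propensity for violence.

```python
# Hypothetical toy version of a person-based risk list; fields and
# weights are invented, not the actual SSL variables.
from dataclasses import dataclass

@dataclass
class Subject:
    name: str
    prior_arrests: int     # includes minor offenses such as drugs or gambling
    times_victimized: int  # being a victim of violence is scored as well
    age: int

def risk_score(s: Subject) -> int:
    # Police contact of any kind pushes the score upward.
    score = 10 * s.prior_arrests + 8 * s.times_victimized
    if s.age < 25:  # youth alone raises the rank
        score += 15
    return score

subjects = [Subject("A", 1, 0, 22), Subject("B", 0, 2, 19), Subject("C", 4, 0, 40)]
for s in sorted(subjects, key=risk_score, reverse=True):
    print(s.name, risk_score(s))  # C 40, B 31, A 25
```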

What it all comes down to is how technology is fraying black America’s already fragile relationship with the police, and whether the criminalization of black lives on the basis of only partially accurate police data calls for anti-racist, defensive technology or might be better served by examining the sociocultural concerns that compelled law enforcement to adopt foresight technology in the first place. I will collect my thoughts on this subject in a programmatic essay for a special issue of the Journal of American Studies titled “Technologies of Racialized Prediction,” which I am co-editing with Josh Scannell (The New School for Social Research). We would be happy to hear from a diverse range of scholars and welcome proposals for the issue. The CFP can be found here.

Dr Georgiana Banita is a research fellow of the VW Foundation at the Trimberg Research Academy, University of Bamberg. She was previously an Assistant Professor of Literature and Media at the University of Bamberg and a postdoctoral fellow at the US Studies Centre, University of Sydney. Dr Banita has written extensively on American politics, literature, and criminal justice. You can visit her personal website here and follow her on Twitter @GeorgianaBanita.


[1] Ruha Benjamin, Race after Technology: Abolitionist Tools for the New Jim Code (Cambridge: Polity, 2019); Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (New York: New York University Press, 2018); Simone Browne, Dark Matters: On the Surveillance of Blackness (Durham: Duke University Press, 2015).

[2] Wallace Best, “The Fear of Black Bodies in Motion,” Huffington Post, Feb. 3, 2015.

[3] Qtd. in Marc Lamont Hill, Nobody: Casualties of America’s War on the Vulnerable, from Ferguson to Flint and Beyond (New York: Atria Books, 2016), 59.