Nationwide Calls For Police Reform Must Examine Policing Technologies

Activists demonstrate in front of a mobile police facial recognition facility in London on Feb. 11, 2020. (Kelvin Chan/AP)

Some advocates for law enforcement reform are calling for a closer examination of policing technology as protests against police brutality continue across the country in the wake of George Floyd’s killing.

Amazon recently announced a one-year moratorium on police agencies using its facial recognition software. Other tech companies including Microsoft and IBM have also placed limits on law enforcement’s use of facial recognition.

Police use of facial recognition is just one of the many ways policing occurs “without the flesh and blood presence of actual police,” says Ruha Benjamin, a professor of African American Studies at Princeton University and author of “Race After Technology: Abolitionist Tools for the New Jim Code.”

She calls modern police surveillance the “New Jim Code” because racial bias is built into the algorithms that drive these technologies.

Developers train this technology to identify threats and predict if someone is going to do something wrong, Benjamin says. The problem is that those predictions are based on “historic decisions” made by police and others.

“So if those people have been making discriminatory patterns and decisions, that's feeding the algorithm,” she says. “The algorithm, the smarter it gets, many times, the more biased it's becoming. That means it's mirroring human decisions and actions.”
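
To make that feedback loop concrete, here is a minimal, hypothetical sketch (not from the interview): a generic classifier trained on made-up “historical” flags that were applied more heavily in one neighborhood ends up reproducing that disparity, even though the underlying behavior in both neighborhoods is identical.

    # Illustrative sketch only: toy data and a generic classifier, not any real
    # policing system. All names and numbers here are hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    # Two neighborhoods with identical underlying behavior ...
    neighborhood = rng.integers(0, 2, size=n)      # 0 = A, 1 = B
    behavior = rng.normal(size=n)                  # same distribution in both

    # ... but the historical "flagged as suspicious" labels -- the human
    # decisions the algorithm learns from -- were applied more often in B.
    bias_term = np.where(neighborhood == 1, 1.0, 0.0)
    flagged = (behavior + bias_term + rng.normal(scale=0.5, size=n)) > 1.0

    # Train on the biased labels, with neighborhood available as a feature.
    X = np.column_stack([behavior, neighborhood])
    model = LogisticRegression().fit(X, flagged)

    # The model now flags neighborhood B far more often, mirroring the
    # discriminatory pattern in its training data.
    for hood in (0, 1):
        rate = model.predict(X[neighborhood == hood]).mean()
        print(f"neighborhood {'AB'[hood]}: predicted flag rate = {rate:.2f}")

Nothing in the sketch is malicious; the model simply optimizes against the labels it is given, which is Benjamin’s point about the algorithm “mirroring human decisions and actions.”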

Police use of technology becomes dangerous when people stop questioning the efficacy of the technology, Benjamin says.

“We think it's more neutral than, let's say, a racist judge or officer or a teacher because it's coming to us through a screen,” she says. “But someone had to teach that screen to make that decision and had to design that program. And that's what we need to keep an eye on.”

Interview Highlights 

On why temporarily banning police use of these technologies is not enough 

“It's really not enough for companies to offer this kind of short-term moratoria on the use of facial recognition for police only because there are many, many institutions, organizations in both the public and private sector that employ it. For example at UCLA, it was one of the first schools to announce its intention to use automated software to decide who could come on campus and not. And the only reason why it's not using it is because an independent entity, a digital rights nonprofit, actually audited the software that it was going to use and found that there were 58 false positives. That means students, faculty and staff that were matched with a criminal record and were predominantly people of color. And these were false matches.

Ruha Benjamin, an associate professor of African American Studies at Princeton University. (Courtesy)

“And so imagine being a student or faculty and having this system flag you as not supposed to be on campus as someone who is an intruder, and then what that kicks in in terms of campus police or LAPD. And so UCLA is just one of many places that has considered or is already using some kind of automated facial recognition program that makes all kinds of technical mistakes. But those technical mistakes often target the very people who are most vulnerable to police violence.”

On why surveillance technology has to be included in reforming police

“It's not an accident that the people who are most harmed by this institution are the descendants of people who were enslaved. And so the root of the institution is rotten. And so rather than thinking about how to sort of tweak the edges of it, I think we have to reimagine and rethink what actually constitutes safety and well-being in our communities.

“And so we have to be very wary that you might defund the police, the actual people who are walking and patrolling neighborhoods, but policing may continue through technology. And I think part of our goal in this moment is not to let that happen, to be vigilant so that the racist forms of policing that we're up against don't just shapeshift, become more invisible and impenetrable and unaccountable. And so let's go to the root causes and think about what will actually make us safe.”

On bias in neighborhood crowdsourcing apps 

“It's not just big institutions and organizations using technologies. We're all kind of enrolled in this process of tracking one another. And so if we know that this tracking is patterned according to race, like a lot of people use these kind of Nextdoor apps and things where they're looking around their neighborhood, who's an intruder, who's supposed to be here or not? All of those perceptions are shaped by racial bias and class bias. And so who's likely to be reported on the Nextdoor app or, you know, sort of the Citizen app?

“And so there's no way to sort of take these technologies out of the social context. They are mirrors for us. They are mirrors for society. They're reflecting back at us our biases and our patterns of discrimination. And so to the extent that we can use them as mirrors and then try to deal with the root causes, they're useful. But if we really think that they're somehow neutral, that we should be making consequential decisions based on what's coming to us from these technologies, we're only going to deepen the inequalities that we're living with right now.”


Cristina Kim produced and edited this interview for broadcast with Tinku Ray. Samantha Raphelson adapted it for the web.

This segment aired on June 23, 2020.

Tonya Mosley Correspondent, Here & Now
Tonya Mosley was the LA-based co-host of Here & Now.

