Facial Recognition Tech Is a Blatant Misuse of Police Bodycams

Jim Watson/AFP/Getty

When advocates pushed back against the rise of police bodycams over the last five years, all too often the answer was simply “it can’t hurt.” Did we have evidence that bodycams reduced excessive force? No. Did we know if they would prevent police perjury? No. Did we know if they would stop officers from killing innocent, unarmed black men in our streets? Definitely not.

But in the aftermath of the killings of Eric Garner, Michael Brown, and Tamir Rice, many Americans cried out for something, anything. To some, bodycams seemed worth a chance.

Five years later, it’s increasingly clear that the cameras we hoped would police the cops are instead becoming law enforcement’s own eyes and ears. Emerging technologies are transforming this one-time hope for police reform into the tool of a dystopian surveillance state, and it’s all powered by facial recognition technology.

That’s why California lawmakers acted last week to ban police from using facial recognition on bodycams. Well, they actually went a lot further than that. AB1215 says that police shall not use “any biometric surveillance system in connection with an officer camera or data collected by an officer camera.” In short, it prevents police departments from turning the countless millions spent on bodycams into a walking surveillance net.

So facial recognition is out, but so are countless other emerging forms of AI-driven surveillance, each one creepier than the last. Gait analysis software seeks to identify us by the way we walk. “Aggression detection” AI goes even further, selling the snake-oil promise that it can divine our state of mind from how we walk, our facial expressions, and other biometric data. Of course, it can’t make those determinations, at least not consistently and in a non-discriminatory fashion.

For anyone who has been following the AI debate over facial recognition, the concerns will be familiar. The AI we use is built on countless decisions and assumptions that shape the answers we get back. But human beings aren’t a homogeneous data set.

This is especially true when tech intersects with programmers’ blind spots. Have an algorithm put together by a group of mostly white male programmers and it’s unsurprising that it ends up working pretty well for white guys and pretty terribly for everyone else. From facial recognition AI that falsely identified members of the Congressional Black Caucus as criminal suspects to “emotional analysis” AI that mislabels black users as angry, the AI we birth reflects back the biases we hold.

It doesn’t take much imagination to see the possibilities in the context of bodycams. Think of a police officer whose smartphone alerts him anytime he passes someone the system flags as acting “suspiciously.” And of course, AI bias means that many of the same over-policed communities would once again be forced to bear the brunt of surveillance.

And what would we get in return for this man-portable panopticon? Not much. The reality of bodycams has fallen far short of the dream. One issue is that police departments frequently stonewall the release of unfavorable recordings: in New York, the NYPD failed to hand over bodycam footage in 40 percent of the cases where civilian watchdogs requested it. Yet departments frequently find ways to quickly release or leak the footage they do want the public to see. Just as importantly, when early adopters studied whether bodycams reduced use of force, civilian complaints, or excessive charges, researchers found no effect.

Of course, biometric surveillance is a far broader issue than bodycams alone. Even after the historic win in California, police departments in much of the state will still be able to run facial recognition on images from CCTV cameras, dashcams, and potentially even drones. But the measure is a huge step toward a broader national consensus on ending the use of biometric surveillance. Already, more than a dozen cities have passed so-called “CCOPS” laws, which require police departments to give civilians oversight of surveillance technology.

Perhaps the most prominent CCOPS law in recent months is the San Francisco ordinance that not only gave lawmakers power to review all surveillance equipment purchases but also banned all government facial recognition systems. Massachusetts is considering a measure that would go further still, banning all government uses of facial recognition statewide.

No single biometric ban will be a privacy panacea, but the protections are particularly profound when it comes to police bodycams. There’s something unbearable about the thought that our country’s largest-ever investment in police accountability could be turned into a weapon against the very communities of color it was supposed to protect. The only question is whether more states will follow California’s lead before the technology takes root nationwide.
