Tech giants pull facial recognition software from police departments – for now

Posted at 4:06 PM, Jun 12, 2020, and last updated 2020-06-12 16:34:59-04

SAN DIEGO, Calif. -- Tech giants say they will not sell facial recognition software to police departments, for now.

It's a tool police departments have been using for years, helping solve everything from property crimes to cold cases and missing-person investigations.

But there's little oversight over the technology, and critics say it puts our privacy and civil rights in jeopardy.

While police often use the software to scan the mug shots of criminals, there's a good chance your photo is also in the system.

A 2016 Georgetown Law report found one in two American adults are in a law enforcement face recognition network. In addition to mug shots, social media photos and surveillance videos, many states also allow searches of driver's license databases.

Critics of the technology also point to inaccuracies in the software.

In 2018, researchers at MIT and Stanford University examined three commercially released facial-analysis programs from major technology companies.

The analysis showed an error rate of 0.8% for light-skinned men compared to 34.7% for dark-skinned women.

Steve Beaty is a professor of computer science at MSU Denver.

"It appears these programs have, what we call, biases in them. That they're biased towards certain skin tones, for example, and will make more mistakes with certain types of people than other types of people," said Beaty.

He says the bias can occur when the machines are trained.

"The computers I don't think have any inherent bias in themselves, but they can only learn from the data sets they're provided with," said Beaty.

If a machine sees more photos of white men while being trained, it will identify them more accurately than people from groups underrepresented in its training data.
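Beaty's point can be illustrated with a toy experiment (a purely hypothetical sketch, not any vendor's actual system): a simple one-dimensional threshold classifier is fit to a training set dominated by one group, so the learned cut-off is tuned to that group and makes more mistakes on the underrepresented one. The group names, score distributions, and sample sizes below are all invented for illustration.

```python
import random

random.seed(0)

def sample_group(mu_neg, mu_pos, n):
    # Synthetic "match scores": half non-matches (label 0), half matches (label 1).
    xs = [(random.gauss(mu_neg, 1.0), 0) for _ in range(n // 2)]
    xs += [(random.gauss(mu_pos, 1.0), 1) for _ in range(n // 2)]
    return xs

# Group A dominates the training set (900 vs 100 examples), and the two
# groups' score distributions are shifted relative to each other.
train = sample_group(0.0, 3.0, 900) + sample_group(1.0, 4.0, 100)

def fit_threshold(data):
    # Pick the cut-off that minimizes error on the (imbalanced) training set.
    best_t, best_err = 0.0, float("inf")
    for i in range(-100, 200):
        t = i / 20.0
        err = sum((x > t) != bool(y) for x, y in data) / len(data)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def error_rate(data, t):
    return sum((x > t) != bool(y) for x, y in data) / len(data)

t = fit_threshold(train)
err_a = error_rate(sample_group(0.0, 3.0, 1000), t)
err_b = error_rate(sample_group(1.0, 4.0, 1000), t)
print(f"threshold={t:.2f}  group A error={err_a:.1%}  group B error={err_b:.1%}")
```

Because the threshold is chosen almost entirely on group A's data, group B's error rate comes out noticeably higher, mirroring the kind of disparity the MIT/Stanford analysis reported, though real face recognition models are of course far more complex than a single threshold.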

And while the technology has proven to be a useful crime-fighting tool, a case of mistaken identity can mean an innocent person ends up with police looking into their private life unnecessarily.

"I think it's a good idea to take a step back and say what is it we as a society want from our facial recognition technology? That's exactly what Amazon has come out and said," said Beaty.

This week, Amazon announced a one-year moratorium on police use of its facial recognition technology, Rekognition. The company is calling on lawmakers to put in place stronger regulations to govern the technology's ethical use.

Microsoft also said it will not sell its software to police departments for now, while IBM is abandoning its facial recognition program altogether.

"Let's talk about what it means, and have the conversation, and make sure that we as a society, as a country, are comfortable with what the technology is being used for," said Beaty.

As companies reevaluate how police officers use their technology, the question remains whether the public will do the same.