Amazon Buckles to Pressure Over Police Use of Facial Recognition

As a national spotlight remains focused on police brutality, Amazon said it is implementing a one-year moratorium on the use of its facial recognition technology by police departments.

This marks a significant reversal for Amazon, which until now had not been swayed by calls from dozens of organizations, including the American Civil Liberties Union (ACLU), to stop selling the technology to law enforcement. Those groups argue that because people of color are disproportionately harmed by police practices, Amazon’s technology, known as Rekognition, could exacerbate the problem.

‘The most sophisticated, modern technology that exists’

A little over a year ago, for example, Amazon shareholders voted down a proposal that would have restricted the sale of Rekognition to government agencies. (Those shareholders also rejected a proposal to study the extent to which Rekognition may violate civil rights.)

nitronet reached out to Amazon about Rekognition following news from IBM earlier this week, in which CEO Arvind Krishna told members of Congress that IBM no longer offers its general-purpose facial recognition software because the company opposes its use for surveillance, racial profiling and violations of human rights. Amazon did not respond.

Andrew Jassy, CEO of Amazon Web Services, which oversees Rekognition, has, however, spoken to PBS’ documentary series Frontline. In an interview that aired in February, he said Amazon believes police departments should be allowed to experiment with Rekognition because law enforcement should have access to “the most sophisticated, modern technology that exists.”

He noted Amazon has never received a report of misuse by law enforcement and said he believed any abuse would be made public.

“We see almost everything in the media today and I think you can’t go a month without seeing some kind of issue that somebody feels like they’ve been unfairly accused of something of some sort, so I have a feeling that if you see police departments abusing facial recognition technology, that will come out … it’s not exactly kept in the dark when people feel like they’ve been accused wrongly,” Jassy told Frontline.

What a difference a few months makes.

As with any burgeoning technology, the question of where, when and how to use facial recognition is a complicated one.

Bias

First and foremost, multiple studies have demonstrated the technology is less capable of accurately identifying women and people of color.

In July 2018, for instance, the ACLU released a study in which Rekognition incorrectly matched 28 members of Congress with people in mugshots—and the false matches were disproportionately of people of color. (And it’s hardly the only such study.)

At the time, an Amazon spokesperson pointed to uses that benefit society, such as preventing human trafficking and finding missing children, and said the ACLU test could have been improved by using a higher confidence threshold for matches, which is what Amazon recommends for law enforcement. (Jassy repeated these talking points in his Frontline interview.)
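
For context, that confidence threshold is simply a parameter on Rekognition’s face-matching API calls; Amazon has said it recommends 99 percent for law enforcement use. Below is a minimal sketch, using the boto3 SDK, of how a caller might apply such a threshold. The image file names are placeholders, and this is an illustration of the setting, not a description of how any police department actually uses the service.

```python
# Minimal sketch: calling Rekognition's CompareFaces API via boto3
# with a high similarity threshold. "probe.jpg" and "mugshot.jpg"
# are hypothetical local files used only for illustration.
import boto3

rekognition = boto3.client("rekognition")

with open("probe.jpg", "rb") as source, open("mugshot.jpg", "rb") as target:
    response = rekognition.compare_faces(
        SourceImage={"Bytes": source.read()},
        TargetImage={"Bytes": target.read()},
        SimilarityThreshold=99,  # only return matches scored at 99% or above
    )

# Faces scoring below the threshold appear in UnmatchedFaces, not FaceMatches.
for match in response["FaceMatches"]:
    print(f"Possible match, similarity: {match['Similarity']:.1f}%")
```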

The Amazon rep also noted Rekognition is “almost exclusively” used to narrow the field of possible suspects. But that’s not always how facial recognition is used. Look no further than the January 2020 investigation by the New York Times, which found that officers in Pinellas County, Fla. (who, to be clear, were using their own in-house database) sometimes used facial recognition as the basis for arrests when they had no other evidence.

Last February, Amazon said it was planning to work with the National Institute of Standards and Technology (NIST), the U.S. government lab that runs an industry benchmark for facial recognition, to develop standardized tests to remove bias and improve accuracy. Until that point, Amazon had not submitted Rekognition for testing alongside 76 other developers because, it said, its technology was too “sophisticated.”

Sixteen months later, however, a representative for NIST said Amazon still has not submitted an algorithm.

Transparency

Another part of the problem is the public doesn’t always know when facial recognition is in use.
