The limitations of facial recognition technology have been widely reported, and that’s what makes it all the more confusing that Amazon would sell its tech to police.
Facial recognition technology is known to have difficulty recognizing darker skin tones and women. That’s one thing when it’s being used to log in to your phone, but Amazon is being taken to task in an open letter by artificial intelligence researchers for selling something so flawed to law enforcement.
Open Letter to Amazon
It makes this story that much more delicious to know that some of the AI researchers authoring the open letter are from Amazon competitors: Google, Microsoft, and Facebook. The letter, asking Amazon to stop selling its flawed facial-recognition technology, was published on Medium.
As a backdrop to this story, facial recognition has drawn criticism as it became a more popular sign-in tool because it appears to exhibit a bias. The bias isn’t intentional, but it exists nonetheless.
In Microsoft’s case, the reason given for the disparity in recognizing faces is the data the tech was built around. When creating the tech, Microsoft fed it pictures to teach it to recognize faces, but the pictures used were mostly of white males. As a result, the system isn’t very adept at recognizing darker skin or women.
The letter reports that Amazon’s Rekognition tool has “much higher error rates while classifying the gender of darker-skinned women than lighter-skinned men (31% vs. 0%).”
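To make those numbers concrete, here’s a small illustrative sketch of what a per-subgroup error rate like “31% vs. 0%” means: for each group of test photos, it’s simply the fraction of the classifier’s gender predictions that disagree with the ground truth. The data below is made up to match the letter’s figures; it is not real Rekognition output.

```python
# Illustrative sketch: computing per-subgroup error rates for a
# gender classifier. The predictions here are fabricated to mirror
# the 31% vs. 0% figures quoted in the open letter.

def error_rate(predictions, ground_truth):
    """Fraction of predictions that disagree with the ground truth."""
    wrong = sum(p != t for p, t in zip(predictions, ground_truth))
    return wrong / len(ground_truth)

# Hypothetical results for two subgroups of 100 test photos each.
darker_skinned_women = {
    "truth": ["F"] * 100,
    "pred":  ["F"] * 69 + ["M"] * 31,  # 31 of 100 misclassified
}
lighter_skinned_men = {
    "truth": ["M"] * 100,
    "pred":  ["M"] * 100,              # 0 of 100 misclassified
}

for name, group in [("darker-skinned women", darker_skinned_women),
                    ("lighter-skinned men", lighter_skinned_men)]:
    rate = error_rate(group["pred"], group["truth"])
    print(f"{name}: {rate:.0%} error rate")
```

The point of measuring error rates per subgroup, rather than one overall number, is that a system can look accurate on average while failing badly for a specific group.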
The researchers believe that Amazon officials Matthew Wood and Michael Punke “misrepresented the technical details for the work and the state-of-the-art in facial analysis and face recognition.”
They go on to state that “bias found in one system is cause for concern in other, particularly in use cases that could severely impact people’s lives, such as law enforcement applications.” They add that there aren’t any “required standards to ensure that Rekognition is used in a manner that does not infringe on civil liberties.”
Citing a study that found racial and gender disparity in facial recognition technology, the researchers then went on to point out that Wood and Punke tried to refute the research, calling it “misleading” and stating it drew “false conclusions.”
They also called into question Wood’s insistence that there is a difference between “face recognition” and “facial analysis.” The researchers found that the two are actually closely related and that it’s common to exchange one for the other.
They feel it’s important to test Rekognition in “the real world, in ways that it is likely to be used,” including “‘black box’ scenarios, where users do not interact with inner details of the system such as the model, training data, or settings.”
With this technology being used by the police, the researchers quote attorneys who explained “the wrong people may be put on trial due to cases of mistaken identity.” They believe the police “do not know the parameters of these tools, nor how to interpret some of their results.”
What Recourse Does the Public Have?
“We call on Amazon to stop selling Rekognition to law enforcement, as legislation and safeguards to prevent misuse are not in place,” the researchers request.
Certainly, if facial recognition technology is flawed, it shouldn’t be used to make life-altering decisions. But what recourse do we have? With the technology already in police hands, what can we do about it?
Should the fingers point back at Amazon? Or should they point to the police? What do you think should be done? Tell us how you feel about Amazon’s flawed facial recognition technology in the comments below.