Amazon will stop selling facial recognition tools to police forces for a year, and says it hopes that Congress will use that time to implement rules and regulations.
The move comes after IBM pulled out of selling general purpose facial recognition entirely, amid widespread Black Lives Matter protests.
It is not clear what will happen to existing contracts, and, since the company specifically says it will not sell to the "police," it may continue selling the technology to federal agencies.
Your move, Microsoft
"We’re implementing a one-year moratorium on police use of Amazon’s facial recognition technology," Amazon said in a statement.
"We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge. We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested."
A reason for the policy change was not given, but it is likely a response to the ongoing protests following the death of George Floyd.
In 2018, the ACLU published a report showing that Amazon’s Rekognition facial recognition software was racially biased. The group used the AWS tool to match photos of 28 members of Congress with publicly available mug shots. "Nearly 40 percent of Rekognition’s false matches in our test were of people of color, even though they make up only 20 percent of Congress," ACLU attorney Jacob Snow said.
"People of color are already disproportionately harmed by police practices, and it's easy to see how Rekognition could exacerbate that."
Campaigners faced a concerted effort by one of the world's largest companies to discredit their findings. An MIT Media Lab report on the biased performance results of commercial AI products was deemed "misleading" in a lengthy Amazon rebuttal but, at the same time, the company was secretly working to fix the issues it claimed did not exist.
This, one of the MIT paper's authors said, included "hiring an AI fairness lead, developing & releasing a new API version to do better on PPB, revamped marketing materials, funding an NSF grant [looking at AI fairness], hiring lobbyists against facial rec regulation."
Researcher Deborah Raji added: "At the same time Amazon tried to discredit us & our results."
Amazon shareholders previously voted down an employee-led proposal to limit the sales of Rekognition.
The service has been sold to numerous law enforcement agencies, and is part of a concerted effort by the company to deepen its ties with these lucrative customers.
Amazon Ring, nominally a smart doorbell product, is actually used as part of the state surveillance apparatus. The company partners with more than 1,300 local police forces, which can ask for Ring video footage from users during investigations. "Ring facilitates these requests and user consent is required in order for any footage or information to be shared with local police," a spokesperson told DCD.
Ring also comes with an accompanying app, Neighbors, which residents have used to report "suspicious" behavior that is often based on nothing more than someone's skin color.
"We're glad Amazon has admitted that the unregulated use of face recognition can do harm to vulnerable communities," the Electronic Frontier Foundation said in a blog post. "Now it's time for it to admit the dangers of Ring-police partnerships, and stand behind its statement on police brutality."