Amazon’s Facial-Recognition Software Mistakes 28 Congressmen for Criminals

The ACLU used Amazon’s software to match congressional faces to mugshots from databases similar to this one on the right. Photo: United States House of Representatives; Police Handout

Are elected congressmen convicted criminals? If you ask Amazon’s facial-recognition software, the answer might be yes.

On Thursday, Rekognition, Amazon’s highly controversial facial-recognition product, misidentified 28 members of Congress as people who had been arrested for crimes. Of those 28, six were members of the Congressional Black Caucus, fueling growing concern about systemic racial bias in facial-recognition software.

The ACLU, which conducted the test, used Amazon’s Rekognition software to compare images of these officials with a publicly available database of 25,000 mugshots. (In an email to Select All, the ACLU declined to identify the database, citing privacy concerns.) The false matches cut across party lines, gender, age, and geographic region.
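For readers curious about the mechanics, a test like this maps onto two standard Rekognition API calls: indexing a set of photos into a face collection, then searching that collection with a probe image. The Python sketch below, using Amazon’s boto3 SDK, is illustrative only; the collection name, file paths, and labels are assumptions, not the ACLU’s actual code.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

COLLECTION_ID = "mugshot-collection"  # hypothetical collection name

# One-time setup: create a face collection and index each mugshot into it.
client.create_collection(CollectionId=COLLECTION_ID)
with open("mugshots/0001.jpg", "rb") as f:  # hypothetical path
    client.index_faces(
        CollectionId=COLLECTION_ID,
        Image={"Bytes": f.read()},
        ExternalImageId="mugshot-0001",  # label returned with any match
    )

# The test itself: search the collection with a probe photo.
# 80 percent is Rekognition's documented default FaceMatchThreshold;
# candidate matches scoring below it are silently dropped.
with open("probe-photo.jpg", "rb") as f:  # hypothetical path
    response = client.search_faces_by_image(
        CollectionId=COLLECTION_ID,
        Image={"Bytes": f.read()},
        FaceMatchThreshold=80,
        MaxFaces=5,
    )

for match in response["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], match["Similarity"])
```

Any face in the collection whose similarity score clears the threshold comes back as a “match,” which is why the choice of threshold matters so much when the results feed into law-enforcement decisions.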

“An identification — whether accurate or not — could cost people their freedom or even their lives,” ACLU technology and civil liberties attorney Jacob Snow wrote. “People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that.”

Amazon’s Rekognition incorrectly matched these 28 members of Congress with mugshots from criminal databases. Photo: United States Congress via ACLU

Among those misidentified was civil-rights icon and Georgia Democratic representative John Lewis, who helped the Congressional Black Caucus pen a letter to Amazon in May warning of the dangers Rekognition poses to civil rights.

“It is quite clear that communities of color are more heavily and aggressively policed than white communities,” the letter reads. “The status quo results in an oversampling of data which, once used as inputs to an analytical framework leveraging artificial intelligence, could negatively impact outcomes in those oversampled communities.”

For many companies and government agencies, facial-recognition software, big data, and predictive algorithms promised a way to increase efficiency while transcending our own inherent human biases. That sounds good in theory, but a growing body of research on data collection shows the hope to be wishful thinking. In the end, “unbiased” algorithms and software are only as colorblind as the raw data they are fed.

This year, researchers Joy Buolamwini of the MIT Media Lab and Timnit Gebru found that facial-analysis datasets are “overwhelmingly composed of lighter skinned subjects.” That imbalance leads to identification errors that fall disproportionately along racial lines. In the same study, the error rate for light-skinned men was at most 0.8 percent. That seems pretty good, until you look at the error rate for those with dark skin.

Shockingly, the error rate for dark-skinned women was as high as 34 percent.

“We need to be really careful about how we use this kind of technology,” Diane Greene, the head of Google’s cloud-computing business, said in an interview with the BBC following the ACLU post. Greene went on to warn of “inherent biases” in facial recognition. (Google is developing its own facial-recognition software as well, but unlike Amazon’s, it is not available to the public.)

Underrepresentation and bias cause dysfunction at every level, but the stakes rise dramatically when government agencies and police enter the picture. In an interview with Gizmodo, Brian Brackeen, CEO of AI startup Kairos, who is himself a person of color, warned companies against selling these technologies to institutions possessing the power to kill.

“I’m much more comfortable selling face recognition to theme parks, cruise lines, or banks,” Brackeen said. “If you have to log into your [bank] account twice because you’re African-American, that’s unfair. But, you’re not gonna get shot.”

Brackeen is not alone. Last month, 20 groups of Amazon shareholders sent a letter to Jeff Bezos expressing concern over Rekognition’s potential to violate civil and human rights.

“We are concerned the technology would be used to unfairly and disproportionately target and surveil people of color, immigrants, and civil society organizations,” the shareholders wrote.

The general public is concerned as well. As of this writing, 60,114 people have signed an ACLU petition urging Amazon to stop selling Rekognition to government agencies and police.

It appears this growing tide of opposition has already convinced some. Last month, Orlando’s police department pulled the plug on Rekognition amid mounting criticism. Making matters worse for Amazon, the new report arrives amid a renewed public conversation about other controversial uses of mass data collection, like predictive policing. Whether any of this will change whom Amazon sells its services to remains to be seen.

Amazon did not respond to Select All’s questions about Rekognition’s apparent bias, or its decision to continue offering the service to law enforcement.