MIT study backs AOC on algorithm bias


The findings are the latest to suggest that artificial intelligence and machine learning are subject to the same biases that humans are.
January 28, 2019

Days after U.S. Rep. Alexandria Ocasio-Cortez elicited a minor uproar over her claim that human bias is baked into facial recognition algorithms, a new study from MIT Media Lab found that Amazon’s facial recognition technology was prone to errors in determining the gender of female faces and darker-skinned faces. Ocasio-Cortez’s comments and the release of the study suggest that there’s a lot more to be learned about facial recognition technology before its use becomes widespread.

These findings are only the latest to suggest that artificial intelligence and machine learning are subject to the same biases that humans are. The MIT study found deficiencies in Amazon's Rekognition technology, but similar gender and race biases have been found in studies of programs built by Microsoft and IBM.

Ocasio-Cortez criticized the technology not just because bias exists in it, but because of the implications of its likely adoption by law enforcement. The New York Police Department has collaborated with IBM on video analytics software, while Amazon has pitched its Rekognition program to agencies including Immigration and Customs Enforcement. The U.S. Department of Commerce's National Institute of Standards and Technology has been tasked with vetting this technology for accuracy, but because participation is optional, developers including Google and Amazon have not had their systems vetted. Facial recognition technology packs a big promise, but its developers will likely face continued calls for regulation and evaluation as the tech creeps into law enforcement.


Annie McDonough is a tech and policy reporter at City & State.