Amazon Takes Issue With Report Saying Its Facial Technology Could Be Biased

Amazon disputed the results of a Friday MIT Media Lab report that found its Rekognition technology had much more difficulty determining the gender of female faces and darker-skinned faces in photos than similar services from IBM and Microsoft.

Rekognition misclassified women as men 19 percent of the time and mistook darker-skinned women for men 31 percent of the time; Microsoft's technology mistook darker-skinned women for men 1.5 percent of the time. The published results were based on facial analysis “and not facial recognition,” an Amazon spokesperson emailed, quoting Matt Wood, a member of Amazon Web Services’ machine learning team. “Analysis can spot faces in videos or images and assign generic attributes such as wearing glasses; recognition is a different technique by which an individual face is matched to faces in videos and images.” It's impossible to draw a conclusion on the accuracy of facial recognition for any use based on results obtained using facial analysis, said Wood, noting the study didn’t use the latest version of Rekognition and its results didn’t represent how a customer would use the service today. Using an updated version with similar data, “we found exactly zero false positive matches with the recommended 99% confidence threshold,” he said. Amazon continues to improve the technology.
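
The distinction Wood draws corresponds to two separate operations in the Rekognition API. Below is a minimal sketch in Python using boto3, assuming AWS credentials are already configured; the image file and collection ID are hypothetical, and the collection would need to be created and indexed beforehand.

    # A sketch of the analysis-vs.-recognition distinction Wood describes,
    # using the AWS Rekognition API via boto3. The image path and collection
    # ID are hypothetical.
    import boto3

    client = boto3.client("rekognition")

    with open("face.jpg", "rb") as f:  # hypothetical input image
        image_bytes = f.read()

    # Facial ANALYSIS: detect faces and assign generic attributes
    # (gender, eyeglasses, etc.) -- the kind of operation the study measured.
    analysis = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )
    for face in analysis["FaceDetails"]:
        print(face["Gender"]["Value"], face["Gender"]["Confidence"])
        print(face["Eyeglasses"]["Value"])

    # Facial RECOGNITION: match an individual face against a collection of
    # previously indexed faces, gated at the 99% threshold Wood cites.
    matches = client.search_faces_by_image(
        CollectionId="my-face-collection",  # hypothetical, pre-populated
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=99,
    )
    for match in matches["FaceMatches"]:
        print(match["Face"]["FaceId"], match["Similarity"])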