NIST Reports More False Positives for Minority Face-Scanning

Most facial recognition technology algorithms show evidence of “demographic differentials,” or racial bias, the National Institute of Standards and Technology reported Thursday. For one-to-one matching, a study found higher rates of false positives for Asian and African American faces “relative to images of Caucasians,” NIST said.

“While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied,” said NIST computer scientist Patrick Grother. The study evaluated 189 software algorithms from 99 developers, “a majority of the industry.”