Many, if not all, commercial facial recognition algorithms appear to share common roots that make them susceptible to demographic performance differentials, or bias, and that shared origin also suggests a common fix, John Howard told an audience during the International Face Performance Conference (IFPC) 2020. The conference is hosted by the National Institute of Standards and Technology (NIST) and DHS’ Science and Technology Directorate (S&T), along with the European Association of Biometrics (EAB) and the UK’s National Physical Laboratory.
Howard presented the research in one of eleven sessions held on day one of IFPC 2020, with Yevgeniy Sirotin also participating in the conversation. Both researchers are associated with DHS’ Maryland Test Facility (MdTF).
Howard briefly explained the testing method and the mathematics that reveal demographic clustering, and the presentation also touched on the implications of the research and the high-level conclusions that can be drawn.
By selectively removing the principal components that showed high degrees of demographic clustering and then reconstructing the data, the researchers reduced, though not dramatically, the separation between the score distributions for volunteer pairs of the same gender and race, pairs of different gender and race, and mated pairs.
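The general technique described here can be sketched in a few lines of NumPy. Everything below is a hypothetical illustration on synthetic data, not the MdTF dataset or the researchers' actual code: the clustering score, the threshold, and the toy "templates" are all assumptions, chosen only to show the shape of the approach (project onto principal components, drop the components that separate demographic groups, reconstruct).

```python
import numpy as np

# Toy "face templates": 200 samples x 16 dimensions, with two
# synthetic demographic groups. Purely illustrative data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
groups = rng.integers(0, 2, size=200)
# Inject demographic structure into the first two dimensions.
X[:, :2] += groups[:, None] * 2.0

# PCA via SVD on the mean-centered data.
mu = X.mean(axis=0)
Xc = X - mu
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T  # projections of each sample onto each component

# Score each component for demographic clustering: here, the gap
# between group means relative to the pooled spread. This metric and
# the threshold below are stand-ins, not the researchers' method.
g0, g1 = scores[groups == 0], scores[groups == 1]
clustering = np.abs(g0.mean(axis=0) - g1.mean(axis=0)) / scores.std(axis=0)

# Keep only weakly clustered components, then reconstruct the data.
keep = clustering < 1.0  # arbitrary cutoff for illustration
X_rec = scores[:, keep] @ Vt[keep] + mu

# The gap between group means should shrink after reconstruction.
gap_before = np.linalg.norm(X[groups == 0].mean(0) - X[groups == 1].mean(0))
gap_after = np.linalg.norm(X_rec[groups == 0].mean(0) - X_rec[groups == 1].mean(0))
```

As the article notes, removing components narrows the demographic separation only partially: information correlated with group membership can be spread across many components, so zeroing the most clustered ones reduces, rather than eliminates, the gap.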
Communicating the undesirability of “broad homogeneity” in facial recognition algorithms, …