NEW YORK – Amazon has emerged as a frontrunner in the field of facial recognition. However, experts argue that its technology shows gender and racial biases.
The technology is used by police departments and by Immigration and Customs Enforcement in the US. According to a study published by the MIT Media Lab on January 24, Amazon's software makes more mistakes than competing systems when identifying a person's gender, and it performs worst on women with darker skin.
The study reported that Amazon's Rekognition software misidentified women as men 19 percent of the time.
For women with darker skin, it returned the wrong gender 31 percent of the time.
By comparison, Microsoft's software misidentified darker-skinned women as men 1.5 percent of the time.
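For readers unfamiliar with how such an audit is run: gender classification is one of the face attributes Rekognition exposes through its DetectFaces API, and a benchmark like Buolamwini's compares those labels against ground-truth labels across many images. Below is a minimal sketch of a single query using the boto3 SDK; the image file name and the helper function are illustrative assumptions, not details from the study.

```python
# Hypothetical sketch of one audit query against Amazon Rekognition's
# DetectFaces API. Assumes AWS credentials are already configured locally.
# "audit_image.jpg" and predicted_gender() are illustrative, not from the study.
import boto3


def predicted_gender(image_path: str) -> tuple[str, float]:
    """Return Rekognition's gender label and confidence for the first detected face."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # "ALL" is needed to get the Gender attribute
        )
    face = response["FaceDetails"][0]  # assume exactly one face per audit image
    return face["Gender"]["Value"], face["Gender"]["Confidence"]


if __name__ == "__main__":
    label, confidence = predicted_gender("audit_image.jpg")
    print(f"Predicted gender: {label} ({confidence:.1f}% confidence)")
```

An audit of this kind simply repeats such a call over a labelled test set and tallies how often the returned label disagrees with the ground truth for each demographic group.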
The tests were conducted by MIT’s Joy Buolamwini, who found similar racial and gender biases in an earlier study published in February last year. Amazon disputed the findings.
It noted that the researchers had not tested the latest version of Rekognition.