
Trans And Non-Binary People Are Not Identified In Facial Recognition AI

Facial recognition use has been rising as law enforcement, immigration services, banks and other institutions rely on it more and more. 


But “a recent study by computer-science researchers at the University of Colorado Boulder found that major AI-based facial analysis tools—including Amazon’s Rekognition, IBM’s Watson, Microsoft’s Azure, and Clarifai—habitually misidentified non-cisgender people,” Quartz reports.




The article continues, “The researchers gathered 2,450 images of faces from Instagram, searching under the hashtags #woman, #man, #transwoman, #transman, #agenderqueer, and #nonbinary.


“The images were then divided by hashtag, amounting to 350 images in each group. Scientists then tested each group against the facial analysis tools of the four companies. 


“The systems were most accurate with cisgender men and women, who on average were accurately classified 98% of the time. Researchers found that trans men were wrongly categorized roughly 30% of the time. The tools fared far worse with non-binary or genderqueer people, inaccurately classifying them in all instances.”





