Facial recognition technology struggles to see past gender binary

Wednesday, 30 October 2019 22:08 GMT

People walk past a poster simulating facial recognition software at the Security China 2018 exhibition on public safety and security in Beijing, China October 24, 2018. REUTERS/Thomas Peter

Facial recognition technology is failing to recognise transgender people, new research has shown, raising concerns about discrimination by the software

By Molly Millar

LONDON, Oct 30 (Openly) - Facial recognition technology is failing to recognise transgender people, new research has shown, raising concerns about discrimination as the use of the software becomes increasingly prevalent.

Researchers at the University of Colorado Boulder in the United States tested facial recognition systems from tech giants IBM, Amazon, Microsoft and Clarifai on photographs of trans men and found they were misidentified as women 38% of the time.

Cisgender women and men - those who identify with the gender they were assigned at birth - were correctly identified 98.3% of the time and 97.6% of the time respectively.

The software also failed to recognise people who do not define themselves as male or female - also known as nonbinary, agender or genderqueer - 100% of the time.

The results highlight that even the most up-to-date technology views gender only in two set categories, the report's lead author Morgan Klaus Scheuerman said in a statement.

"While there are many different types of people out there, these systems have an extremely limited view of what gender looks like," Scheuerman said.

Facial recognition remains highly controversial but is increasingly used by police and immigration services. The market for the technology is predicted to double in the next 15 years, according to research group MarketsandMarkets.

Software that excludes trans and nonbinary people may prove discriminatory, rendering them invisible to a technology that is becoming increasingly incorporated into daily life.

Misidentification can even be actively harmful, such as at airport security where trans people are often subject to invasive body searches or harassment if their ID does not match their gender.

The Transportation Security Administration (TSA) is currently rolling out facial recognition at airports across the United States.

A spokesman from LGBT+ group Stonewall said: "It's concerning to hear that facial recognition software is misgendering trans people. The experience of being deliberately misgendered is deeply hurtful for trans people.

"We would encourage technology developers to bring in and consult with trans communities to make sure their identity is being respected."

The study also suggested the software relies on outdated gender stereotypes in its facial analysis. Scheuerman, who is male and has long hair, was categorised as female half of the time.

"When you walk down the street you might look at someone and presume that you know what their gender is, but that is a really quaint idea from the '90s and it is not what the world is like anymore," said senior author Jed Brubaker, an assistant professor of Information Science.

"As our vision and our cultural understanding of what gender is has evolved, the algorithms driving our technological future have not. That's deeply problematic."

(Reporting by Molly Millar, editing by Chris Michaud. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers humanitarian news, women's rights, trafficking, property rights, climate change and resilience. Visit news.trust.org to see more stories.)

Openly is an initiative of the Thomson Reuters Foundation dedicated to impartial coverage of LGBT+ issues from around the world.

