
Apple Card and six other tech tools accused of gender bias

by Umberto Bacchi | @UmbertoBacchi | Thomson Reuters Foundation
Monday, 11 November 2019 18:10 GMT

A woman holds an iPhone 11 Pro Max while giving a live broadcast after it went on sale at the Apple Store in Beijing, China, September 20, 2019. REUTERS/Jason Lee


Apple became the latest tech giant to face criticism last week when customers of its new credit card service said it appeared to give men higher credit limits than women

By Umberto Bacchi

LONDON, Nov 11 (Thomson Reuters Foundation) - From recruitment software that favours male applicants to facial recognition technology that fails to recognise transgender people, a growing number of artificial intelligence (AI) programmes have been accused of reproducing human gender bias.

Apple became the latest tech giant to face criticism last week when customers of its new credit card service, including company co-founder Steve Wozniak, said it appeared to give men higher credit limits than women.

Here are six other tech tools that have been accused of gender discrimination:

* FACEBOOK ADS A U.S. study this year found that Facebook's algorithms, which match housing and job ads with viewers, leant on stereotypes. Ads for jobs in the lumber industry went mostly to white men, while secretary positions were mostly directed at black women, according to the study.

* AMAZON'S RECRUITING TOOL Amazon scrapped an experimental automated recruiting engine that used AI to give job candidates scores ranging from one to five stars after finding it did not like women. Amazon's computer models were trained to vet applicants by observing patterns in resumes submitted to the company. But as most came from men, reflecting male dominance across the tech industry, the system had taught itself that male candidates were preferable - a feedback loop illustrated in the sketch after this list.

* DIGITAL ASSISTANTS A United Nations report this year said popular digital assistants styled as female helpers, such as Apple's Siri, Amazon's Alexa and Microsoft's Cortana, reinforced sexist stereotypes and normalised sexist harassment. Most voice assistants were programmed to be submissive and servile - including politely responding to insults.

* FACIAL RECOGNITION Facial recognition technology struggles to recognise transgender people and those who do not define themselves as male or female, according to an October study by the University of Colorado Boulder in the United States. Researchers tested facial recognition systems from IBM, Amazon, Microsoft and Clarifai on photographs of trans men and found they were misidentified as women 38% of the time.

* GOOGLE IMAGES A 2015 University of Washington study found women were slightly underrepresented in Google Images search results for most jobs, and markedly so for some, including CEO. The researchers said the skew could negatively affect people's perceptions, reinforcing bias and preconceptions.

* JOB COACHING ADS Another 2015 study, by Carnegie Mellon University in the United States, found that Google's ad-targeting system was more likely to show adverts for job coaching services for highly paid positions to men than to women.
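
For readers curious how a system like Amazon's could absorb such a pattern, below is a minimal, hypothetical Python sketch - not Amazon's actual code - of a model trained on historically skewed hiring decisions. The feature names and numbers are invented for illustration; the point is that the model learns a negative weight on a proxy feature (here, a flag for a women's organisation mentioned on a resume) that says nothing about ability.

    # Hypothetical illustration - not Amazon's system. A classifier trained on
    # historically biased hiring labels learns to penalise a gendered proxy.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000

    # Synthetic resumes: years of experience, plus a proxy flag such as
    # membership of a women's organisation mentioned on the resume.
    experience = rng.normal(5, 2, n)
    womens_org_flag = rng.integers(0, 2, n)

    # Historical hiring labels bake in the bias: at equal experience,
    # resumes carrying the proxy flag were hired less often.
    logit = 0.8 * experience - 1.5 * womens_org_flag - 3.0
    hired = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([experience, womens_org_flag])
    model = LogisticRegression().fit(X, hired)

    # The trained model reproduces the historical penalty: a clearly
    # negative weight on a feature that says nothing about ability.
    print(f"weight on experience:       {model.coef_[0][0]:+.2f}")
    print(f"weight on women's-org flag: {model.coef_[0][1]:+.2f}")

Dropping the explicit flag does not necessarily fix the problem: correlated words elsewhere in a resume can act as the same kind of proxy, and Reuters reported that Amazon edited its programs to be neutral to gendered terms but could not guarantee the models would not find other discriminatory ways of sorting candidates.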

(Sources: Reuters, Thomson Reuters Foundation, University of Washington, Carnegie Mellon University)

(Reporting by Umberto Bacchi @UmbertoBacchi, Editing by Claire Cozens. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, which covers humanitarian news, women's and LGBT+ rights, human trafficking, property rights, and climate change. Visit http://news.trust.org)

