Draft EU rules include curbs on AI technology like facial recognition, but not on systems that detect gender, sexuality, race or disability
By Avi Asher-Schapiro
April 16 (Thomson Reuters Foundation) - A cross-party group of European lawmakers called on Friday for an EU ban on artificial intelligence (AI) systems that detect and label people according to gender or sexuality, saying the technology was ripe for abuse and could fuel discrimination.
Draft rules set to be announced by the European Commission next week include rights safeguards and curbs on AI technology such as facial recognition tools, but do not ban systems that detect gender, sexuality, race or disability, the lawmakers said.
"To reduce people to appearances is discrimination – whether it is done by humans or machines," Alexandra Geese, who drafted an open letter signed by 33 fellow MEPs, told the Thomson Reuters Foundation in emailed comments.
"The potential harm of these technical applications outweighs their benefits so blatantly – if they even offer any benefits – that Europe should unequivocally turn its back on them by law."
Another MEP who signed the letter, Karen Melchior, said there was "no need for technologies to be deciding who is male and who is female, who is gay and who is not".
The lawmakers' letter came after a coalition of rights groups, including All Out, a global LGBT+ rights organisation, and the digital rights group Access Now, gathered 30,000 signatures calling for Brussels to implement such a ban.
After the Commission - the EU's executive arm - formally presents its regulatory proposals on April 21, they will need to be thrashed out with EU countries and MEPs before becoming law.
'HORRIBLE USES'
Artificial intelligence systems that identify and process images of people's faces often include a process that divides people into two genders - male and female, said Os Keyes, a PhD student studying such systems at the University of Washington.
Researchers have begun building tools to identify sexuality from photographs, including a 2017 research project at Stanford University that said its AI system correctly identified gay men 83% of the time in an analysis of some 35,000 facial images.
Large-scale use of such systems has yet to take place in Europe, but a ban by the 27-member bloc would set a red line for the technology's future in Europe and beyond, said Daniel Leufer, Access Now's European policy analyst.
"You can only think of horrible uses for this technology, especially for trans and non-binary people, such as regulating bathroom access," Leufer said.
Non-binary people do not identify as either male or female.
Advocates also worry about the technology's application in places such as border control posts, where face scans can be used to identify travellers, and in advertising.
Last year, Access Now protested against an advertising system in the Brazilian city of Sao Paulo that used facial recognition technology to show different adverts to people walking past based on their perceived gender.
Keyes said gender-detecting AI had been used on a small scale in Europe - citing a Berlin programme that used face scans to offer women discounted tickets on "Equal Pay Day".
AI systems that automatically categorise the population into two genders reinforce gender stereotypes, Keyes said.
"You can't have a gender recognition system that doesn't end up contributing to some sort of bias," Keyes said.
Such technology also fails to work for people who are transgender or non-binary, the researcher added: "If we start adopting this tech, millions of people will be excluded."
(Reporting by Avi Asher-Schapiro in Oakland, California, @AASchapiro; Additional Reporting by Rachel Savage in London; Editing by Helen Popper and Hugo Greenhalgh. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers the lives of people around the world who struggle to live freely or fairly. Visit http://news.trust.org)