Tech startups say their AI systems can interrupt biases in hiring and promotion, but experts warn that algorithms are no silver bullet
By Sonia Elks
LONDON, Nov 17 (Thomson Reuters Foundation) - Is it because she is a mother? Or perhaps she is perceived as lacking ambition, or leadership qualities?
Gender stereotypes continue to hold women back at work, but a handful of tech firms say they have developed artificial intelligence (AI) systems that can help break biases in hiring and promotion to give female candidates a fairer chance.
Employers and the wider economy could stand to gain, too.
"We are at this moment in artificial intelligence, that we either have the ability to hardwire our biases into the future or ... to hardwire equity," said Katica Roy, chief executive of Colorado-based software firm Pipeline Equity.
"A lot of the time that we talk about equity, we talk about it as a social issue or the right thing to do, which it is, but it's actually a massive economic opportunity."
Organisations are increasingly turning to AI to help make hiring decisions, prompting concern among digital rights experts who warn that algorithms can perpetuate biases.
An AI hiring tool developed by Amazon had to be scrapped after it taught itself that male candidates were preferable to women.
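The mechanism behind that failure can be sketched in a few lines: a model trained on past hiring decisions simply learns whatever skew those decisions contain. The toy scorer below uses synthetic, deliberately skewed data (an illustrative assumption, not Amazon's actual system) to show how a word like "women's" can acquire a negative weight purely from the training set.

```python
from collections import Counter

# Illustrative only: synthetic past decisions, skewed against
# CVs containing the word "women's".
hired = ["captained chess team", "led engineering society"]
rejected = ["captained women's chess team", "led women's engineering society"]

def word_weights(hired, rejected):
    """Weight each word by how much more often it appears among past hires."""
    pos, neg = Counter(), Counter()
    for text in hired:
        pos.update(text.split())
    for text in rejected:
        neg.update(text.split())
    return {w: pos[w] - neg[w] for w in set(pos) | set(neg)}

def score(cv_text, weights):
    return sum(weights.get(w, 0) for w in cv_text.split())

weights = word_weights(hired, rejected)
# "women's" gets a negative weight from the skew alone, so two
# otherwise-identical CVs are scored differently.
print(score("captained chess team", weights))          # higher
print(score("captained women's chess team", weights))  # lower
```

Nothing in the code mentions gender as a feature; the bias arrives entirely through the historical labels, which is why auditing training data matters as much as auditing the model.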
But women's rights groups and digital experts said well-designed tech that targets bias can "shine a light" on the hidden factors holding women back.
"Bias is as old as human nature, and traditional hiring practices have been shot through with a number of different biases," said Monideepa Tarafdar, a professor in the Isenberg School of Management at the University of Massachusetts Amherst.
"I think AI can be part of the solution. Definitely. But I do not think it can be the only solution."
These equality-focused technology firms use AI to review or replace human judgment in tasks such as CV screening and pay-rise decisions, and to offer personalised, data-based advice.
Software developed by Pipeline Equity, a startup founded in 2017, has a number of human resource uses - from checking for biased language in performance reviews to offering advice on hiring and promotions.
Textio also uses AI to analyse companies' corporate statements and job postings to identify whether they are adopting a masculine tone that will alienate women or members of minority groups, and to suggest more inclusive alternatives.
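A screening pass like Textio's can be sketched as a lookup over coded language. The word list and suggested replacements below are illustrative assumptions, not the vendor's actual lexicon, which is far larger and statistically derived.

```python
# Hypothetical lexicon of masculine-coded words with more
# inclusive alternatives (illustrative, not Textio's data).
MASCULINE_CODED = {
    "ninja": "expert",
    "rockstar": "high performer",
    "dominant": "leading",
    "aggressive": "proactive",
    "competitive": "driven",
}

def review_posting(text):
    """Return (flagged_word, suggested_alternative) pairs found in text."""
    findings = []
    for word in text.lower().replace(",", " ").split():
        if word in MASCULINE_CODED:
            findings.append((word, MASCULINE_CODED[word]))
    return findings

posting = "We need an aggressive, competitive sales ninja"
for word, suggestion in review_posting(posting):
    print(f"consider replacing '{word}' with '{suggestion}'")
```

Real tools score whole documents for overall tone rather than flagging isolated words, but the principle is the same: surface the language, then let a human decide.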
Pymetrics, another leading firm in the space, offers gamified assessments that it says evaluate potential hires more fairly than reading CVs.
"We have heaps and binders full of this business case, and it has shifted some mindsets," said Henriette Kolb, head of the Gender and Economic Inclusion Group at the World Bank's private-sector arm, the International Finance Corporation.
But much more needs to be done to improve women's financial inclusion worldwide, from increasing corporate representation to widening their access to banking, she told the Trust Conference, the Thomson Reuters Foundation's annual flagship event.
COVID-19 has spurred a "shecession" that has seen a disproportionate number of women pushed out of the labour force. The International Labour Organization found gender gaps have widened and women's employment is set to recover more slowly.
Meanwhile, companies are struggling to fill open positions with record numbers quitting in the United States in what has been dubbed "the great resignation".
"Businesses have so many roles that they're unable to fill, I mean, empty seats can't do your work for you," said Kieran Snyder, chief executive of Textio.
"You need to hire great people if you're going to have any kind of success."
HELPING OR SPYING?
But AI will not be a silver bullet in creating fairer workplaces, women's rights advocates and researchers said, warning that the technology could raise as many problems as it solves.
The idea that technology offers some kind of unbiased factual truth or objectivity is an illusion, said Manish Raghavan, a postdoctoral fellow at the Harvard Center for Research on Computation and Society.
"All AI has to learn from data in some way; it has to learn from past decisions," he said.
"That's not to say it's impossible to use technology to mitigate your own implicit biases, I think it just has to be very, very carefully designed. And I honestly just don't think we're at that point yet where we're able to do that."
A lack of transparency about how most commercial algorithms work makes it hard to scrutinise their performance, he added.
Tarafdar, who is leading a research project to analyse how AI can lead to unintentional workplace bias, said effective solutions cannot just pinpoint key hiring decisions but must also look at the wider workplace culture.
Bosses should also carefully consider how much data they can gather on workers before their actions slip from helping towards surveillance, she added.
The real key to change is opening difficult, honest conversations about bias that can challenge misconceptions, said Allyson Zimmermann, a director of women's workplace rights organisation Catalyst.
But AI tech can help to upend those preconceptions and open opportunities, she added, citing the case of a young woman who got an interview after being selected using technology that "blinded" recruiters as to her gender and age.
"When she showed up for the interview, they just burst out laughing. And it wasn't, you know, a rude kind of laughing. They were so shocked that she was this young woman," she said.
"It really opened their eyes; they thought they would have a middle-aged man coming in ... She went into the interview, she got the job. She told me it was an extremely positive experience."
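The "blinding" in that anecdote amounts to stripping identifying fields from a candidate record before reviewers see it. The field names below are assumptions for illustration; real systems also have to catch indirect signals such as graduation dates or pronouns in free text.

```python
# Minimal sketch of blind screening: remove fields that could
# reveal gender or age before a record reaches reviewers.
REDACTED_FIELDS = {"name", "gender", "date_of_birth", "photo_url"}

def blind(candidate: dict) -> dict:
    """Return a copy of the record with identifying fields removed."""
    return {k: v for k, v in candidate.items() if k not in REDACTED_FIELDS}

candidate = {
    "name": "Jane Doe",
    "gender": "female",
    "date_of_birth": "1996-04-02",
    "skills": ["python", "sql"],
    "experience_years": 4,
}
print(blind(candidate))  # only skills and experience survive
```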
This story was updated on November 17 to correct the year Pipeline Equity was founded and to add comment from the Trust Conference.
(Reporting by Sonia Elks @soniaelks; Editing by Helen Popper and Katy Migiro. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers the lives of people around the world who struggle to live freely or fairly. Visit http://news.trust.org)