A report by human rights group Article 19 found dozens of Chinese firms marketing technology to measure human emotions
By Avi Asher-Schapiro
Jan 27 (Thomson Reuters Foundation) - Technology that measures emotions based on biometric indicators such as facial movements, tone of voice or body movements is increasingly being marketed in China, researchers say, despite concerns about its accuracy and wider human rights implications.
Drawing upon artificial intelligence, the tools range from cameras to help police monitor a suspect's face during an interrogation to eye-tracking devices in schools that identify students who are not paying attention.
A report released this week from UK-based human rights group Article 19 identified dozens of companies offering such tools in the education, public security and transportation sectors in China.
"We believe that their design, development, deployment, sale and transfers should be banned due to the racist foundations and fundamental incompatibility with human rights," said Vidushi Marda, a senior programme officer at Article 19.
Human emotions cannot be reliably measured and quantified by technology tools, said Shazeda Ahmed, a PhD candidate studying cybersecurity at the University of California, Berkeley and the report's co-author.
Such systems can perpetuate bias, especially those sold to police that purport to identify criminality based on biometric indicators, she added.
The systems also raise concerns about an emerging trend of collecting emotional data to monitor students, suspected criminals and even drivers of cars equipped with recognition tech to detect fatigue and unsafe movements.
"A lot of these systems don't talk about how they would act on the data and use it long term," Ahmed told the Thomson Reuters Foundation in a phone interview.
"We are quite concerned about function creep," she added, referring to the use of data for purposes other than those for which it was collected.
Even seemingly benign applications of emotional recognition tech could lead to harm, Marda noted. "There's a slippery slope with this kind of surveillance," she said.
"Let's say a school wants to introduce a camera system to see if students are eating nutritious meals at school - it then may be quickly transformed into an emotion recognition system, and then who knows what next?"
Many of the companies identified in the report are smaller Chinese startups that specialise in a particular kind of emotion recognition tool.
But the report also pointed out some major international firms involved in the market.
Lenovo, the world's biggest personal computer maker, for example, markets "smart education solutions" that include "speech, gesture and facial emotion recognition", noted the report.
The firm has sold education technology to more than a dozen Chinese provinces, but Article 19 researchers say it is unclear how many have deployed the systems.
Lenovo did not immediately respond to a request for comment.
Article 19 is worried that the kind of technology being marketed in China could increasingly be hooked into surveillance systems all over the world.
"When you have CCTV all around a city it doesn't cost that much to add a new emotion recognition service," explained Marda. "This is not just a China problem."
At this stage, the global market for emotion recognition technology is relatively small, the report said.
But the researchers cautioned that it is developing quickly, and without much scrutiny.
"We documented around 30 companies selling this technology," Ahmed said. "That very well could be just the tip of the iceberg."
(Writing by Avi Asher-Schapiro @AASchapiro, Editing by Jumana Farouky and Zoe Tabary. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers the lives of people around the world who struggle to live freely or fairly. Visit http://news.trust.org)