By Layli Foroudi
TUNIS, June 13 (Thomson Reuters Foundation) - Be it illness or an affair, your secret may no longer be safe as artificial intelligence gives government and business the power to crawl over personal data, France's digital envoy said on Thursday.
Information gleaned from everyday communications could then be used to exclude people from jobs, deny them insurance or curtail a myriad of freedoms, said Henri Verdier, France's top digital specialist, calling for stronger privacy rules.
He said the predictive powers of AI could mean deep trouble if information - a hastily written tweet or even a holiday booking - was used to divine an individual's future profile.
"We might not employ or insure someone because they risk depression, or a country might use predictions about your sexual orientation against you," Verdier said.
"AI is going to change the rules of the game because I can take a lot of data, educate a machine and then use it to know many things about you," he told the Thomson Reuters Foundation on the sidelines of Rights Con, a digital conference in Tunis.
The European Union's biggest shake-up of data privacy laws in more than two decades came into force a year ago, giving people more control over their online information and authorities the power to impose hefty fines.
But the laws do not go far enough, Verdier said, urging the EU to curb companies' and governments' predictive powers.
And it is already too late to simply switch off social media and hope your private life will stay private.
Research published in January found that social media platforms such as Twitter can be used to glean information about the preferences of former users by monitoring as few as eight of their one-time contacts.
Verdier cited studies showing that bank data can predict a divorce two years before it happens. A 2017 study found that advertisers can deduce the sexuality of a Facebook user from as few as three likes.
"We live in data, we share it everywhere. I think we need to think about what are the things we shouldn't be allowed to predict about people" he said.
"There is historic precedent. In credit scoring, there are certain pieces of information you are not allowed to use. For example, in France, it is forbidden to use information about a person's religion to predict their credit score."
Citizens are generally unaware of the importance of guarding their data, Verdier said, which he attributed to the complex and opaque nature of privacy rules.
"There is a famous French play called 'Cyrano de Bergerac' ... there are more words in the terms and conditions of Amazon than there are in that play," he explained.
(Reporting by Layli Foroudi, Editing Lyndsay Griffiths and Zoe Tabary. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers humanitarian news, women's and LGBT+ rights, human trafficking, property rights, and climate change. Visit http://news.trust.org)