OXFORD, England (Thomson Reuters Foundation) – Around the world, “almost every government is vacuuming up any data you send through the net,” warns Jim Fruchterman, chief executive of Benetech, a California non-profit group that aims to use technology to create positive social change.
Sometimes that data is used for good, for instance to track and treat disease outbreaks or to help get assistance to people displaced by disasters, he and other data experts say.
But it can also present huge risks. Uganda’s government, which last December passed a law making homosexual sex a criminal offence punishable by up to life in prison, has sought out information online identifying people as gay or lesbian, then outed them to their landlords or their families. That puts lives at risk in a country where gay rights activists have been murdered, Fruchterman said.
In cases where governments gather data on sensitive issues from sexual orientation to HIV/AIDS status, there’s “a risk of being victimised in the future, if not by the current government then a future government,” the Palo Alto technology entrepreneur said during a discussion on “big data” at the Skoll World Forum on Social Entrepreneurship in Oxford this week.
As non-governmental organisations and social enterprises gather data on the communities and people they help, they need to be keenly aware that “we should treat other people’s data the way we want our data treated,” he said.
Finding the right balance between privacy and effective intervention can be particularly tricky, data experts warned.
After the 2011 Fukushima earthquake, tsunami and nuclear power plant meltdown in Japan, an aid group working with disabled and blind people in the area went to mobile phone companies to extract data on the electronic “pings” of their clients’ phones, to try to track where they had fled or what had happened to them, and get them support and help, said Kenn Cukier, data editor for The Economist.
But the phone companies refused to provide the data, saying it would compromise the privacy of the phone users – a decision in line with the companies’ rules but probably “against the interests of the individuals,” Cukier said.
These days, “we need a new maturity in how we use data,” he said. “You can’t just say let’s keep it aggregated and never disaggregate it. It depends on the case.”
How much data is out there these days? “Google has every keystroke ever put into Google,” said Larry Brilliant, president of the Skoll Global Threats Fund. Tracking an individual’s gender, sexual orientation, even details like whether they smoke or are truanting from school, is now relatively simple for those with access to their Facebook account. Everyone, from governments and militaries to businesses, is amassing huge amounts of data on individuals.
That is leading to calls for greater privacy, and limits on data collection and use. But getting the balance right is difficult, the experts said.
Collecting and analysing electronic patient records, for instance, could improve healthcare and lower costs – but inherently violates privacy, Cukier noted.
Increasingly, in the balance between privacy and effective use of data, “you have to sacrifice one to get the other. There’s going to be a winner and a loser,” he said.
To deal with that, people should be given the choice of opting out of data sets, he said. But too many people opting out reduces the accuracy and value of the data, and “there’s no obvious way to resolve the two.”
Still, opt-in data can work. A “Flu Near You” project in the United States, Australia and the Netherlands asks volunteer users to report their health status every week, and by aggregating the information health experts are beginning to see patterns in the spread of flu that could ultimately help predict pandemics, Cukier said.
“The key is (to be) totally above board, open source, opt-in and voluntary,” he said. “Maybe that’s a good lesson we can take from all the (surveillance whistleblower Edward) Snowden stuff.”