
Project targets common humanitarian data collection

Friday, 31 January 2014 11:36 GMT

* Any views expressed in this article are those of the author and not of Thomson Reuters Foundation.

Creating a common standard for humanitarian data collection and a shared database or catalogue will be the dual focus of a UN-led project, the Humanitarian eXchange Language (HXL) initiative, whose working group met for the first time last week (20 January).
 
The year-long project, which is run by the UN Office for the Coordination of Humanitarian Affairs, received funding from the Humanitarian Innovation Fund in November.
 
By producing guidelines on data type and format, as well as collection methods, the initiative will allow the direct comparison of information from different humanitarian organisations, says David Megginson, head of HXL’s standards efforts.
 
While standards on how to deliver humanitarian aid exist, the field lacks a coherent system for data management, both in terms of its collection and storage, he tells SciDev.Net.
 
“The problem is there is huge, duplicated effort in data collection and, a few years after the crisis, the information has disappeared,” he says.
 
HXL’s working group, which brings together representatives of NGOs, UN bodies and national development agencies, decided to focus on “high-value indicators” that can be compared across organisations and have a big impact on crisis response efforts, says Megginson.
 
These will include a ‘humanitarian profile’ comprising data such as the number of people killed, injured or displaced, and how many people are without food and water.
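For illustration only, the sketch below shows how figures reported against a shared set of indicator names could be combined directly. The indicator names and values are hypothetical and are not taken from the HXL specification, which was still being drafted at the time of the working group's first meeting.

```python
# Illustrative sketch only: the indicator names below are invented and do not
# come from the HXL specification.

# Two organisations reporting against the same 'humanitarian profile' indicators
org_a = {"location": "District 1", "people_displaced": 12000, "people_injured": 340}
org_b = {"location": "District 1", "people_displaced": 8500, "people_injured": 120}

# Because both datasets use the same indicator names, their figures can be
# summed or compared directly, with no manual reconciliation of column headings.
for indicator in ("people_displaced", "people_injured"):
    print(indicator, org_a[indicator] + org_b[indicator])
```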
 

“There is huge, duplicated effort in data collection and the information has disappeared.”

David Megginson, HXL

The common standard will also cover activities during a humanitarian response, namely who is doing what and when, and potentially information on beneficiaries in the local population.
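Purely as an illustration, and not drawn from the standard itself, such an activity record might resemble the following; all field names and values are invented.

```python
# Hypothetical 'who is doing what, where and when' record; every field here is
# invented for illustration and not taken from the HXL standard.
activity = {
    "organisation": "Example NGO",
    "activity": "water trucking",
    "location": "Camp 3",
    "start_date": "2014-01-10",
    "status": "ongoing",
}
print(f"{activity['organisation']}: {activity['activity']} in {activity['location']}")
```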
 
A parallel project will work towards establishing an online platform — either through a central database or a registry that catalogues information published elsewhere — designed to make data easier to share and to improve its longevity.
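As a rough sketch of the registry option, rather than a description of the planned platform, a catalogue entry might simply point to data published elsewhere; the keys and URL below are invented.

```python
# Hypothetical catalogue entry: instead of holding the data itself, a registry
# could record where a dataset lives and which shared indicators it reports.
# All keys and the URL are invented for illustration.
catalogue_entry = {
    "title": "Displacement figures, Province X",
    "organisation": "Example NGO",
    "indicators": ["people_displaced", "people_without_food"],
    "url": "https://example.org/datasets/displacement-province-x.csv",
    "last_updated": "2014-01-15",
}
print(catalogue_entry["url"])
```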
 
Field tests, planned for later this year in up to four unconfirmed locations, will initially focus on protracted humanitarian crises, as trialling new procedures in an unfolding disaster may impede the response, says Megginson.
 
In general, humanitarian organisations are receptive to the idea of common standards and are “eager to participate” in attempts to improve data access, he says.
 
The existing cluster system — whereby organisations responsible for the same aspect of a humanitarian response organise themselves into groups — has fostered an atmosphere of cooperation that will facilitate such efforts, he adds.
 
Michel Maietta, a research fellow at French think-tank the Institute of International and Strategic Relations, says the project should be well received given that the “fragmented and isolated” data landscape is an acknowledged problem.
 
Indeed, the recent decision by medical charity Médecins Sans Frontières to make its field data freely available signals a shift in how humanitarian organisations regard the issue, he says.
 
If NGOs are already embracing the idea of sharing their databases, the HXL initiative can be successful, he tells SciDev.Net.
 
Nonetheless, the project faces huge challenges, he says, including working out how to respect the privacy of aid beneficiaries, create the database and get commitment from cash-strapped NGOs that lack the resources to design even their own internal standards.

 

Link to HXL initiative 
