
Red Cross sounds alarm over use of 'killer robots' in future wars

by Nita Bhalla | @nitabhalla | Thomson Reuters Foundation
Tuesday, 22 January 2019 05:01 GMT

ARCHIVE PHOTO: A robot is pictured in front of the Houses of Parliament and Westminster Abbey as part of the Campaign to Stop Killer Robots in London April 23, 2013. REUTERS/Luke MacGregor



By Nita Bhalla

NAIROBI, Jan 22 (Thomson Reuters Foundation) - Countries must agree strict rules on "killer robots" - autonomous weapons which can kill without human involvement - a top Red Cross official has said, amid growing ethical concerns over their use in future wars.

Semi-autonomous weapons systems, from drones to tanks, have for decades been used to eliminate targets in modern-day warfare - but they all have human control behind them.

With rapid advancements in artificial intelligence, there are fears among humanitarians over its use to develop machines which can independently make the decision about who to kill.

Yves Daccord, director-general of the International Committee of the Red Cross (ICRC), said this would be a critical issue in the coming years as it raised ethical questions about delegating lethal decisions to machines and about accountability.

"We will have weapons which fly without being remotely managed by a human and have enough intelligence to locate a target and decide whether it is the right person to take out," Daccord told the Thomson Reuters Foundation in an interview.

"There will be no human making that decision, it will be the machine deciding - the world will essentially be delegating responsibility to an algorithm to decide who is the enemy and who is not, and who gets to live and who gets to die."

In 1949, the ICRC initiated the international adoption of the four Geneva Conventions that lie at the core of international humanitarian law.

Since then, it has urged governments to adapt international humanitarian law to changing circumstances, in particular to modern developments in warfare, so as to provide more effective protection and assistance for conflict victims.


A global survey published by Human Rights Watch and the Campaign to Stop Killer Robots, a global coalition of NGOs, on Tuesday found six out of ten people polled across 26 countries oppose the development of fully autonomous lethal weapons.

The study, conducted by Ipsos, surveyed 18,795 people in 26 countries including Brazil, India, the United States, Britain, China, South Africa, Japan and Israel.

Daccord said autonomous weapons crossed a moral threshold, as machines lacked the human characteristics, such as compassion, necessary to make complex ethical decisions.

They lacked the human judgment to evaluate whether an attack was a proportional response, distinguish civilians from combatants, and abide by core principles of international humanitarian law, he added.

The issue of "killer robots" has divided humanitarians.

United Nations Secretary-General Antonio Guterres has called for a complete ban, while other organisations, such as the ICRC, are advocating for strict regulation.

"We should not go for banning, but I am of the opinion that we have to keep a level of human control over such weapons. This means that, at any time of the operation, a human can intervene," said Daccord.

"There are no guidelines regarding their use and they have not even been defined yet, so we have to create a common grammar between states and develop guidelines, or treaty law."

The rules would address issues such as the definition of autonomous weapons, the level of human supervision over them - including the ability to intervene and deactivate - and the operational conditions for their use, says the ICRC.

Supporters of autonomous weapons argue they will make war more humane. They will be more precise in identifying and eliminating targets, will not fall prey to human emotions such as fear or vengeance, and will minimise civilian deaths, they say.

But Daccord said such machines could malfunction, and this raised questions over who would be held responsible.

"You can hold people accountable under international humanitarian law with remotely managed weapons such as drones. With autonomous weapons, we are moving into new territory," he said.

"There is a process under way, but we have to get countries together to agree on a common text which is not easy. It's better they start to negotiate now and find an agreement than wait for a major disaster."

(Reporting by Nita Bhalla @nitabhalla, Editing by Claire Cozens. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers humanitarian news, women's and LGBT+ rights, human trafficking, property rights and climate change. Visit http://news.trust.org)

Our Standards: The Thomson Reuters Trust Principles.