Campaigners say UK rejection of killer robots is not yet watertight
Campaign to Stop Killer Robots launched in London
(London, 23 April 2013), The UK government must make its rejection of fully autonomous weapons watertight, says Article 36, a British organisation that is hosting the launch of the new international “Campaign to Stop Killer Robots”. In a paper published today, Article 36 sets out the contradictions and ambiguities in UK policy on autonomous weapons.
Systems are already deployed in which the weapon makes the final choice of target. If this is coupled with greater autonomy of movement and operation, the result will be fully autonomous weapons, or “killer robots”, in combat. These weapons would select and attack targets without human control.
The UK does not yet have a comprehensive policy on autonomous weapons, but has made statements in parliament and outlined its approach in a note by the Ministry of Defence. The UK has stated that “the operation of weapons systems will always be under human control” and that “no planned offensive systems are to have the capability to prosecute targets without involving a human”.
Whilst these statements are welcome and could form the basis of a strong policy to prevent the removal of humans from decisions to use lethal force, the government must provide further clarification to resolve the contradictions and ambiguities that could undermine that policy.
“We are seeing a slide towards greater autonomy of weapons. Moving towards full autonomy on the battlefield, where the power is given to a machine to decide who lives or dies, crosses a fundamental moral line,” said Thomas Nash, Director of Article 36.
“The UK says it will always keep human control over weapons and this is an important commitment. But unless they tell us what that means, we are at risk of sleepwalking into an acceptance of fully autonomous weapons,” said Nash. “The government needs to set out a watertight policy requiring meaningful human control over every individual attack.”
The key shortcomings of the existing UK policy are:
* The policy does not set out what is meant by human control over weapon systems.
In reviewing the acceptability of autonomous weapons systems, the UK would have to define what it means by human control, but the UK does not publish its legal reviews of weapons. A very loose definition of human control might require only that a human give the order to deploy a system, with the system then able to select and attack individual targets on its own once deployed. Article 36 believes that this would not constitute meaningful human control.
* The policy does not prevent the future development of fully autonomous weapons.
It is welcome that the MoD notes that such weapons are not in development and that no planned offensive systems are to have fully autonomous capability. But there is no provision to protect against the development of such weapons systems by the UK or by UK-based companies. The MoD joint doctrine note also leaves the door open to such weapons systems by suggesting that attacks without human assessment of the target, or a subsequent human authorisation to attack, could still be legal as long as certain technological challenges can be overcome.
* The policy says that existing international law is sufficient to “regulate the use” of autonomous weapons.
There would seem to be a contradiction between UK statements that there will always be human control over weapons and UK statements that an international ban on fully autonomous weapons is unnecessary because existing international humanitarian law (IHL) is sufficient to control the use of such weapons. If there will always be human control over weapons, then the use of systems that operate without human control should never be contemplated, and an explicit ban would seem both appropriate and necessary.
“The UK Government says there will always be human control over weapons and existing international law is adequate. The UK MoD doctrine says fully autonomous weapons could be legal. This divergence is why clearer rules are needed,” said Richard Moyes, Managing Partner of Article 36. “We need a commitment to meaningful human control over individual attacks. It doesn’t seem a lot to ask.”
The Campaign to Stop Killer Robots is calling for a pre-emptive and comprehensive ban on the development, production, and use of fully autonomous weapons. The prohibition should be achieved through an international treaty, as well as through national laws and other measures.
See also the Article 36 policy paper “Killer Robots: UK Government Policy on Fully Autonomous Weapons”, which analyses UK policy on autonomous weapons.
In March 2012, Article 36 became the first NGO to call for a prohibition on fully autonomous weapons. Article 36 is a founding member of the Campaign to Stop Killer Robots and is hosting the launch of the global campaign in London on 22-23 April.
UK NGOs, including Action on Armed Violence, Article 36, Amnesty International UK, Handicap International, Human Rights Watch, the International Committee for Robot Arms Control (ICRAC), the Methodist Church, Reprieve, the United Nations Association – UK and Quaker Peace and Social Witness, have sent a letter to Prime Minister David Cameron calling on the government to make its position clear on the development and use of autonomous weapons and to work for an international treaty to ban these weapons.
Launch events in London on 23 April include:
- 10.30: News Conference at the Frontline Club, 13 Norfolk Place, London W2 1QJ
- 14.00: Visual stunt with talking robot and campaigners on Parliament Square opposite the Houses of Parliament
For more information and interviews contact:
For more information on the Campaign to Stop Killer Robots see www.stopkillerrobots.org
To arrange interviews contact Laura Boillot on +44 (0) 751 557 5175 or email@example.com