Over the past three years, states have discussed the serious ethical, moral and legal issues posed by increasing autonomy in weapons systems at the Convention on Certain Conventional Weapons (CCW), in a series of informal expert meetings. Today, at the CCW’s Fifth Review Conference, they decided to take these discussions forward on a more formal footing next year, by establishing a Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS).

The UN in Geneva after the Review Conference tonight

This could represent a shift towards concrete action to address the grave concerns raised by the development of LAWS: previous GGEs at the CCW have led to the negotiation of new legally binding protocols to prohibit or restrict weapons systems of concern. As a member of the Campaign to Stop Killer Robots, Article 36 urges states to use the opportunity presented by the GGE to work towards an international ban on these weapons that would operate without meaningful human control.

The GGE will be open to the participation of all states, and will meet for two weeks in 2017. It will be chaired by India, and it is expected that the chair will look to advance consideration of this issue towards firm results.

Fundamental to the concerns raised by states and civil society around LAWS have been the issues of human control and judgment: the loss of this human element in the use of force is the key, deeply troubling aspect from which other problems – such as challenges to adequately applying the law or ensuring accountability – proceed. Nineteen states have now called for a ban on LAWS, with over thirty raising the issue of control or judgment in their recent interventions on the issue.

Article 36 has argued that states’ discussions on LAWS should be structured around deciding what constitutes meaningful human control over weapons systems, and determining how this can be maintained in the context of developments in technology, setting legally binding standards that prohibit any systems that would fall outside these parameters.

Such an approach enables states to talk on an equal footing about the issues of core concern – irrespective of their individual levels of technological development or knowledge in this area. It also helps to future-proof any agreements made: though technology may develop in unforeseen ways, principles and standards around control can remain constant. Concentrating on a technology-based definitional exercise, which some states favour, could on the other hand easily fall foul of fast-moving technological developments and risk quickly becoming irrelevant to addressing international concerns.

Urgency is key as states move forward to discuss LAWS at the GGE next year – developments in technology must not be allowed to continue to outpace diplomacy. The humanitarian harm, and the fundamental challenge to upholding the law as a process of human deliberation, that would result from the deployment of LAWS means that states cannot simply choose to “wait and see.”
