The conference room at the multilateral expert discussions on lethal autonomous weapons systems at the UN in Geneva, April 2015 (Article 36)

Multilateral expert discussions on ‘Lethal Autonomous Weapons Systems’ are taking place within the framework of the Convention on Certain Conventional Weapons (CCW) in Geneva from 13 to 17 April. Article 36 has made three interventions at the meeting, which are reproduced below.

Our statement on 13 April to the general debate called on states to take the opportunity to set a clear direction of work that responds to the challenges raised by the increased autonomy in weapons systems. The CCW should establish the key principles from which this issue should be approached and then draw the boundaries of the necessary prohibitions. This should proceed from the starting point of the need for meaningful human control over individual attacks.

In our intervention on 15 April to a session on characteristics of ‘lethal autonomous weapons systems’, we responded to some aspects of how the principle of meaningful human control over individual attacks has been treated by delegations at the meeting so far.

In our remarks on 15 April to a session on ‘possible challenges to IHL due to increasing degrees of autonomy’, we addressed IHL rules and the adequacy of weapons reviews for dealing with increasing autonomy.

Our statement on 17 April to the ‘way ahead’ session recommended that states hold focused discussions to explore meaningful human control in greater depth, given that ‘human control’ or ‘human judgement’ over the operation of weapons systems has been the main point of convergence this week. We expect that this would lead to the conclusion that a prohibition is required on fully autonomous weapons systems.

For PDF downloads and related materials, see ‘read more’ below.

 

Monday, 13 April 2015

Mr Chair,

This week, we hope that states will take the opportunity to set a clear direction of work that responds to the challenges raised by the increased autonomy being developed in weapons systems.

It should be possible here at the CCW to identify some key principles that can set discussions in a positive direction. In our view, building consensus around such principles would help to build common ground and a direction for work, and avoid potentially tangled debates about the technology or legality of hypothetical future weapons systems.

Laura Boillot gives our statement to the general debate (Article 36)

No delegation has argued that autonomous weapons should be allowed to operate without human control, or with human control that is devoid of meaning. So this meeting should be used to build agreement that meaningful human control is necessary in the use of weapons. Some states may argue that this needs to be defined before it can be accepted – that it is too vague. We would invite such states, therefore, to push for this theme to be the subject of focused work in future CCW discussions.

International humanitarian law (IHL) is one important legal framework through which autonomous weapons can be approached. However, if discussion is too focused on undefined hypothetical systems then legal arguments can become separated from reality.

In particular, we should be wary of legal discussions that forget that the law is a human framework, addressed to humans. Processes of calculation and computation in a machine are not equivalent to deliberative human reasoning within a social framework. Machines do not make “legal judgements” and “apply legal rules”.

The basic IHL rules on the conduct of hostilities are to be applied, by humans, on an attack-by-attack basis, taking into account the specific circumstances of each attack.

Increasing autonomy in weapons systems risks expanding the notion of an attack in ways that undermine the exercise of meaningful human control. This raises a number of important questions to which the CCW needs urgently to apply itself if it is to establish a process of work that gets to the heart of the matter.

Mr Chairperson,

Recognising a requirement for meaningful human control over individual attacks is likely to lead towards the prohibition of certain weapons systems, or certain uses of weapons.

If we can understand that arguing in favour of weapons that operate without meaningful human control is morally and legally untenable, it is relatively straightforward to conclude that fully autonomous weapons, those that do not allow meaningful human control, should be prohibited.

The CCW should establish the key principles from which this issue should be approached and then draw the boundaries of the necessary prohibitions. The international community will at some point legislate on this issue, because the moral questions it poses are too fundamental to ignore.

Given that the CCW is specifically designed to produce new prohibitions and restrictions on weapons, we do not think it premature for this to be the direction of travel for an outcome of these discussions.

We therefore welcome the statements by a number of countries, as well as the useful background paper that you circulated, in which the principle of meaningful human control is recognised as a central concern.

The CCW’s fifth Review Conference scheduled for November 2016 is an important marker on the landscape. States should consider what might be achieved at that meeting, not only in agreeing a mandate to negotiate new rules on this issue, but also in terms of collectively stating, at that point, a recognition of the key principles from which autonomous weapons can effectively be addressed.

Lastly, we would like to bring your attention to a new publication that we have produced for this week, entitled ‘Killing by machine: key issues for understanding meaningful human control’. Copies are available at the back of the room and online.

We would very much welcome any views on this and we look forward to an active exchange on the many issues raised by autonomous weapons systems over the course of this week.

Thank you, Mr. Chair.

 

Wednesday, 15 April

Remarks to session on ‘Characteristics of LAWS’

Thank you Mr Chairperson,

I just wanted to make some brief remarks about meaningful human control, which, as others have noted, does seem to have some currency in the CCW debate on autonomous weapons.

We think about meaningful human control as a way of structuring debate around the problems that we see with the potential development of autonomous weapons.

We certainly do not see it as a framework for developing fully autonomous weapons.

On the contrary we think it will be very difficult to show that meaningful human control can be ensured over the use of fully autonomous systems.

Our starting point would be that if we cannot demonstrate meaningful human control over a weapon system then we should not develop or use it.

On that basis, the concept is likely to be quite helpful for developing normative guidance, as some of the presenters said yesterday, or as we would see it, an international prohibition on fully autonomous weapons.

So the question of whether you see meaningful human control as a useful concept probably depends on whether you want to develop fully autonomous weapons or not.

In that regard, we welcome the statements by the UK and France yesterday that they do not intend to acquire fully autonomous weapons, or autonomous weapons that deploy fire.

Thank you Mr Chairperson.

 

Remarks to session on ‘Possible challenges to IHL due to increasing degrees of autonomy’

Thank you Chairperson and thank you to the presenters.

International humanitarian law (IHL) is one important legal framework through which autonomous weapons can be approached.

In particular, we should be wary of legal discussions that forget that the law is a human framework, addressed to humans. Processes of calculation and computation in a machine are not equivalent to deliberative human reasoning within a social framework. Machines do not make “legal judgements” and “apply legal rules”.

The basic IHL rules on the conduct of hostilities are to be applied, by humans, on an attack-by-attack basis, taking into account the specific circumstances of each attack.

Whilst attacks are defined as acts of violence (whether in offence or defence) and can comprise several instances of force application, an attack must be bounded in time and space. Increasing autonomy in weapons systems risks expanding the notion of an attack in ways that undermine the exercise of meaningful human control. In the context of weapons that can detect and engage target objects, less control can be exercised by the weapon user over the effects of the weapon if:

  • It operates for a longer time;
  • It operates over a wider area;
  • It uses broader proxy indicators; and
  • It is used in an environment where there are a greater number of persons and objects that match those parameters (for example, in what some states have called a cluttered environment).

An attack must therefore be sufficiently contained (geographically and in time) to allow a commander to acquire and assess the contextual information necessary to make informed judgments about the military utility, necessity, risk to civilians, moral acceptability, and legality of the proposed use of force.

The CCW needs urgently to apply itself to these questions if it is to establish a process of work that gets to the heart of the matter.

Madame Chairperson,

Thomas Nash gives our statement to the session on ‘Possible challenges to IHL due to increasing degrees of autonomy’ (Article 36)

In relation to weapons review:

We would agree with the ICRC that there are too many questions related to AWS to leave them to national legal reviews of weapons to address.

Application of these reviews is already weak: the level of compliance is low; transparency is lacking; there is no standardisation; interpretation of existing international legal rules varies; and there is no clarity about how to assess the form or level of human control or human judgement needed to ensure a weapon is legal.

We have some specific questions on weapon reviews:

– States are obligated to conduct a legal weapons review against the provisions of Additional Protocol I or other rules of international law applicable to the State Party. It was unclear to us what role is being envisaged for human rights law within the ‘weapons review criteria’. Could you and other presenters explain how human rights law could be relevant in this regard?

– If a new weapon must be assessed in terms of its ‘normal, intended circumstances of use’, and a weapons reviewer does not concern herself with the legality of individual attacks, then what should come out of a weapons review in terms of normative guidance to one’s own forces, to ensure that the new weapon will in fact only be used ‘as intended’? For example, this might be important if a weapon was only intended for use as a defensive system.

In relation to the recommendation that we should take a legal pause and allow the development of AWS:

– If we allow the development of such systems, is that really a pause? Or would it rather represent an active decision to pursue AWS as weapons that some perceive as legitimate? Can we think of situations in which states have allowed the development of certain weapons systems (during a “legal pause”) and have then subsequently subjected them to regulation or prohibition due to concerns articulated by the international community?

– It is not uncommon for states to argue that now that we have these systems and they have not been prohibited, they are therefore “legal”. This, in our experience, has been a standard argument used by many states opposed to efforts to prohibit weapons that have caused significant civilian harm or run counter to the laws of humanity and the dictates of the public conscience.

So we would agree with those who have advised caution in these legal discussions about prematurely legitimising autonomous weapons systems:

– Do we want autonomous systems that can deploy fire or release munitions on their own, without a human being pulling the trigger or pressing the button? Yesterday the UK and France seemed to say no. Other states have also emphatically rejected this prospect. So if such systems are not going to be developed, we should be clear that it is not about applying international law to them. Rather it is about codifying their prohibition under international law, as the drafters of the CCW envisaged when they reaffirmed the “need to continue the codification and progressive development of the rules of international law applicable in armed conflict.”

Thank you Madame Chairperson.

 

Friday, 17 April

Statement to session on ‘way ahead’

Mr. Chairperson,

The expert presentations and discussions this week have explored a broad range of aspects related to autonomous weapons systems, and the serious concerns such weapons systems would present.

The most fundamental of these are the moral and ethical concerns around delegating decisions over the use of force to machines and, as Christof Heyns noted, the affront to human dignity that this entails, whether in armed conflict or domestic policing.

Our main observation on the week is that there is a strong sense that we should not be going down the road of developing autonomous weapons systems.

The concerns raised this week by many actors should provide a barrier to the development of autonomous weapons systems, but more definitive action will be required to ensure this is the case.

We believe that there should be an explicit requirement for meaningful human control over the operation of weapons systems, and specifically over every individual attack.

Laura Boillot delivers our final statement to the session on the ‘Way Ahead’ (Article 36)

This week we have not heard any state argue that weapons systems should be allowed to operate without human control, or with human control that is devoid of meaning.

In fact, ‘human control’ or ‘human judgement’ over the operation of weapons systems has been the main point of convergence among states this week.

Convergence around this principle, and indeed views that the concept of meaningful human control is too vague, should be taken as an opportunity to hold focused discussions to explore it in greater depth, including how meaningful human control is ensured over existing weapons systems.

If states are able to provide answers on how meaningful human control is ensured over weapons systems, then we expect that this will lead to the conclusion that a prohibition is required on fully autonomous weapons systems.

Mr Chair,

Given the richness of the debate this week, as well as the extensive interest from media and parliamentarians outside this room, it seems to us that the only responsible course of action is to initiate a process of international meetings dedicated to developing a prohibition on fully autonomous weapons systems on the basis that they do not allow for meaningful human control over every individual attack.

States should urgently begin this work. The CCW Review Conference next year would be an appropriate milestone for ensuring we are well on track to developing such a prohibition. The Human Rights Council would be another relevant forum in which to hold further discussion on this issue.

On transparency – whilst a focus on transparency in relation to developments in autonomous weapons systems is certainly useful, whether it can be seen as a robust response to the problem posed by such systems can only be judged by the degree of transparency we actually see from states.

In relation to article 36 weapons reviews and processes, we are strongly supportive of efforts to increase scrutiny over the development and use of weapons, means and methods of warfare, including in the context of legal reviews under Additional Protocol I.

We think this week’s discussion clearly shows that the legality, morality and desirability of pursuing autonomous weapons is highly contested. Different states will therefore have widely varying approaches to legal reviews of such systems.

Against that background we think that rather than a specific discussion in relation to weapons reviews of autonomous weapons systems, it would be more sensible to start with a discussion on the broader practice of legal reviews of weapons, means and methods of warfare.

This course of action has previously been recommended by a number of delegations at the CCW, including before the CCW took up work on autonomous weapons. We look forward to contributing to specific CCW work on the wider issue of legal reviews.

Thank you.

 

Read more

Download our statement to the general debate as a PDF

Download our intervention to the session on characteristics of ‘lethal autonomous weapons systems’ as a PDF

Download our remarks to the session on ‘possible challenges to IHL due to increasing degrees of autonomy’ as a PDF

Download our statement on the ‘way ahead’ as a PDF

Killing by machine: Key issues for understanding meaningful human control

April 2015
Fold out chart

 

Losing control over the use of force: Fully autonomous weapons systems and the international movement to ban them

UK should be at the forefront of international discussions to stop killer robot development

 
