Autonomous Weapon Systems: Legality under International Humanitarian Law and Human Rights

9 May 2017

Our new publication Defending the Boundary analyses the constraints on and requirements for the use of autonomous weapon systems (AWS), also called ‘killer robots’, under international humanitarian law (IHL) and international human rights law (IHRL).

A Research Brief of this publication provides policy makers and advocacy groups with a summary of key findings.

Drawing on case law dealing with other weapon technologies and autonomous systems, it asks where and when AWS may be used, and what procedural legal requirements apply to the planning, conduct and aftermath of AWS use.

The use of a ‘sentry-AWS’ to control a boundary, secure a perimeter or deny access to an area, for example along an international border, forms the backdrop to the legal discussion.


A Timely Publication

‘This publication is particularly timely, as the UN Convention on Certain Conventional Weapons (CCW) has mandated a Group of Governmental Experts to consider the issue. Views diverge on the circumstances in which it would be lawful to use an AWS and on whether additional law is required to ensure respect for the norms that safeguard humanity’, underlines Maya Brehm, the author of the report.

‘At the national level, several states are also looking at the legality of AWS, including Switzerland, where two parliamentary motions call for an international prohibition of fully autonomous weapons’, she adds.


What’s at Stake?

There is, as yet, no agreed definition of an AWS but the basic idea is that once activated, such a weapon system would, with the help of sensors and computationally intensive algorithms, detect, select and attack targets without further human intervention.

The use of such a weapon system can be expected to change how human beings exercise control over the use of force and its consequences. Human beings may no longer be able to predict who or what, specifically, is made the target of attack, or explain why a particular target was chosen by an AWS. That raises serious ethical, humanitarian, legal and security concerns.

According to leading researchers in the field of artificial intelligence and robotics, the deployment of AWS will be practically, if not legally, feasible within years.

The Need to Look Beyond IHL

The focus of scholarly inquiry into the legality of AWS has mostly been on compliance with IHL rules on the conduct of hostilities. Comparatively little attention has been given to the impact of AWS on human rights protection. This study aims to fill that gap by analyzing the constraints and requirements that IHRL places on the use of force by means of an AWS, both in times of peace and during armed conflict, in relation to the conduct of hostilities and for law enforcement purposes.

‘IHL would never be the sole, and often it would not even be the primary, legal frame of reference to assess the legality of AWS’, stresses Maya Brehm.

Limited Scope for the Use of an AWS During Armed Conflict

As IHL permits the ‘categorical targeting’ of persons based on their status or imputed membership in a group (e.g. combatants), there is scope for the lawful use of an AWS for the conduct of hostilities.

However, to ensure that targeting rules can be applied in a manner that effectively protects the victims of war, human agents must appropriately bound every attack in space and time and retain sufficient control over an AWS to recognize changing circumstances and to adjust operations in a timely manner. ‘This calls for active and constant control over every individual attack’, underlines Maya Brehm.


AWS are Difficult to Reconcile with IHRL

To preserve life, the use of potentially lethal force for law enforcement purposes can only be justified if it is absolutely necessary and strictly proportionate in the specific circumstances. Lethal force cannot be applied automatically or categorically. ‘Human beings must therefore be actively and even personally involved in every instance of force application. AWS are difficult to reconcile with IHRL due to the need to individuate the use of force’, says Maya Brehm.

Furthermore, the algorithm-based targeting of security measures can be de-humanizing, objectifying and discriminatory. ‘Human involvement in such processes serves as an essential procedural safeguard to uphold human dignity and human rights’, according to Brehm.

To allow an independent assessment of the lawfulness of the use of force, ‘human agents must remain involved in targeting processes in a manner that enables them to explain the reasoning underlying algorithmic decisions in particular circumstances’, recalls Maya Brehm.

A Legal Duty to Exercise Meaningful Human Control can Help Ensure Compliance with IHL and IHRL

While existing legal norms already regulate and limit the use of AWS, controversies and uncertainties about the applicability and meaning of those norms diminish their capacity to serve as a guidepost. In addition, accommodating new practices within the existing legal framework carries the risk that existing rules are formally preserved but filled with a radically different meaning.

‘An explicit legal requirement to exercise meaningful human control in the use of weapons can help ensure compliance with the norms that safeguard humanity’, stresses Maya Brehm. ‘Action is urgently needed and the CCW Group of Governmental Experts is well placed to formulate such a requirement.’
