Autonomous Weapon Systems: Legality under International Humanitarian Law and Human Rights Law

Our new publication Defending the Boundary analyses the constraints and requirements on the use of autonomous weapon systems (AWS), also called ‘killer robots’, under international humanitarian law (IHL) and international human rights law (IHRL).

A Research Brief accompanying this publication provides policymakers and advocacy groups with a summary of its key findings.

Drawing on case law dealing with other weapon technologies and autonomous systems, the publication asks where and when AWS may be used, and what procedural legal requirements govern the planning, conduct and aftermath of AWS use.

The use of a ‘sentry-AWS’ to control a boundary, secure a perimeter or deny access to an area, for example along an international border, forms the backdrop to the legal discussion.


A Timely Publication

‘This publication is particularly timely as the UN Convention on Certain Conventional Weapons (CCW) has mandated a Group of Governmental Experts to consider the issue, as views diverge on the circumstances in which it would be lawful to use an AWS and on whether additional law is required to ensure respect for the norms that safeguard humanity’, underlines Maya Brehm, the author of the report.

‘At the national level, several states are also looking at the legality of AWS, including Switzerland, where two parliamentary motions call for an international prohibition of fully autonomous weapons’, she adds.


What’s at Stake?

There is, as yet, no agreed definition of an AWS, but the basic idea is that, once activated, such a weapon system would, with the help of sensors and computationally intensive algorithms, detect, select and attack targets without further human intervention.

The use of such a weapon system can be expected to change how human beings exercise control over the use of force and its consequences. Human beings may no longer be able to predict who or what, specifically, is made the target of attack, or explain why a particular target was chosen by an AWS. That raises serious ethical, humanitarian, legal and security concerns.

According to leading researchers in the field of artificial intelligence and robotics, the deployment of AWS will be practically, if not legally, feasible within years.

The Need to Look Beyond IHL

The focus of scholarly inquiry into the legality of AWS has mostly been on compliance with IHL rules on the conduct of hostilities. Comparatively little attention has been given to the impact of AWS on human rights protection. This study aims to fill this gap by analysing the constraints and requirements that IHRL places on the use of force by means of an AWS, both in times of peace and during armed conflict, in relation to the conduct of hostilities and for law enforcement purposes.

‘IHL would never be the sole, and often it would not be the primary, legal frame of reference to assess the legality of AWS’, stresses Maya Brehm.

Limited Scope for the Use of an AWS During Armed Conflict

As IHL permits the ‘categorical targeting’ of persons based on their status or imputed membership in a group (e.g. combatants), there is scope for the lawful use of an AWS for the conduct of hostilities.

However, to ensure that targeting rules can be applied in a manner that effectively protects the victims of war, human agents must bound every attack appropriately in spatio-temporal terms and retain sufficient control over an AWS to recognize changing circumstances and adjust operations in a timely manner. ‘This calls for active and constant control over every individual attack’, underlines Maya Brehm.


AWS are Difficult to Reconcile with IHRL

To preserve life, the use of potentially lethal force for law enforcement purposes can only be justified if it is absolutely necessary and strictly proportionate in the specific circumstances. Lethal force cannot be applied automatically or categorically. ‘Human beings must therefore be actively and even personally involved in every instance of force application. AWS are difficult to reconcile with IHRL due to the need to individuate the use of force’, says Maya Brehm.

Furthermore, the algorithm-based targeting of security measures can be dehumanizing, objectifying and discriminatory. ‘Human involvement in such processes serves as an essential procedural safeguard to uphold human dignity and human rights’, according to Brehm.

To allow an independent assessment of the lawfulness of the use of force, ‘human agents must remain involved in targeting processes in a manner that enables them to explain the reasoning underlying algorithmic decisions in particular circumstances’, recalls Maya Brehm.

A Legal Duty to Exercise Meaningful Human Control can Help Ensure Compliance with IHL and IHRL

While legal norms already regulate and limit the use of AWS, controversies and uncertainties about the applicability and meaning of existing norms diminish their capacity to serve as a guidepost. In addition, accommodating new practices within the existing legal framework carries the risk that existing rules are formally preserved but filled with a radically different meaning.

‘An explicit legal requirement to exercise meaningful human control in the use of weapons can help ensure compliance with the norms that safeguard humanity’, stresses Maya Brehm. ‘Action is urgently needed and the CCW Group of Governmental Experts is well placed to formulate such a requirement.’
