INTERNATIONAL REGULATIONS ON ARTIFICIAL INTELLIGENCE IN THE MILITARY

Published: 01 December 2020
Author: LOW Yan Lin1 LLB (Singapore Management University)

Adequate or Outdated?

The 21st century has seen many new threats to national and international security, such as terrorism. In response, states have begun investing heavily in artificial intelligence to complement their respective defence systems. Given that public international law has remained largely unchanged over the past half-century, it may appear ill-equipped to address the advent of artificial intelligence. However, this article argues that the existing legal framework is largely adequate to regulate the future use of artificial intelligence by states. This article also proposes certain modifications to the existing framework to enhance its effectiveness.

I. Introduction

1 At the border of the Gaza Strip, Israel's famed Iron Dome system has proven able to detect and counter incoming enemy missiles for the past decade.2 In South Korea, the Super aEgis II surveils the demilitarised zone between North Korea and South Korea and can decide whether to shoot intruders.3 In short, states have already been employing intelligent machines along their borders. These machines can function without any human supervision as they are equipped with artificial intelligence (“AI”): the science allowing computer systems to perform tasks which normally require human intelligence.4 However, given the limitations of current technology, these machines still need to be manned by human operators who can either override the system's firing decision or make the final command to fire;5 as far as public knowledge is concerned, the technology allowing states to deploy fully autonomous weapons with confidence does not yet exist. United Nations (“UN”) Special Rapporteur Christof Heyns has defined fully autonomous weapons as those which can select and engage targets without further intervention by a human operator.6 As more states continue to invest in the use of advanced AI in their militaries,7 it is only a matter of time before fully autonomous weapons are deployed.

2 Many have argued that current public international law, typically grounded in consent and consensus amongst the international community, does not allow for the deployment of fully autonomous weapons. The 2018 Group of Governmental Experts Meeting on Lethal Autonomous Weapons, for instance, was indicative of a general distrust of fully autonomous weapons – numerous states stressed at the meeting the need for “human control, supervision, oversight, or judgment [over] the use of force” by autonomous weapons.8 Moreover, in the last two years, the European Union and 30 states have also called for a treaty ban on fully autonomous weapons.9 However, this article argues that current public international law rules can accommodate the introduction of fully autonomous weapons and can set the parameters for exactly which types of fully autonomous weapons are permitted.

3 As aptly put by Judge Lachs, “the great acceleration of social and economic change, combined with that of science and technology, have confronted law with a serious challenge: one it must meet, lest it lag even farther behind events than it has been wont to do”.10 Since public international law develops very slowly,11 it is accepted that there are currently no rules specifically dealing with states' use of military AI. Yet, this does not necessarily mean that a “serious challenge” in the form of a lacuna in public international law has emerged.

4 This article shall focus on AI used in military robots in times of armed conflict, one type of which exists where there is “protracted armed violence between governmental authorities and organised armed groups within a State”.12 As the applicable law in times of armed conflict is international humanitarian law (“IHL”),13 Part II14 discusses whether such robots can be regulated by the basic rules of IHL. As these basic rules of IHL are applied when states carry out their domestic weapons reviews, Part III15 evaluates the effectiveness of this weapons review obligation. After passing the weapons review, the military robot will be deployed. Part IV16 thus recommends specific ways to increase both state and individual accountability after deployment. Part V17 concludes.

II. Complying with international humanitarian law

5 Assume that an armed conflict of sufficient intensity has arisen in State A. Over the past year, a group of religious fanatics from State A has morphed into a terrorist organisation called the “X Militias”. With a military-like command structure and substantial resources, the X Militias have managed to procure a considerable stockpile of weapons. These militias have also begun threatening the lives of State A's citizens. To subdue the X Militias, State A has deployed its armed forces against them.

6 As a result of its wealth and technological expertise, State A has successfully incorporated AI into its military robots. Three types of these robots have received great international attention – Type 1, Type 2, and Type 3 robots. Type 1 robots can target enemy objects without human involvement. During a counter-insurgency operation in one of State A's urban slums, State A's soldiers received news that armed rebels might be hiding inside a civilian household. The soldiers, together with a Type 1 attack robot, approach the house. Two men in the house are carrying kirpan daggers for purely religious reasons. Unbeknownst to the soldiers, no insurgents are present. Due to the commotion caused by the soldiers outside the house, the family's dogs race outside and start barking. The two men run outside and start shouting to calm the dogs down. At first glance, the men appear to be two quickly approaching targets carrying weapons and running out of the house in an agitated manner. Would the robot be able to distinguish a religious artefact from a weapon and properly interpret the situation? If not, would this robot be deemed unlawful under IHL?

7 Separately, State A's Type 2 robots incorporate nanotechnology and can fly. This “flying nanobot” can enter a victim's nose or mouth and subsequently kill the victim with a micro-explosion in the lungs. When the explosion occurs, the victim experiences prolonged, excruciating pain from the damaged blood vessels in the lungs. Can such a robot ever be lawfully deployed by State A?

8 As for State A's Type 3 robots, through analysing large amounts of data and continuous self-learning, they are able to advise on the conduct of warfare (that is, choosing a route to reach the target area, deciding whether to deploy weapons and, if so, which weapon system to deploy). Thus far, these robots have given effective advice which has allowed State A to take over an X Militia command centre. However, the advice has also led to incidental civilian harm. When giving advice, is it possible for these robots to adhere to the existing guidelines for warfare under IHL?

9 As fully autonomous weapons do not yet exist, the above hypothetical scenario will be used to illustrate the perceived gaps in public international law. One must not forget that “war is distinguishable from murder and massacre only when restrictions are established on the reach of battle”.18 IHL lays out these restrictions on states and individuals and seeks to minimise suffering during wartime.19 The International Court of Justice (“ICJ”) has also made clear that, in evaluating the legality of new weapons, the applicable law is the fundamental principles of IHL.20 Accordingly, to establish that current IHL rules are sufficient to regulate fully autonomous weapons, this part will examine the applicability of (a) the rule of distinction; (b) the rule against unnecessary suffering; (c) the rule of proportionality; and (d) the Martens Clause.
A. Distinction

10 This article argues that the rule of distinction is capable of functioning as a standard for the military deployment of robots. As codified in the Protocol Additional to the Geneva Conventions of 12 August 1949 and relating to the Protection of Victims of International Armed Conflicts21 (“AP I”), distinction requires that parties to an armed conflict distinguish between combatants and civilians, as well as between military and civilian objects.22 However, if civilians “take a direct part in hostilities”, such as by transporting combatants to the fight, they can be targeted as well.23 The issue of distinction is especially pertinent in modern warfare, where combatants rarely wear uniforms and instead seek to blend in with the civilian population.24

11 In the above hypothetical situation, it is uncertain whether State A's Type 1 robots would be able to identify the men carrying the kirpan daggers as non-combatants. What is more certain is that, in that scenario, the human soldiers would be able to identify those men as civilians and be guided by the rule of distinction. This illustrates why several commentators, including a UN Special Rapporteur, have argued that fully autonomous weapons should not be trusted to carry out targeting activities.25 However, the conclusion necessarily reached by these commentators is that no fully autonomous weapon would ever be able to adhere to the principle of distinction. With respect, this article argues that such a broad-brush approach should not be taken. Instead, each fully autonomous weapon should be evaluated based on its technical competence.

12 Fully autonomous weapons can be specially designed to enable them to adhere to the rule of distinction. For example, automated target recognition is already a feature of modern weapons (which are not fully autonomous). Such a function may be used to identify, acquire, track, cue, or prioritise targets for a human operator. This target recognition technology is being used in automated sentry guns, such as those in the demilitarised zone between South Korea and North Korea;26 and in sensor-fused munitions, such as Sweden's BONUS System.27 While humans deploy these...
