CHALLENGES OF AUTONOMOUS WEAPONS


86 RUSI DEFENCE SYSTEMS OCTOBER 2008

Grounds for Discrimination: Autonomous Robot Weapons
by Professor Noel Sharkey

Noel Sharkey is Professor of AI and Robotics and Professor of Public Engagement and EPSRC Senior Media Fellow at the University of Sheffield. Here he examines various legal and humanitarian aspects of using autonomous weapons and the issues that these raise.

While autonomous weapons are not new, few of the ethical, legal or operational implications have been clearly identified and solved. In this section, we look at some of the legal aspects and the tactical implications that flow from them. The overriding need to limit collateral damage and avoid killing innocents means that the man-in-the-loop is vital, but is his increasing distance from the battlefield a disadvantage? UCAVs are a case in point and we look at them and the weapons they will carry in the future. We shall return to the subject in future issues.

The development most likely to revolutionise war in the 21st Century is the unmanned system. It began with Tesla's efforts to develop remote-controlled torpedoes in the late 19th Century,1 but it is only in the last decade that remote-controlled (also known as tele-operated) mobile machines have become commonplace in conflict zones. The next big step is to take the human out of the loop with autonomous robot weapons, bringing together the latest advances in technology, navigation and artificial intelligence. The US has published 25-year plans for unmanned aircraft,2 ground vehicles,3 and naval vehicles,4 and a roadmap up until 2032.5 It is vitally important that the excitement about the new technology and the possibilities of risk-free war do not mask the ethical questions that they raise. We must ensure that the evolution of unmanned weapons conforms to basic humanitarian law.

Protection of Innocents

One of the cornerstones of humanitarian law6 – the laws of armed conflict and the law of war, as established in the Geneva and Hague conventions and the various treaties and protocols – is the protection of innocents. This is part of the justice in the conduct of a war, jus in bello, and is often expressed as the principle of discrimination – only combatants/warriors are legitimate targets of attack. All others, including children, civilians, service workers and retirees, should be immune from attack. It is not just a matter of uniform; soldiers who are wounded, have surrendered or are mentally ill are also immune.7

In modern warfare it is difficult to fully protect non-combatants. For example, in attacking a warship, some non-combatants such as chaplains and medical staff may be unavoidably killed. It is also difficult when large explosives are used near civilian populations, or when missiles get misdirected. But the laws of war have a way of handling the unintentional killing of innocents. Thomas Aquinas, in the 13th Century, developed the doctrine of Double Effect. Put crudely, it is OK to kill innocents during a conflict providing that (i) you did not intend to do so, or (ii) killing the innocents was not a means to winning, or (iii) the importance to the defence of your nation is proportionally greater than a number of civilian deaths.

The modern equivalent is the Principle of Proportionality which "… requires that the anticipated loss of life and damage to property incidental to attacks must not be excessive in relation to the concrete and direct military advantage expected to be gained".8 But we may be about to unleash new weapons that could violate all of these principles.

Lethal Autonomous Robots

Between four and six thousand robots are currently operating on the ground in Iraq and Afghanistan. These are mainly deployed in dull, dirty or dangerous tasks such as disrupting or exploding improvised explosive devices and surveillance in dangerous areas such as caves. There are only three armed Talon SWORDS robots made by Foster-Miller, although more


are expected soon. Most of the armed robots are in the sky – semi-autonomous Unmanned Combat Air Vehicles such as the MQ-1 Predator that flew some 400,000 mission hours up to the end of 2006 and has flown significantly more since, and the more powerful MQ-9 Reapers with a payload of 14 Hellfire missiles. These can navigate and search out targets but, like the ground robots, it is a remote operator, this time thousands of miles away in the Nevada desert, who makes the final decision about when to apply lethal force.

There is now massive spending and plans are well under way to take the human out of the loop so that robots can operate autonomously to locate their own targets and destroy them without human intervention.9 This is high on the military agenda of all the US forces: "The Navy and Marine Corps should aggressively exploit the considerable warfighting benefits offered by autonomous vehicles (AVs) by acquiring operational experience with current systems and using lessons learned from that experience to develop future AV technologies, operational requirements, and systems concepts."10 There are now a number of autonomous ground vehicles such as DARPA's 'Unmanned Ground Combat Vehicle and Perceptor Integration System', otherwise known as the Crusher.11 And BAE Systems recently reported that they have "… completed a flying trial which, for the first time, demonstrated the coordinated control of multiple UAVs autonomously completing a series of tasks".12

The move to autonomy is clearly required to fulfil the current US military plans. Tele-operated systems are more expensive to manufacture and require many support personnel to run them. One of the main goals of the Future Combat Systems project is to use robots as a force multiplier so that one soldier on the battlefield can be a nexus for initiating a large-scale robot attack from the ground and the air. Clearly one soldier cannot operate several robots alone and it takes the soldier away from operational duties.

Discrimination

The ethical problem is that no autonomous robots or artificial intelligence systems have the necessary skills to discriminate between combatants and innocents. Allowing them to make decisions about who to kill would fall foul of the fundamental ethical precepts of a just war under jus in bello as enshrined in the Geneva and Hague conventions and the various protocols set up to protect civilians, wounded soldiers, the sick, the mentally ill and captives. There are no visual or sensing systems up to that challenge.

A computer can compute any given procedure that can be written down in a programming language. We could, for example, give the robot computer an instruction such as, "If civilian, do not shoot". This would be fine if, and only if, there was some way of giving the computer a clear definition of what a civilian is. We certainly cannot get one from the Laws of War that could provide a machine with the necessary information. The 1949 Geneva Convention requires the use of common sense, while the 1977 Protocol 1 essentially defines a civilian in the negative sense as someone who is not a combatant:

The BLU-108 parachutes to near the ground and releases four Skeet warheads [Textron Defense Systems]

• A civilian is any person who does not belong to one of the categories of persons referred to in Article 4 A (1), (2), (3) and (6) of the Third Convention and in Article 43 of this Protocol. In case of doubt whether a person is a civilian, that person shall be considered to be a civilian.
• The civilian population comprises all persons who are civilians.
• The presence within the civilian population of individuals who do not come within the definition of civilians does not deprive the population of its civilian character.13

And even if there was a clear computational definition of a civilian, we would still need all of the relevant information to be made available from the sensing apparatus. All that is available to robots are sensors such as cameras, infrared sensors, sonars, lasers, temperature sensors and ladars etc. These may be able to tell us that something is a human, but they could not tell us much else. In the labs there are systems that can tell someone's facial expression or that can recognise faces, but they do not work on real-time moving people.
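The gap between writing the rule and grounding its key predicate can be made concrete with a short, purely illustrative sketch (the names, types and structure here are invented for illustration, not taken from any deployed system):

```python
# Purely illustrative sketch: the engagement rule itself is trivial to
# program; the entire difficulty is hidden inside classify().
from enum import Enum

class Status(Enum):
    COMBATANT = "combatant"
    CIVILIAN = "civilian"
    UNKNOWN = "unknown"

def classify(sensor_data) -> Status:
    """Placeholder for the undefined predicate. Cameras, infrared,
    sonar, laser and ladar can at best report 'this is a human';
    they cannot compute combatant status or intent."""
    return Status.UNKNOWN

def may_engage(sensor_data) -> bool:
    """'If civilian, do not shoot', with the benefit-of-the-doubt rule
    of Protocol 1, Article 50: in case of doubt whether a person is a
    civilian, that person shall be considered to be a civilian."""
    return classify(sensor_data) is Status.COMBATANT

print(may_engage(None))  # False: doubt defaults to 'civilian'
```

Nothing in the engagement rule itself is hard; everything rests on a classify() that no current sensing or AI system can implement, which is precisely the problem.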

In a conventional war where all of the combatants wore the same clearly marked uniforms (or better yet, radio frequency tags) the problems might not be much different from those faced for conventional methods of bombardment. But the whole point of using robot weapons is to help in warfare against insurgents, and in these cases sensors would not help in discrimination. This would have to be based on situational awareness and on having a theory of mind, i.e. understanding someone else's intentions and predicting their likely behaviour in a particular situation. Humans understand one another in a way that machines cannot and we don't fully understand how. Cues can be very subtle and there is an infinite number of circumstances where lethal force is inappropriate. Just think of children being forced to carry empty rifles or insurgents burying their dead.

There is also the Principle of Proportionality and again there is no sensing or computational capability that would allow a robot such a determination, and nor is there any known metric to objectively measure needless, superfluous or disproportionate suffering.14 They require human judgement. Yes, humans do make errors and can behave unethically, but they can be held accountable. Who is to be held responsible for the lethal mishaps of a robot? Certainly not the machine itself. There is a long causal chain associated with robots: the manufacturer, the programmer, the designer, the department of defence, the generals or admirals in charge of the operation and the operator.

International Guidelines

There are no current international guidelines for, or even discussions about, the uses of autonomous robots in warfare. These are needed urgently. If there was a political will to use them then there would be no legal basis on which to complain.15 This is especially the case if they could be released somewhere where there is a fairly high probability that they will kill a considerably greater number of enemy combatants (uniformed and non-uniformed) than innocents – i.e. the civilian death toll was not disproportionate to the military advantage.

In this way autonomous robots would be legally similar to submunitions such as the BLU-108 developed by Textron Defense Systems.16 The BLU-108 parachutes to near the ground where an altitude sensor triggers a rocket that spins it upwards. It then releases four Skeet warheads at right angles to one another. Each has a dual-mode active and passive sensor system: the passive infrared sensor detects hot targets such as vehicles, while the active laser sensor provides target profiling. They can hit hard targets with penetrators or destroy soft targets by fragmentation.


The Skeet warhead released by the BLU-108 [Textron Defense Systems]


But the BLU-108 is not like other bombs because it has a method of target discrimination. If it had been developed in the 1940s or 1950s there is no doubt that it would have been classified as a robot and even now it is debatably a form of robot. The Skeet warheads have autonomous operation and use sensors to target their weapons. The sensors provide discrimination between hot and cold bodies of a certain height but, like autonomous robots, they cannot discriminate between legitimate targets and civilians. If BLU-108s were dropped on a civilian area they would destroy buses, cars and lorries. Like conventional bombs, discrimination between innocents and combatants requires accurate human targeting judgements. It is this and only this that keeps the BLU-108 within humanitarian law.17
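The hot/cold-plus-height discrimination described above can be caricatured in a few lines. This is an invented sketch with made-up thresholds, not Textron's actual targeting logic:

```python
def skeet_should_fire(infrared_temp_c: float, profile_height_m: float) -> bool:
    """Invented illustration of dual-mode sensor gating: fire on any
    sufficiently hot object with a roughly vehicle-sized laser profile.
    Nothing here can distinguish a tank from a civilian bus."""
    HOT_THRESHOLD_C = 60.0                  # made-up passive-infrared threshold
    MIN_HEIGHT_M, MAX_HEIGHT_M = 1.5, 4.0   # made-up active-laser profile band
    is_hot = infrared_temp_c >= HOT_THRESHOLD_C
    vehicle_sized = MIN_HEIGHT_M <= profile_height_m <= MAX_HEIGHT_M
    return is_hot and vehicle_sized

# A tank and a bus with warm engines present the same signature to this gate:
print(skeet_should_fire(95.0, 2.5))  # True
```

The gate operates on physical signatures only, which is why the human judgement about where such a weapon is released carries the entire discrimination burden.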

Future Use

To use robot technology over the next 25 years in warfare would at best be like using the BLU-108 submunition – i.e. it can sense a target but cannot discriminate innocent from combatant. But the big difference with the types of autonomous robots currently being planned and developed for aerial and ground warfare is that they are not perimeter-limited like the Skeet. The BLU-108 has a footprint of 820ft all around. By way of contrast, mobile autonomous robots are limited only by the amount of fuel or battery power that they can carry. They can potentially travel long distances and move out of line-of-sight communication.

Imagine the potential devastation of heavily armed robots in a deep mission out of radio communication. The only humane course of action is to severely restrict or ban the deployment of these new weapons until there have been international discussions about how they might pass an 'innocents discrimination test'. At the very least there should be discussion about how to limit the range and action of autonomous robot weapons before the inevitable proliferation.

NOTES

1 Sharkey, N. and Sharkey, A. (in press), 'The Electro-mechanical Robot Before the Computer', Journal of Mechanical Engineering Science
2 Unmanned Aircraft Systems Roadmap 2005–2030, Office of the US Secretary of Defense, 2005
3 Joint Robotics Program Master Plan FY2005, OUSD (AT&L) Defense Systems/Land Warfare and Munitions, 3090 Pentagon, Washington DC 20301-3090
4 The Navy Unmanned Undersea Vehicle (UUV) Master Plan, Department of the Navy, USA, 9 November 2004
5 Unmanned Systems Roadmap 2007–2032, US Department of Defense, 10 December 2007
6 For a more detailed discussion of humanitarian law see Schmitt, M.N., 'The Principle of Discrimination in 21st Century Warfare', Yale Human Rights and Development Law Journal 143, 1999
7 But see also Ford, John S., 'The Morality of Obliteration Bombing', Theological Studies, pages 261–309, 1944
8 Petraeus, D.H. and Amos, J.F., Counterinsurgency, Headquarters of the Army, Field Manual FM 3-24/MCWP 3-33.5, Section 7-30
9 Sharkey, N., 'Cassandra or False Prophet of Doom: AI Robots and War', IEEE Intelligent Systems, Volume 23 No 4, 14–17, July–August 2008
10 Committee on Autonomous Vehicles in Support of Naval Operations, National Research Council (2005), Autonomous Vehicles in Support of Naval Operations, Washington DC, The National Academies Press
11 'Pentagon's "Crusher" Robot Vehicle Nearly Ready to Go', Fox News, 27 February 2008
12 United Press International, 'BAE Systems Tech Boosts Robot UAVs' IQ', Industry Briefing, 26 February 2008
13 Protocol 1 Additional to the Geneva Conventions, 1977 (Article 50)
14 Bugsplat software and its successors have been used to help calculate the correct bomb to use to destroy a target and calculate the impact. A human is there to decide and it is unclear how successful this approach has been in limiting civilian casualties
15 But it seems that, regardless of treaties and agreements, any weapon that has been developed may be used if the survival of a state is in question. The International Court of Justice (ICJ) (1996) Nuclear Weapons Advisory Opinion decided that it could not definitively conclude that in every circumstance the threat or use of nuclear weapons was axiomatically contrary to international law; see Stephens, D. and Lewis, M.W., 'The Law of Armed Conflict – a Contemporary Critique', Melbourne Journal of International Law 6, 2005
16 Thanks to Richard Moyes of Landmine Action for pointing me to the BLU-108 and to Marian Westerberg and Robert Buckley from Textron Defense Systems for their careful reading and comments on my description
17 A key feature of the BLU-108 is that it has built-in redundant self-destruct logic modes that largely leave battlefields clean of unexploded warheads and thus keeps it out of the 2008 treaty banning cluster munitions
