While the idea of ‘killer robots’, or lethal autonomous weapon systems (LAWS) that can select and engage targets without meaningful human control, may sound like science fiction, the development of these weapons is actually closer to reality than we may believe. Just last year, prominent figures such as Stephen Hawking, Elon Musk, Steve Wozniak and Noam Chomsky, along with over 1,000 artificial intelligence experts, signed an open letter calling for a ban on autonomous weapons. The letter warned that such weapons could revolutionise warfare and that their deployment is “feasible within years”. 1

In 2013, a US Air Force B-1 bomber test-launched the prototype of the Long Range Anti-Ship Missile (LRASM), which can select its target and attack it without any human intervention after launch. 2 There is also the Super aEgis II, an automated turret from South Korea that can identify, track and destroy a moving target from four kilometres away. 3 While the turret currently requires a human operator to unlock its firing ability through a computer system, a senior engineer at the turret’s manufacturer has indicated that the Super aEgis II was built with an auto-firing capacity. 4 The procedure of having a human operator unlock the turret’s firing ability was in fact added after requests for a safeguard to be implemented. 5

In light of technological developments that can turn what we believe to be fiction into fact, LAWS have become a topical issue garnering attention around the world. In April this year, 95 states, along with the International Committee of the Red Cross (ICRC), various non-governmental organisations (NGOs) and 34 international experts, participated in the third annual informal experts’ meeting of the United Nations (UN) Convention on Certain Conventional Weapons (CCW) in Geneva. 6 One of the major outcomes of the latest CCW meeting was the adoption of a recommendation to create a Group of Governmental Experts (GGE) in 2017, which would transition the informal meetings into formal diplomatic discussions at the UN. 7

During the CCW discussions, a pressing issue raised was the lack of common understanding as to exactly what LAWS are or how they should be defined. 8 There appears, however, to be a consensus that fully autonomous lethal weapon systems do not yet exist 9 (although the emergence of weapons such as the LRASM and the Super aEgis II suggests that we already have the technology to create them). As such, discussions have largely been speculative about how LAWS might develop. Nevertheless, more than 50 NGOs have called for a pre-emptive ban on LAWS. 10

Unlike previous calls for weapon control, the call to ban LAWS is more complex, as it does not focus on a particular weapon in existence or in use but instead on the possible future development of a technology. At its core, it asks whether the human decision to kill should be allowed to be delegated to an autonomous weapon. 11 This does not mean, however, that it is impossible to impose a ban before a technology exists. In the past, blinding laser weapons and all forms of human cloning were banned, fundamentally on moral, ethical and humanitarian grounds, before the respective technologies existed or were in use. 12

Nonetheless, as was pointed out at the CCW meeting, there needs to be a clear working definition or characterisation of what an “autonomous weapon” is before there can be meaningful discussion in this area. 13 Indeed, since the first CCW informal experts’ meeting in 2014, the issues of definition, human control, responsibility and the legal review of LAWS have been on the agenda of the UN conferences. 14

The currently available definitions of autonomous weapon systems (AWS) include that of the United States Department of Defense (DOD), which describes AWS as weapon systems that “once activated, can select and engage targets without further intervention by a human operator”, and which may also include human-supervised autonomous systems. 15 Human Rights Watch defines them as weapons that can select and engage targets without meaningful human control. 16 While these definitions may allow further and more robust discussion of LAWS and whether they should be banned, a more detailed legal definition will have to be agreed upon in the future.


As political scientist Peter Singer once said in an interview, “Revolutionary technologies are game-changers, not because they solve all problems, but because they force new questions upon us that a generation earlier people did not imagine we would be asking ourselves, or our respective organizations or nations.” 17 The development of weapons that may be autonomous in their activation, selection of targets and initiation of lethal force is undoubtedly revolutionary for modern warfare. As Singer suggests, this new possibility raises questions about the potential moral, psychological and legal implications of LAWS, and now is as good a time as any to start answering them.

Mandy Lim


1 Smith, D., 2015. Stephen Hawking, Elon Musk warn of 'third revolution in warfare' with autonomous weapons. ABC News [Online]
Available at: http://www.abc.net.au/news/2015-07-28/stephen-hawking,-elon-musk-warn-of-ai-weapon-arms-race/6652466

2 Markoff, J., 2014. Fearing Bombs That Can Pick Whom to Kill. The New York Times [Online]
Available at: http://www.nytimes.com/2014/11/12/science/weapons-directed-by-robots-not-humans-raise-

3 Parkin, S., 2015. Killer Robots: The Soldiers That Never Sleep. BBC News [Online]
Available at: http://www.bbc.com/future/story/20150715-killer-robots-the-soldiers-that-never-sleep

4 Ibid.

5 Ibid.

6 Vilmer, Jean-Baptiste Jeangène, 2016. Autonomous Weapon Diplomacy: The Geneva Debates. Carnegie Council Ethics and International Affairs [Online]
Available at: https://www.ethicsandinternationalaffairs.org/2016/autonomous-weapon-diplomacy-geneva-debates/#fn-11361-1

7 Ibid.

8 Ibid.

9 Ibid.

10 Ibid.

11 Asaro, P., 2012. On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making, in Special Issue on New Technologies and Warfare. International Review of the Red Cross, 94(886), pp. 687-709, 606.

12 O'Connell, Mary Ellen, 2014. Banning Autonomous Killing: The Legal and Ethical Requirement That Humans Make Near-Time Lethal Decisions, in The American Way of Bombing: Changing Ethical and Legal Norms from Flying Fortresses to Drones (Matthew Evangelista and Henry Shue eds.) (Excerpted), 233.

13 Vilmer, Jean-Baptiste Jeangène, 2016. Autonomous Weapon Diplomacy: The Geneva Debates. Carnegie Council Ethics and International Affairs [Online]
Available at: https://www.ethicsandinternationalaffairs.org/2016/autonomous-weapon-diplomacy-geneva-debates/#fn-11361-1

14 Ibid.

15 United States Department of Defense, Autonomy in Weapon Systems, 1, 13-14.

16 Human Rights Watch, 2016. Killer Robots: The Case for Human Control. [Online]
Available at: https://www.hrw.org/news/2016/04/11/killer-robots-case-human-control.

17 Interview with Peter W. Singer: Director of the 21st Century Defense Initiative at the Brookings Institution

(2012) International Review of the Red Cross, 94(886), pp. 467–481, 469.