In 2021, urgent action is needed to ensure meaningful human control over the use of lethal force, specifically over the selection of human targets and the decision of whether or not to attack. A clear legal standard is needed to prohibit the use of artificial intelligence (AI) in weapons that autonomously select targets and fire, dehumanising killing and crossing a moral red line.

2020 was an unprecedented year. Whilst the Campaign to Stop Killer Robots continued its efforts internationally and within Australia, diplomatic progress stalled amid uncertainty over whether meetings could be held. In Australia, the development of autonomy in weapons continues to advance in the absence of any commitment to maintain meaningful human control over the use of weapons.

In early January 2020, the Royal Australian Air Force announced its new ‘Loyal Wingman’ project, allotting $40 million to develop three autonomous combat aircraft prototypes. This is concerning because Australia has failed to recognise limits or demarcate what is unacceptable in weaponising AI.

Certainly, the Department of Defence and the Australian Defence Force point to some of the complexities. For example, at a defence sector workshop held as part of a CSIRO initiative on robotics, Army Lieutenant Robin Smith cited concerns including cyber risks, questions of ‘trust’ in machines and “ethical issues associated with autonomy and what we will or will not automate.” Yet, whilst many countries agree that human control is the critical issue, Australia will not engage faithfully with this notion, nor acknowledge the need for humans to decide who to target and whether to attack.

The 2020 Defence Strategic Update also highlighted investment in autonomous capabilities, identifying autonomous systems as an important part of emerging weaponry and setting out a range of capabilities to be developed.

In November 2020, the Concept for Robotics and Autonomous Systems in Defence was published. Its approach to human control is divorced from the global understanding now taking shape and would fail to ensure genuinely moral and legal conduct. Rather, it works to justify autonomous targeting and the use of force in the absence of meaningful human control. At the same time, research conducted at UNSW Canberra showed that ADF trainees were uncomfortable operating alongside autonomous systems.

As for diplomatic progress, the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) is the dedicated UN meeting forum, within the apparatus of the Convention on Certain Conventional Weapons (CCW). There is still a lack of clarity concerning sessions throughout 2021; however, the end of this year marks an important review date for the process. In 2020, only one of the two scheduled five-day sessions took place. The GGE convened on 21-25 September, with discussions on international humanitarian law, characterising LAWS, the human element and possible options forward.

Australia’s disingenuous contribution to the talks mirrored its submission prior to the meeting, which detailed a complex “System of Control” whilst avoiding the crux of the issue. In the meetings, Australia was quick to restate its position that, in the absence of a common definition, discussion of a treaty is premature. Australia spoke in fewer than half of the discussion areas but continued to emphasise the fallible notion of control ‘over the entire lifecycle of the weapon’.

The Australian delegation drew attention to the importance of compliance with international law, asserting that this could be achieved through doctrines or manuals within rules of engagement, targeting directives and standard operating procedures. These would be effective avenues for implementing a commitment to meaningful human control, but Australia is yet to recognise that clear requirement. Whereas the majority of countries are putting forward their views and calling for negotiations, Australia is stalling. Whilst it agrees to advancing discussions and building more common understandings, this falls short of the urgent action needed.

In 2021 we must keep the pressure building to ensure meaningful human control over weapons use, bring about an international treaty, and prohibit lethal autonomous weapons that would see humanity cross moral, ethical and legal red lines.

*A more in-depth version of this article, including a summary of Campaign to Stop Killer Robots (Australia) efforts and activities and a 2021 outlook, is forthcoming in MiddleGround Issue 1, to be released later this month – stay tuned.

Follow the Campaign: Twitter @ban_kr_aus | Facebook: /AusBanKillerRobots

Mailing list sign up and more info: https://safeground.org.au/what-we-do/campaign-to-stop-killer-robots/