By Matilda Byrne

In 2021, we need urgent action to ensure meaningful human control over the deployment of lethal force, specifically in the selection of (human) targets and the decision whether or not to attack. A clear legal standard is needed to prohibit the use of artificial intelligence (AI) in weapons that autonomously select targets and fire, a practice that dehumanises killing and crosses a moral red line.

In 2020, despite the unprecedented nature of the year, efforts to realise such a treaty remained strong both internationally and in Australia. To begin last year's campaigning, SafeGround led an action at Parliament House in Canberra urging MPs and Senators to join the calls for a ban on lethal autonomous weapons systems (LAWS). We ran a concurrent mailing initiative and held individual briefings with politicians.

SafeGround returns to Parliament in March to draw attention to this issue, which is needed more than ever in light of Australia's continued advancements in weaponising AI. The Department of Defence, the Australian Defence Force (ADF) and the Department of Foreign Affairs and Trade (DFAT) have all shown a consistent and repeated avoidance of good-faith engagement with this issue.

In early January 2020, the Royal Australian Air Force announced its new 'Loyal Wingman' project, allotted $40 million to develop three autonomous combat aircraft prototypes. This is of concern: Australia is pushing ahead with these capabilities without limits on, or commitments to, human control.

A workshop series for CSIRO's 'The Robotics Roadmap for Australia V2' included a 'Defence Sector' session which offered new insights. Army Lieutenant Robin Smith cited concerns including cyber risks, questions of 'trust' in machines and "ethical issues associated with autonomy and what we will or will not automate." Yet there was no mention of demarcating what is or is not acceptable. Royal Australian Navy Commander and Lead for Autonomous Warfare Systems Paul Hornsby said of assessing an autonomous platform, "there were times when things are so busy that it is beyond human endurance or human response time, and you really want to crank up the robotics and crank up the AI and there are other times where you would draw it back."

This ambiguity was also heard during questions on this issue at the 2019 Senate Estimates, and it is troubling given the development taking place in Australia. The 2020 Defence Strategic Update also highlighted investment in autonomous capabilities. It stated that in Australia's changing strategic environment, "emerging and disruptive technologies will be rapidly translated into weapons systems, including autonomous systems…reducing decision times and improving weapon precision and lethality."

The update identifies a range of autonomous systems to be developed as part of Australia's capabilities: autonomous air vehicles and aerial systems, uncrewed surface and underwater systems, and autonomous land vehicles with a "coordination office for the implementation of robotics and autonomous systems across the land force." The air domain is the other key area, where "new and existing aircraft will combine with remotely piloted and autonomous systems to provide increased lethality and survivability."

Amid these advancements, a clear policy requiring human control over the critical functions of weapons is essential to ensure morally responsible and legal development. Defence has been conducting work on how to use AI ethically. Whilst this is a welcome endeavour, the output falls drastically short. Ethical guidelines were outlined in "A Method for Ethical AI in Defence", a new paper published this week by the Defence Science and Technology Group (DSTG). It fails to acknowledge that removing human control from the critical functions is fundamentally unethical, instead reiterating that "the point of interface through which control is achieved will vary."

In November 2020, The Concept for Robotics and Autonomous Systems in Defence was published. Its approach to human control is divorced from the global understanding now taking shape and would fail to ensure truly moral and legal conduct. Rather, it works to justify autonomous targeting and the deployment of force in the absence of meaningful human control. At the same time, research conducted at UNSW Canberra showed ADF trainees were uncomfortable operating alongside autonomous systems.

As for the diplomatic process, progress is currently uncertain, with a lack of clarity around the timing of relevant meetings throughout 2021. The Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems is the dedicated UN forum on this issue, sitting within the apparatus of the Convention on Certain Conventional Weapons (CCW). Of the two five-day sessions scheduled for 2020, only one took place: the GGE convened on 21-25 September, with discussions on international humanitarian law, the characterisation of LAWS, the human element and possible options forward.

Australia's disingenuous contribution to the talks mirrored its submission prior to the meeting, which detailed a complex "System of Control" whilst avoiding the crux of the issue. In the meetings, Australia was quick to restate its position that, in the absence of a common definition, discussion of a treaty is premature. It spoke in fewer than half of the discussion areas but continued to emphasise the fallible notion of control 'over the entire life-cycle of the weapon'. It drew attention to the importance of compliance with international law, asserting that this could be achieved through doctrines or manuals within rules of engagement, targeting directives and standard operating procedures. Such mechanisms would indeed be a good way to implement a commitment to maintaining meaningful human control, without which, as the ICRC and others have explained, the use of these weapons is unlawful.

The Australian Government agrees that the eleven guiding principles adopted by countries at earlier talks are not an end in themselves, and says it is committed to advancing discussions, building on common understandings and narrowing the divergence of views within the CCW. However, this falls short of the action needed. Whilst the majority of countries are putting forward their views and calling for negotiations, Australia is deliberately stalling in place.

With COVID-19 stalling the diplomatic talks, it is even more important that civil society highlight the imperative to act in this area. SafeGround has continued to lead these efforts in Australia with the support of many other organisations and individuals.

SafeGround set out to capture the Australian context across multiple sectors and dimensions in a report launched in September via a webinar. Australia Out of the Loop: why we must not delegate decision making from man to machine has been well received and is a critical addition to the landscape in Australia. SafeGround also launched a podcast series, Stay in Command, with episodes highlighting various experts and perspectives on these issues. They include Australian voices as well as international perspectives, among them others from the global coalition of the Campaign to Stop Killer Robots.

In February 2020, prior to COVID, SafeGround was represented at the global meeting of the campaign. During this time an event was also held at ThincLab at the University of Adelaide for staff and students. Our engagement with the university sector developed further in 2020 through the work of Yennie Sayle, who completed an internship with SafeGround focused on the campaign's youth engagement. A new webpage was launched, an InstaLive Q&A session held, a survey conducted, and actions outlined for students. Yennie also represented Australia at the Virtual Global Youth Conference on this issue, and you can watch the highlights video.

We look forward to holding more events, both online and in person, throughout 2021 for students, organisations and the general public. If you, your organisation, group, community or members are interested in engaging with this issue, please get in touch. In 2021 we must keep the pressure building to ensure meaningful human control over weapons use, bring about an international treaty and prohibit lethal autonomous weapons, which would see humanity cross moral, ethical and legal red lines.