Fully autonomous weapons, those which select and engage targets in the absence of human control, give rise to moral, ethical, legal and security threats. Efforts to introduce a ban on these ‘killer robots’, also referred to as ‘lethal autonomous weapons systems’ (LAWS), continue to grow globally. World governments, civil society groups and the global public are concerned that without swift action, humanity will cross a moral red line.

Diplomatic discussions regarding LAWS stagnated as a result of the COVID-19 pandemic. Meetings of the ‘Group of Governmental Experts’ addressing this issue have been scheduled for 2021 but still face uncertainty. The Group is working towards recommending a “normative and operational framework” at the end of this year. The only policy outcome that would adequately respond to this urgent issue is a legal instrument (such as a treaty) combining prohibitions and positive obligations to preserve meaningful human control over the use of force.

The Australian Government has continued to assert that a ban would be premature. Yet fully autonomous weapons pose grave risks, and amid rapid global advancements and Australia’s own development of autonomous weapons, hard law articulating the need for meaningful human control is essential to establish new norms. Innovation must proceed within limits, with clear delineation of what is and is not acceptable morally, ethically and legally.

As a principled global actor, Australia should adopt a clear policy to maintain meaningful human control over autonomous weapons and rule out LAWS, and let that policy guide its advancements in AI for defence. Instead of a principled rejection of LAWS, however, we hear troubling comments, such as the suggestion that with potential LAWS of the future there may be “broad comfort in the use of an ethical system.” This statement and other views of the Australian Government were offered during a roundtable on LAWS hosted by the ANU College of Law in March and captured in the Chair’s Summary, a synthesis of the discussion between government, the Australian Defence Force, industry, civil society, international organisations and academia.

The Chair’s Summary highlights many of the risks associated with LAWS that are seen as ‘new and abnormal’. These include algorithmic bias, the ‘black box’ problem, opposing algorithms creating unforeseen feedback loops, and the catastrophic consequences of machine error compounded by high operating speeds, not to mention the issue, reinforced by many attendees, of handing life-and-death decisions over to machines.

We welcome domestic initiatives such as developing a ‘system of governance’ for the use of these technologies. However, they must also be informed by international norms, laws and regulations. Establishing new legal rules, as the ICRC put it in its recent position statement, is crucial for the international community: a necessary step in responding to technological advancement, setting limits on autonomy in weapons and protecting humanity.

As the Hon Dr Andrew Leigh MP put it in his speech to the House of Representatives earlier this year:

Australia, however, has taken a somewhat opaque approach to lethal autonomous weapons, despite working on autonomous weapon systems…It is tempting to think that new technologies can just be used by the forces of light but, as past experience has shown, we need to regulate based on the assumption that the most nefarious actors and the most dubiously moral states will use such weapons. Those are the grounds for bans on poison gas and torture, and that’s the principle on which Australia should approach lethal autonomous weapon systems. 

He called on the government to “engage more fully with the Australian people and the international community on the critical debate over the regulation of lethal autonomous weapon systems.” 

Dr Leigh’s speech, ‘Autonomous weapons systems are the modern landmines’, also recalled UN Secretary-General Antonio Guterres’ comment:

For me there is a message that is very clear — machines that have the power and the discretion to take human lives are politically unacceptable, are morally repugnant, and should be banned by international law. 

He then acknowledged the work of the Campaign to Stop Killer Robots in Australia. Our civil society movement continues to raise awareness of this issue among the public, across sectors of society and with policymakers, highlighting the need for a new legally binding instrument and urging Australia to support the calls for negotiations. The Group of Governmental Experts’ June session was postponed, but an informal online ‘exchange of views’ will take its place. Countries had been invited to provide written contributions on what form the ‘framework’ should take. Regrettably, we cannot expect Australia to join those leading efforts in proposing a treaty-based solution, prohibitions on LAWS or obligations around meaningful human control, despite such regulation being critical and in our own interests.

For an overview of Australia’s autonomous weapons development and its role in diplomatic discussions to date, see the Middle Ground Issue 1 article ‘SafeGround Calls For Red Lines Amid Australia’s Autonomous Weapons Development’.

If you, your organisation, your group or your community are interested in engaging with this issue, please get in touch. In 2021 we must keep the pressure building to ensure meaningful human control over the use of weapons, bring about an international treaty and prohibit lethal autonomous weapons, which would otherwise see humanity cross moral, ethical and legal red lines.

** This piece was written and published prior to the release of the written contributions available HERE, including Australia’s statement and joint paper, and prior to the informal exchange of views held from June 28 to July 22. Commentary is forthcoming on Australia’s latest participation and the progress of the Group of Governmental Experts, whose next formal session is scheduled for August.


Matilda Byrne is National Coordinator of the Campaign to Stop Killer Robots – Australia, based at SafeGround. The Campaign urges Australian policymakers to ensure meaningful human control over the deployment of lethal force with respect to the weaponisation of artificial intelligence in the Australian defence context, and to be a leader in diplomatic disarmament efforts on the issue.

https://safeground.org.au/what-we-do/campaign-to-stop-killer-robots/