John Rodsted interviewed Paul Barratt last year about his views on lethal autonomous weapons. Paul Barratt has had a long career in Australia's Public Service, beginning in 1966, but what distinguishes him from many others within government and the public sector is his strong conscience. He has held many senior roles within government, notably within the Department of Foreign Affairs and Trade and the Department of Industry, Science, Energy and Resources, as well as at the Business Council of Australia. He was Secretary of the Department of Defence from 1998 to 1999. It was this senior role in the Department of Defence that put him at odds with government positions and policy, and this led him to leave the Public Service. Since then, he has been very vocal about how and why Australia goes to war and about the fact that so few people have the power to commit Australia to war. He is also one of the founders and the current President of Australians for War Powers Reform.
An Insider To Policy And Decision Making
John Rodsted: When you entered the Department of Defence in ‘66, it was in the early days of Australia entering the Vietnam war. You were in the Department of Defence during the war. How were Australia’s policies and actions shaped then, and by whom?
Paul Barratt: The policy to go into Vietnam was shaped very much by the Prime Minister, Menzies, himself. I was in the fortunate position of being just one year too old to be called up in the first draft for Vietnam, but some of my university friends were conscripted and sent off to fight in a war in which we should never have participated.
John Rodsted: As a Public Service insider, you became privy to how decisions were made. This was not always a fair and honourable process. What kind of policy was Australia driven toward?
Paul Barratt: If we stick to the Defence domain, quite often the real discussion wasn't around whether or not we should be involved in a war, but how we would get involved. The Prime Minister would make a decision that Australia should fight alongside our American allies. Then the first thing that would come to Cabinet would be: what form will this assistance take? There is too much power in too few hands at the beginning.
John Rodsted: Would it come down to the US effectively insisting that we enter a war in support of them, and as long as the Prime Minister agreed to that, then we were committed?
Paul Barratt: Actually, it’s worse than that John. More often our Government would insist on participating in a war to which Australia was not invited. That was very much the case with Vietnam. Our Government persuaded the US that Australia should be involved. The US military was not particularly enthusiastic because they find it easier to fight alone and feel that they have the capability to do so. That turned out to be wrong in most cases, but they feel they can do it. The American political system likes to have some extra flags on the poles to show they are involved in a major coalition. The same thing happened with Iraq and Afghanistan. John Howard volunteered Australia into those wars. The Americans didn’t ask us.
John Rodsted: So with any dissent that may be either within government or within Parliament, how are those voices then heard?
Paul Barratt: With great difficulty. There is unlikely to be dissent within the government when the threshold decision has already been made. Backbenchers feel that if Australia is going to war, their job is to support the government and support the troops in the field. When the first contingents went off to Iraq, Opposition leader Simon Crean, whose party was opposed to the war, took great care to distinguish between being opposed to the war and wishing the troops well. In other words: We support our troops in harm’s way, but we don’t think we ought to be there. That’s a pretty difficult thing to navigate.
As for Parliament, that depends on whether the government permits the matter to be debated at all. We committed ourselves to Afghanistan in 2001, and the very first parliamentary debate on the war was in Julia Gillard’s time.
The Australians For War Powers Reform
John Rodsted: You are a strong advocate for changes to how we go to war. You helped form, and now chair, Australians for War Powers Reform. What do you want to see change?
Paul Barratt: The organization had its origins in the 2012 campaign for an inquiry into the Iraq war. Our first objective was to get something like the Chilcot inquiry going (named after Sir John Chilcot, who chaired the UK inquiry): to find out how decisions [to go to war] were made and what could be learned from that process. But the real aim was to use this as a case study on why the power to deploy the ADF into international armed conflict ought to be relocated to the Parliament. We knew a lot about how the decisions had been made and were able to infer more through research. By putting various bits and pieces together, we hoped to achieve an open public inquiry, which would demonstrate that our decision-making processes were flawed and that it was too dangerous to leave them in the hands of a small number of people.
What we want is for the power to send Australian troops into armed conflict to be located in the Parliament. It should be a decision taken only when the Parliament, and preferably both Houses, has given consent.
John Rodsted: If you take the decision away from the Prime Minister, remove the so-called captain’s call, wouldn’t it take too long to respond to any threats in a real timeframe?
Paul Barratt: No, that's a great misapprehension. Most of the Australian Defence Force, quite rightly, is held in a relatively low state of readiness. Troops have been training and doing practice manoeuvres, but to get your equipment into a fighting state requires a lot of preparation. For example, when we went to Timor, Admiral Barrie and I advised the National Security Committee of Cabinet, in February 1999, that we ought to get ready to deploy to Timor. The plebiscite was looming, and we could see there might be a breakdown in the situation there. They were finally ready to deploy in September. It took seven months and the expenditure of almost $300 million to get everything operational and to get Commanders ready for operations in the field. We have a ready reaction force in Townsville, which is basically a battalion and associated elements. I would be quite happy to have a framework in which anything the ready reaction force could handle could be done without going to the Parliament first, because that would be an emergency-type situation. But anything that required a larger deployment ought to be debated and authorised in Parliament.
John Rodsted: If the decision had to go through Parliament, couldn’t it get held up by minor parties or in the Senate, just people being divisive because they can, playing politics with the decision?
Paul Barratt: That’s an argument we often hear. If there was a genuine threat and the major opposition party agreed with the government, the minor parties would have no role at all. So that concern is just that it might make it difficult for the government to engage in wars of choice. And of course, that is the whole point.
John Rodsted: And I suppose that separates it perfectly between threat and adventure. One: the Parliament is going to respond to a real threat against Australia and Australia's interests. The other is getting involved in an adventure that has nothing to do with us. That would be the difference.
Paul Barratt: To put it brutally, I would say to whoever is in government, if you can’t persuade the opposition that our national security is threatened, we ought not to go.
John Rodsted: If the party that was in power at the time had access to secret intelligence that they could not talk about, how would they deal with this?
Paul Barratt: There are a couple of ways you could deal with that. It is an argument we often hear, and sometimes it's a bit hard to keep a straight face. When we reflect on the 'Weapons of Mass Destruction' issue in Iraq, it turns out they did not exist. Everybody knew they did not exist. Hans Blix, the United Nations chief weapons inspector, certainly knew they did not exist.
But let’s take your question at face value. There are a couple of things you could do. In any national security situation, the Government will brief the leader of the Opposition, in private, and in secret. That happened in relation to operations in Syria. Additionally, there could be a proper national security intelligence committee in the Parliament, in which members of the committee were security cleared to receive all the information available. That way all parties are involved in looking at the available evidence. They could go into the Parliament and say: We have seen the intelligence and we are convinced.
It's rare that secret intelligence is the only thing you have. Very often there is information in the public domain as well. In fact, I think most intelligence agencies should devote more effort to analysing what is in the public domain because you can learn a lot from that. An option always available to the Government would be to say: "Here is what you're seeing in the public domain; our secret intelligence bears out what we've concluded from the open-source material." If there is a will, you can certainly find a way to navigate through that real difficulty of how to handle secret intelligence.
John Rodsted: The secret intelligence effectively just becomes a confirmation of what is a greater information stream. What kind of support have you had for your organization’s aims and ideals and where should it go from here?
Paul Barratt: We've had support from various members of political parties and a lot of public support. The most tangible support we've had from a political party is a resolution passed on the floor of the ALP's national conference in 2018, in Adelaide, that an incoming Labor government would establish an open public parliamentary inquiry into how we go to war. I think that was a very positive step. They are not pre-committing themselves to change the way we go to war, but they are committing themselves to establishing the facts. It would give those who are seeking a change the opportunity to put their case. It would mean addressing the arguments we've just discussed and defending them in an open forum. So we would end up with a more honest debate.
Another argument used against us is that it really would not make any difference because everyone would just vote on party lines. However, I think that when members are asked to take responsibility for something that would involve death and destruction on both sides, for putting the young men and women of the ADF in harm's way and for inevitably involving civilian casualties, you would end up with a conscience vote. I do not think you can assume that everybody would vote on party lines. If we have a parliamentary inquiry, we can tease all these arguments out.
I'd like to see that commitment form part of the ALP platform, and I very much hope that any incoming Labor government would proceed along those lines. Our movement would like to persuade all the major political parties that this is a desirable change.
John Rodsted: You've had some pretty good support from some fairly major players within the Australian Government and formerly within the Australian Department of Defence. Can you talk a little about the opinions of some of the others who are involved in your organization and why they think it's a good idea to change the threshold for going to war and the captain's call?
Paul Barratt: Well, I think we are unanimous in feeling that responsibility for a decision of this kind should rest with the Federal Parliament and that it ought to be debated and fully thought through. One of the things that does not happen when it is just decided by Cabinet or by the Prime Minister is a proper analysis of the legality of going to war. What we would all like to see, before Parliament takes a decision, is the Attorney-General or Solicitor-General tabling a formal written opinion about the legality of the war. The best legal opinion we have about the Iraq war is that it was illegal. No one takes very seriously the reliance we placed on old UN Security Council resolutions that were passed for another purpose. Beyond our own movement, we've had people, such as the former Chief of Army, saying that this change ought to take place.
An Artificial Intelligence Arms Race?
John Rodsted: Can we shift the discussion a little towards the new arms race that is starting: the development of killer robots? Just the talk of killer robots sounds like a bad dream, but they are real, and governments worldwide are developing and investing in them. What do you understand these to be, and how would they be deployed on the battlefield or, for that matter, in urban environments?
Paul Barratt: I think the word robots conjures up, in the public mind, things that might move along the ground and maybe have arms and legs. However, what we are really talking about is any kind of lethal autonomous weapon, which very often would be a more advanced form of an armed drone. Such a device would have its own decision-making capability and would take human agency out of the decision to launch a lethal strike.
Now it becomes a little bit fuzzy. I was reading this morning about someone from the US Army talking about the progress they are making with lethal autonomous weapons and insisting that they will never take human agency out of making the decision. However, the way these weapons work is that they have a collection of sensors that bring a lot of data together and then make a recommendation. That recommendation would include which weapon would be best to use for a particular purpose and where it was located. This US Army spokesperson was talking about reducing the decision-making time, from information coming off the sensors to someone pressing a button, from twenty minutes to twenty seconds. Twenty seconds doesn't sound to me like a lot of time for someone to make a considered decision to launch a lethal attack. The word meaningful comes into it. You need meaningful human intervention, not just the fact that a human being is somewhere in this highly automated chain. The importance of humans having control is that some very important decisions need to be made: who is to be attacked; is this attack militarily necessary; is it proportionate to what has happened, or to what you think is about to happen? I would have no faith at all in the ability of people to program an autonomous weapon to make those decisions without the potential for great risk and tragedy.
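As a purely illustrative aside on the twenty-second window Barratt describes, the short Python sketch below models that kind of pipeline: sensor readings are fused into a confidence score, the machine produces an engagement recommendation, and the "human in the loop" must respond inside a fixed window. All names, thresholds and timings are assumptions for illustration only, not drawn from any real system; the point is simply that the checks international humanitarian law requires cannot be made in the time allowed.

```python
# Illustrative sketch only: hypothetical names and numbers, not any real system.
# Pipeline described above: sensor fusion -> machine recommendation ->
# a human who must respond within a fixed approval window.

APPROVAL_WINDOW_SECONDS = 20  # the compressed decision time mentioned in the interview


def fuse_sensors(detections):
    """Average per-sensor confidence scores into one track confidence."""
    return sum(detections) / len(detections)


def recommend(track_confidence, location):
    """Machine recommendation: which weapon to use and where (illustrative)."""
    if track_confidence > 0.7:  # assumed confidence threshold
        return {"weapon": "armed_drone", "location": location,
                "confidence": round(track_confidence, 2)}
    return None


def operator_decision(recommendation, seconds_available):
    """Stand-in for the 'human in the loop'.

    The checks a commander is meant to make (distinction, military necessity,
    proportionality) each take genuine deliberation. They are modelled crudely
    here as one minute apiece, so with a 20-second window the only defensible
    default is to refuse the strike.
    """
    required_checks = ("distinction", "military necessity", "proportionality")
    seconds_per_check = 60  # assumption for illustration
    if seconds_available < seconds_per_check * len(required_checks):
        return False  # window too short for meaningful human control
    return True


if __name__ == "__main__":
    confidence = fuse_sensors([0.9, 0.6, 0.8])  # simulated sensor outputs
    rec = recommend(confidence, location="(illustrative grid reference)")
    if rec and operator_decision(rec, APPROVAL_WINDOW_SECONDS):
        print("Strike approved:", rec)
    else:
        print("No strike: the approval window does not allow a considered decision")
```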
John Rodsted: I think you have hit on something very important there: reducing the response time from twenty minutes to twenty seconds would bring the decision down to an operator and would take it away from a Commander. It would take it away from someone in charge of a force and give it to someone sitting behind a console somewhere. It would also reduce the legal framework in the decision-making process. Would that be correct?
Paul Barratt: It gets harder and harder to say who is responsible under international law for the fact that innocent people get killed. I think it illustrates the difficulty, both with the delegation of authority, and also with the ability to discriminate in the identification of targets.
I remember a case, it was probably 10 years ago in Afghanistan where a group of Afghans from a remote village in an area under surveillance were killed by an armed drone operated by someone with a joystick in Tampa, Florida. This group of Afghans was coming from a remote village to the nearest town. They left before dawn for what was a long journey. There were four or five guys in the back of a utility and someone driving. Halfway through the journey, a young man in Tampa blew them all away with an armed drone. They were just innocent visitors. One was going to visit the local doctor and one was going to get a prescription filled at the pharmacy. The drone operator was asked:
“Why did you press a button?”
“Because I could tell they were terrorists.”
“How did you know they were terrorists?”
“Because when the sun came up, they all got their prayer mats out of the back of the utility and faced Mecca and prayed. So therefore I knew they were terrorists.”
There are two things about that. Firstly, even with considered human intervention, the human being made a catastrophic error of judgment because he didn't know enough about the local culture. Secondly, how would you program an autonomous weapon not to make that mistake? I just don't believe that it can be done.
We have seen many tragedies in places like Iraq and Afghanistan. Once a wedding party was blown away because people started firing their rifles in the air. It is customary, once a couple is united in holy matrimony, for the shout to go up, "Let's all fire our rifles in the air", and then someone blows them away because they're firing rifles. The old saying in the IT industry applies: 'garbage in, garbage out'. What these drones do autonomously will very much depend on the knowledge and skill of the people who are programming them.
John Rodsted: I think that points to the question of how they identify who the so-called enemy is on a battlefield, because they cannot identify who the friendlies are. It is easy to put a marker on your own troops, so you can see your own layout of the battlefield. But all that does is say that everything else living there is the enemy. Civilians, combatants, livestock: I can't see how they would be able to separate them.
Paul Barratt: Neither can I. What we all await is the Brereton report on Afghanistan. I think what you are seeing in Afghanistan is people who are weary after almost 20 years of fighting. There is a situation where people no longer know who the enemy is.
A farmer standing in his field may be a genuine farmer standing in his field. He may also have a rifle by his feet ready to shoot you; but, again, he may be an innocent person going about his normal business. You have to decide whether to kill him or leave him alone. I just cannot see that autonomous weapons are going to be an advantage in situations like this.
An Algorithm Mess
John Rodsted: There was an interview recently with Dr Lizzie Silver, an Artificial Intelligence developer. The one thing that she really pointed out was how messy and incapable AI is, especially when AI starts competing against other AI. It can turn into an algorithmic mess that comes up with no real functional solutions. Her point was that, by its very nature, unless you have a human ready to say, 'Hang on, this is turning into nonsense', the AI will keep going down a path where it is always trying to achieve its goal, even though that goal might not be achievable.
This brings us to the point of whether these things are hackable or not. What would be the consequence if somebody managed to hack into the system?
Paul Barratt: Well, it would be a brave person who would insist that they had created something that was not hackable. Recent history is full of information that has been released via hacks. We know that all of the world’s leading powers are looking at how to hack each other’s IT systems. All you can ever do is say that we can’t think of any way it can be hacked. It would be very complacent to say that you had produced something that was not hackable.
It seems that countries are starting to invest in the development of military AI technology, and what that is really going to do is start a new arms race. That would be expensive. I could imagine a situation where governments are spending a significant share of GDP each year buying upgrades and new weaponry. This would put a lot of stress on the Australian purse and leave less to spend on what should be the priorities of the Australian Government: education, health, infrastructure and so on.
John Rodsted: Have you got some comments about how Australia has become involved in arms races at our level?
Paul Barratt: I don't think we've had a lot of experience of it. For most of the post-war period, our Defence Force operated at a higher technical level than our neighbours. That is not the case anymore. Whenever Australia puts an emphasis on self-reliant defence capability, we think we need to control the air and sea approaches to Australia. That puts us into an implicit arms race, because as other countries' capability to reach Australia increases, we have to do more to maintain control. I think we are now certainly in an air combat arms race. We committed ourselves almost twenty years ago to the Joint Strike Fighter, the F-35. People now tell me that the Russian-sourced equipment used by neighbouring countries is more capable than that. So we might be in an arms race anyway.
Picking Up The Tab
John Rodsted: With the Prime Minister having the sole responsibility at present to commit us to war, does that also put sole responsibility for the cost of the war in the hands of the Prime Minister? Is it only that person who decides that we are going to spend a lot of our national revenue on going to a war, or does that get checked by the Parliament?
Paul Barratt: In practice, it puts it in the hands of the Prime Minister, because whilst the constitution provides that the Federal Government can’t spend any money that hasn’t been appropriated by the Parliament, it is difficult to envisage a situation in which the Prime Minister would commit us to combat and the Parliament would refuse to vote the money. Because that would leave the troops without resources. Once we are committed, the Parliament has to approve whatever funding executive government says it needs to sustain that combat.
John Rodsted: The military, by its very nature, is always in the business of acquiring weapons that are going to give it the advantage. If we went down the path of building an arsenal of lethal autonomous weapons, do you think the very fact that we had them would lower our threshold for combat? And I'm not talking from the Prime Minister's perspective. If you're a commander in the field and you have artificial intelligence drones at your disposal, would your threshold for deciding to engage be lower or higher?
Paul Barratt: I think it would be a lower threshold. Once we have them in our inventory, they would be used. It would be hard for anyone in the civilian space, even our political leaders, to tell the Chief of the Defence Force not to use weapons that, in his military judgment, the troops need to achieve what they've been sent to achieve.
Regulation
John Rodsted: That brings us to the discussion about regulation, and there are several benchmarks for proportionality in weapons. A couple off the top of my head: poison gas, where, after WW1, when we saw what a nightmare it created for people who were gassed, a convention was created in 1925; and blinding laser weapons, which had the ability to blind anybody on a battlefield and were banned in 1995, before the technology was ever deployed in war. Then there are the two pragmatic cases: the landmines treaty of 1997 and the cluster bombs treaty of 2008. We do have a history of looking back, or even looking forward in the case of the blinding laser weapons, and choosing either to eliminate a functional weapon system or to stop one before it was deployed.
To do this means having a Prime Minister, Ministers or decision-makers who do not just get seduced by the latest, greatest technology that is being offered. Today this would probably be autonomous weaponry.
Paul Barratt: What you say is true, but the dilemma facing a government would be: if these are not outlawed internationally and other countries are getting them, are we forced to respond? The nuclear non-proliferation treaty was drawn up to avoid that kind of situation. As these things spread, other countries feel obliged to equip themselves with similar weapons as a deterrent. The best option that I can see for an Australian Government is to campaign very vigorously for these weapons to be outlawed. That would, no doubt, cause friction with our allies in the United States, who, by the way, have not joined the cluster munitions treaty. The United States refuses to have anything outlawed as it applies to the United States. But nevertheless, I think we should be campaigning to have these weapons outlawed, and hence not equipping ourselves with them.
John Rodsted: In the case of the use of lethal autonomous weapons, what would you imagine some of the scenarios of a failed strike could look like? If someone deployed these, where could that go wrong?
Paul Barratt: It could go wrong in almost any conceivable way. There is a possibility that the algorithm goes wrong and you attack people who ought not to be attacked, including friendly forces. There is also the chance of the weapon being hacked and turned back on Australian troops. Or it may just fail to complete the task. It is tempting to proceed on the assumption that this complex and sophisticated piece of equipment will work perfectly, but imperfection could lead to all sorts of failures, including damage to your own side.
John Rodsted: That raises the question of arms manufacturers who may love to have the manufacturing of lethal autonomous weapons legal because it will provide them with a continual stream of investment, especially if every year the technology needs to be upgraded or replaced. From a share market perspective and from a corporate perspective, that would be quite attractive. But it would not be terribly attractive on the ground.
Paul Barratt: No, and I don't think either our national or the international approach to weapon systems ought to be driven by the interests of the arms manufacturers. I think we should put the national and public interest first and ensure the interests of arms manufacturers are subordinate to that.
Universities & AI
John Rodsted: You originally studied physics at university and universities are always looking to solve technological problems. Part of the greatness of universities is that they encourage brilliant young minds to solve problems and create a function out of the ether, really extraordinary stuff. Should the universities be looking at limiting what they do with lethal autonomous weapons or at least with the various platforms that would be employed in this technology?
Paul Barratt: I think so. We don't expect our universities to be doing research on biological or chemical weapons, except possibly for strictly defensive purposes. I can see a role for universities in examining how to defend your country against these things. People in the discipline of arms control in universities need to be thinking about how to establish an effective regime that outlaws such weapons. But to have our universities go into developing these weapons, or even some aspect of them, would, I think, be a very bad idea.
John Rodsted: There really are two spaces. One is about technological development, producing and testing software and then working out the appropriate platforms. The other is the ethical side. The ethical investment should override technological investment.
Paul Barratt: Yes, greater effort should be directed to the ethical side of this issue, not just the technical side. The technical work should go only so far as we need to understand the technology in order to defend ourselves against it.
John Rodsted: Do you think AI weapons are a step too far, or is there a space somewhere within the defence landscape for them?
Paul Barratt: I think they are a step too far. When it comes to killing people, you need to have people not only in theoretical control but in effective control, and accountable for the decisions they make.
John Rodsted: Can we trust the Prime Minister, past, present or future, to make the right call on going to war? Should they have had that power in the past, and should they have it in the future? Is that a trustworthy process, or should there be something else?
Paul Barratt: We have seen the Prime Minister of the day make the wrong decision in Vietnam, Afghanistan and Iraq. We also saw Tony Abbott decide to extend our operations in northern Iraq, against ISIL, into Syria. He contemplated putting a battalion into Ukraine, for God's sake, to secure the site of the crashed aircraft. I don't think you can rely on Prime Ministerial decision-making at all. I should mention that Malcolm Fraser, while he was alive, was the patron of our organization. He argued that a Prime Minister would always get his way in Cabinet if there was something he really wanted. It is too easy for a small group like Cabinet to get so involved that they do not think the issue right through. It can be: 'We've had a busy morning and it's lunchtime. Let's make this decision and get out of here.' Or they simply listen to what the Prime Minister has to say and reply, 'Yes, Prime Minister, that's fine'. They do not really unpick it. No! I would not trust any Prime Minister to make the right call.
John really appreciated Paul's perspectives on decisions to go to war and on lethal autonomous weapons. Policy in these areas must move forward, with greater regulation established. This interview is part of SafeGround's Stay in Command podcast series, available online.
If you would like to know more about Paul Barratt's work with Australians for War Powers Reform, please visit their website: www.warpowersreform.org.au