Universities, Autonomy and Defence
Killer Robot Discussion for Adelaide-based students hosted at ThincLab, University of Adelaide
Fully Autonomous Weapons: What are they?
Autonomous weapons technologies that integrate artificial intelligence (AI) are playing a significant and increasing role in military systems, and they are emerging as pivotal technologies of future warfare. Alarmingly, we are on a path towards fully autonomous weapons, also called lethal autonomous weapons systems or killer robots. These lethal robots can operate on land, in the air or at sea, with the ability to select and engage targets using only their algorithms, independent of any meaningful human control. The development of killer robots would be destabilising not only to security but also to society’s humanity.
Intrigued? Check out our homepage and the global website for more information on these deadly weapons.
Universities are involved in research and development with defence on autonomous systems, so there is an urgent need for university students and their institutions to distinguish between acceptable and unacceptable uses of AI and to ensure they do not contribute to the development of fully autonomous weapons. Below you’ll find info on university involvement, our SURVEY for students and ways you can TAKE ACTION. Please share this webpage and our survey among your student friends and networks on social media to help us out! And follow @ban_kr_aus (Twitter and Insta) + Campaign to Stop Killer Robots – Australia on Facebook.
How are universities involved?
Universities have been essential in shaping society by training our future generations, and the knowledge they produce is crucial in driving innovation. Innovations we use as part of everyday life, such as seat belts and touch screens, come from university research. However, some universities are driving the development of autonomous systems through partnerships and collaborations with Australian defence. Without clear policies, your university could be contributing to killer robots.
Here are some universities with defence research in risk areas such as autonomy, robotics and relevant tech components. Browse and explore to your heart’s content with this map we created using Lil Sis. You’ll discover many links between universities and defence in project areas such as autonomy and AI.
With innovation comes responsibility – your university must ensure it does not contribute to harmful technological developments.
You have the power to make a positive impact.
Not all collaboration between Australian defence and universities is problematic. These partnerships have produced emerging technologies that are not controversial, such as autonomous take-off and landing.
However, it is crucial for you and your institution to be aware of how the technology you could develop during your degree might be used in the future.
Join us to stop killer robots by taking 2 mins to fill out this survey, and you could be in the running to win a killer robots fighting pack! The draw will be announced on Monday 31st August on social media, and the winner will be contacted via email.
What can your university do?
We’d love to see universities publicly commit to not contributing to the development of fully autonomous weapons.
Ideally, universities would establish clear policies and regulations stating that they will not allow their research, including that of their staff and students, to aid the development of killer robots.
To avoid collaborating on their development, universities could establish guidelines to assess the potential military use of research, or standing committees to evaluate projects. While we acknowledge that institutional structures vary, Japanese universities, for instance, have established successful screening procedures.
We need to start urging our universities to work towards a stance, and that push must be driven by students.
How is this relevant to you?
This is where you come in. Cutting-edge research done in degrees such as computer science and IT could be used in the development of these abhorrent weapons. Thousands of AI researchers, computer scientists, developers and others in tech have signed a Pledge opposing these weapons, including Elon Musk, Steve Wozniak and the late Stephen Hawking.
In universities, students are a voice that matters. Take a stand against the development of, or any potential contribution to, killer robots by your uni. Think about how your research is used now, or how it may be used in the future. Awareness of this issue is vital amongst academics and students, because defence partnerships and the end uses of new developments aren’t always transparent.
Institutions have a role to play in preventing fully autonomous weapons. It’s important to always be aware of what you are working on and the possible end uses. Urge your university to take a stand against killer robots, and play no part in their development.
What can you do?
While it may seem like making such changes will be too challenging, here are a few small but positive steps you can take at your university:
Write to your university about your concerns over its potential contribution to the development of fully autonomous weapons.
Ask your program coordinator about the lack of technology ethics courses in Computer Science and IT degrees, and push for such a course to become a core requirement and/or elective.
Host an event to open dialogue and allow students and staff to participate in discussion of this topic.
Check out our podcast…
featuring uni students from different degrees and areas of study talking about their views on all this, taking action, their concerns and their views on university involvement.