SafeGround’s Pacific team have been exploring instances of artificial intelligence (AI) used for good across Pacific Island nations and found some interesting and downright amazing stuff!
FishFace – Facial Recognition
In 2016–17 the Nature Conservancy, along with fishermen in Palau such as Kalei Luii from the Division of Oceanic Fisheries Management, began using cameras and GPS systems to document fish catches, in an effort to combat the overfishing that was harming marine ecosystems and Pacific fishing communities.
The process initially meant that a human being had to go through the footage, taking hours and hours to identify fish as they went by on the screen. Why should a human use up so much of their time on a task that we can teach AI to do?
That’s what the Nature Conservancy thought when it launched a competition with a start-up called Kaggle, whose platform hosts “a wide network of data scientists who partake in machine learning competitions”. Entrants used pre-existing data to build a system, similar to facial recognition software, that can detect and classify different types of fish and log their numbers. Anne Kwok (2019) highlights in an article titled ‘AI empowers conservation biology’ that “For underfunded conservation scientists, AI provides an attractive alternative to manually processing huge troves of data, such as camera-trap images or audio recordings.”
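The winning models from that competition aren’t reproduced here, but the core idea — learning to assign a species label to each detected fish from labelled examples, then tallying the counts — can be illustrated with a toy nearest-centroid classifier. Everything in this sketch is invented for illustration (the feature numbers, the species names, the three-value “features” standing in for an image); real entries used deep neural networks on raw video frames.

```python
# Toy sketch of supervised species classification and counting.
# Hypothetical data: each detected fish is reduced to a 3-number
# feature vector (say length, fin ratio, hue) instead of an image.

from math import dist  # Euclidean distance (Python 3.8+)

# Hypothetical labelled training examples: species -> feature vectors.
TRAINING = {
    "yellowfin tuna": [(1.8, 0.30, 0.60), (1.9, 0.28, 0.65)],
    "mahi-mahi":      [(1.2, 0.45, 0.90), (1.1, 0.50, 0.85)],
}

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

# One "prototype" per species, learned from the training data.
CENTROIDS = {species: centroid(vs) for species, vs in TRAINING.items()}

def classify(features):
    """Return the species whose centroid is nearest to `features`."""
    return min(CENTROIDS, key=lambda s: dist(features, CENTROIDS[s]))

# Tally species across a (hypothetical) stream of detected fish.
tally = {}
for frame_features in [(1.85, 0.29, 0.62), (1.15, 0.48, 0.88)]:
    species = classify(frame_features)
    tally[species] = tally.get(species, 0) + 1

print(tally)  # -> {'yellowfin tuna': 1, 'mahi-mahi': 1}
```

The point of the sketch is the workflow, not the model: once classification is automated, the hours a human would spend watching footage collapse into a per-species tally computed frame by frame.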
While this technology is really exciting in terms of what it can do for the local environment and fish stocks, facial recognition software can come with a whole host of problems. For example, its algorithms are subject to racial and gender bias, which means that vulnerable people are disproportionately affected.
Problems in this area could easily arise amongst Pacific communities, as this type of AI “favours light-skinned, outwardly masculine faces over dark-skinned, outwardly feminine faces” (Stop Killer Robots, 2021). This matters in warfare, because international humanitarian law requires soldiers to distinguish between civilians and combatants. AI with racial bias may leave autonomous weapons unable to classify people with certain facial features or darker skin, meaning they could target the wrong people and, as a result, carry out an unlawful killing.
SafeGround is calling on Pacific Islanders to share lived experiences and perspectives on AI, to open space for discussion of killer robots, and to emphasise the need for a treaty to ban them. We share stories of how AI is benefiting Pacific communities, while at the same time building awareness of possible future implications, such as the dual use of AI in autonomous weapons.