House debates
Tuesday, 23 November 2021
Constituency Statements
SafeGround
4:24 pm
Maria Vamvakinou (Calwell, Australian Labor Party)
Some time ago I had the opportunity to meet with SafeGround, an Australian NGO that works to minimise the impact of weapons of war, when they came here to parliament to discuss with us the issue of lethal autonomous weapon systems in the lead-up to the global diplomatic discussions that will reach a critical juncture next month at the United Nations in Geneva.
Without doubt, artificial intelligence technology has great potential to bring benefits to society across a range of applications. These technological advances, however, must be developed and implemented with appropriate oversight and regulation. The global proliferation of increasingly autonomous weapons is occurring in the absence of new international regulations that address the challenges posed by these emerging weapons systems. Rapid advancements in technological systems compel us not only to re-engage with, but also to advance, fundamental notions of human rights well into the future. You might call this the future-proofing of our human rights. The fundamental and underlying principle is this: international humanitarian law must continue to apply fully to all weapons systems. Under this principle, like-minded countries have raised concerns about the ethical, legal, operational and technological issues associated with lethal autonomous weapon systems, particularly where human agency is removed from decision-making on issues of life and death.
Germany and France have highlighted the indispensable need to maintain meaningful human control over new weapons technologies. Closer to home, New Zealand has emphasised the legal, ethical and human rights challenges posed by the development and use of lethal autonomous weapon systems. In our own region, India has, in some respects, expanded on the issue of meaningful human control by cautioning against the potential for states to legitimise these weapons systems through broad or vague interpretations of what the term 'meaningful human control' actually means. This is particularly important because such an awareness goes to the heart of accountability, not only for the consequences of these weapons but also for their potential to lower the social and political thresholds for decisions to engage in war.
I know some of my colleagues are equally concerned about this issue, especially the need for Australia to support meaningful and legally binding international instruments with respect to lethal autonomous weapon systems. This includes the need for Australia to support the establishment of legally binding rules regarding specific prohibitions and limits on autonomous weapons, as well as support for the implementation of a normative and operational framework on autonomous weapons within the instruments of the United Nations. Australia should prepare and articulate a national policy that strictly regulates and defines human control and responsibility with respect to such weapons. I want to thank SafeGround for the incredible advocacy and work that they do in the national interest.