Senate debates

Monday, 19 August 2024

Bills

Criminal Code Amendment (Deepfake Sexual Material) Bill 2024; Second Reading

7:16 pm

Larissa Waters (Queensland, Australian Greens)

I rise to speak on the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024. The Greens welcome this bill, which responds to the online harm caused by deepfakes and other artificially generated sexual material. However, it does nothing to stop the creation of those images in the first place. It's all well and good for legislation to be updated to make sharing deepfake or AI image based abuse a crime, but it would be better if the images weren't so easy for anyone with internet access to create and then share. Dr Rachael Burgin of RASARA, the Rape and Sexual Assault Research and Advocacy organisation, said during the Legal and Constitutional Affairs Legislation Committee inquiry into the bill that the creation of deepfake images:

… whether or not they are distributed, is a significant harm and, commonly, threats to circulate … deepfakes, is a tactic used by an abuser to instil fear and exert control and power over another person.

A failure to criminalise the creation and the threat of the creation of this type of material accepts a culture of gendered violence.

This bill doesn't come close to addressing how image based abuse is contributing to gendered violence. Image based abuse is deeply gendered. Women, girls and gender diverse people are most commonly targeted and, much like revenge porn, this abuse is overwhelmingly used by men as a tool to harass and degrade. In fact, 99 per cent of victims of deepfake pornography are women, according to a 2023 study by Security Hero. Deepfake material may be fake, but the impacts are very, very real. We heard heartbreaking evidence during the inquiry into this bill about the harm deepfake abuse inflicts on individuals. Noelle Martin, a victim-survivor of deepfake abuse, said:

… it is life-destroying and shattering. It is something that impacts every aspect of a person's life. It impacts their being able to find work and their employability to being able to manage and control their reputation. It impacts a person's dignity, autonomy and agency. It compromises and threatens people's capacity to self-determine, to flourish and to exercise their sexual agency.

I completely agree with her remarks. The federal government can and must do more to stop the harm of deepfake abuse by stopping the creation of this material in the first place.

During the inquiry into this bill, the federal government said that the Commonwealth can't stop the creation of deepfake material and that it's the states and territories who would need to do any criminalising under their own state based criminal codes. I note that they mentioned that it was on the Standing Council of Attorneys-General agenda, which I welcome, but they insisted that the Commonwealth could not act. I beg to differ. The Convention on the Elimination of All Forms of Discrimination against Women is an international treaty that Australia has ratified. In my view, as a lawyer—and I'm sure many minds will have views on this—there's a clear case that the federal government could use the external affairs power, on the basis that we have ratified that treaty, to enact laws that stop the creation, as well as the associated threat of creation, of deepfake sexual material. So I'd like to foreshadow a second reading amendment in my name to that effect, and I'll move it properly towards the end of my contribution.

It's pretty clear, in my mind, that the Commonwealth could legislate to stop the creation of this deepfake abuse if they so chose; there is no constitutional barrier to them doing so. I think it's unfortunate that they waved that suggestion away and instead said that it's up to somebody else to tackle what is actually the real problem, leaving it to the states and territories. I hope the states and territories do prioritise that work, do come on board and attend to fixing their criminal codes in that respect. Let's hope they take a harmonious approach; otherwise, we'll have a patchwork of approaches where the Commonwealth could simply have stepped in to criminalise the creation of that material as well as the mere dissemination of it. Let's stop playing catch-up with technology and, instead, show leadership in preventing this harm from occurring in the first place.

With all of that said, we know that criminalisation is not the sole way to prevent image based abuse from occurring. Education and primary prevention are much needed to drive cultural change in this regard. That's why the Greens are calling on the Albanese government to include education on the harms of deepfake sexual material and on the weaponisation of artificial intelligence in the consent education and respectful relationships education currently being rolled out, albeit patchily and without enough funding, in our secondary school system.

It's also imperative that frontline service providers who specialise in both preventing and responding to this kind of abuse receive the funding that they need. These crimes can impact a number of victim-survivors. Ultimately, those people will seek assistance and support through specialised sexual violence services, which, at the moment, are underfunded and are having to turn people away or condemn them to obscenely long waitlists.

Sexual violence support services are even more underfunded than family and domestic violence frontline support services. The wait times for support after experiencing sexual violence, for counselling, for justice and for healing, are unacceptable and will compound the harm that is done. We need those frontline sexual violence service providers and support organisations fully funded so that people do not face waitlists that are months and months long before they can start that process of healing.

Explicit deepfakes are becoming prolific and, with widespread and easy access to artificial intelligence apps on every single smartphone, I might note that deepfakes are already wreaking havoc on political communications and on election outcomes. Deepfakes present inherent risks, including the spread of misinformation, threats to privacy and security, the altering of election outcomes and the misleading and deceiving of voters. I note that my colleague Senator Shoebridge will have a second reading amendment to address that issue, and he also has a private senator's bill. I urge the Albanese government to take immediate action to extensively ban the creation, as well as the transmission, of deepfakes to address the harm that deepfakes present to our democracy and to address our culture of gendered violence.

Just to recap the existing rules, despite the rather lengthy contribution from Senator Cash, it's pretty clear to those of us who sat on the committee that deepfakes generated by AI were not covered by existing laws, and it's important that they be covered. This bill would do that, but it would only address the distribution of deepfakes. It wouldn't address the creation. We actually need to stop them from being created in the first place. We took some incredibly disturbing evidence that some of these apps—Nudify was one example that was often referred to—are not only freely available for download on any smartphone but have been categorised as acceptable for age 4-plus. My jaw hit the floor when I heard that evidence. There is something deeply wrong when the Commonwealth says they can't fix that. I don't accept that they can't fix it. I wish that this bill were going to fix the problem. I think it's a good step forward, but we need to take that next step and make sure that those apps can't run riot, providing material that is so deeply unhelpful, so traumatising and so easily turned into a tool of violence, predominantly against women. We need the government to do more in this space. They have the power to legislate, and they should. The videos might be fake, but the harm is real.

As I said earlier, 99 per cent of victims are women. Overwhelmingly, this sort of abuse is used by men as a tool of violence, as a tool of power and as a tool of control. We desperately need that education and that primary prevention. It's not just a change to our criminal laws that's needed; we need to make sure there is a culture change and that a message is sent, particularly to young people, that this is not an appropriate way to relate to your peers or to other people whom you might be subjecting to this AI generated abuse.

With that said, I note that it's time for this government to put the rights of women and girls ahead of the profits of big tech. I move:

At the end of the motion, add ", but the Senate notes that Australia, having ratified the Convention on the Elimination of All Forms of Discrimination against Women, has an international obligation to make the creation of, and associated threat of the creation of, deepfake sexual material an offence."
