Senate debates

Monday, 19 August 2024

Bills

Criminal Code Amendment (Deepfake Sexual Material) Bill 2024; Second Reading

7:26 pm

Nita Green (Queensland, Australian Labor Party)

I'm really pleased to rise to speak on the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024. I chaired the inquiry into this bill, and I want to thank all the witnesses who came forward during the inquiry's hearing. This is a crucial piece of legislation that is going to change and protect lives. I'm going to talk about the bill and the evidence that we received throughout the hearing. I will also return to the previous contributions, because I think it is worth responding to claims from those opposite that this offence and this Criminal Code amendment should be watered down in any way, and to the Greens' position that we should somehow ignore the Constitution. I will come back to those issues.

The rise of deepfake AI represents a growing problem that we can no longer ignore. Technology is rapidly advancing, and it is our responsibility to ensure that our legal frameworks evolve with the new crimes of the digital age. The bill before us will set out Australia's stance on technology-facilitated abuse by imposing serious criminal penalties on those who share sexually explicit material without consent. These reforms will include material created using artificial intelligence or other technology. This bill could not arrive at a more crucial time. The eSafety Commissioner provided evidence to our committee that deepfake sexually explicit material has increased by 550 per cent year on year since 2019. Approximately 90 per cent of deepfake images are sexually explicit, and 95 per cent of that material targets women and girls. It is prolific, it's horrific, and it destroys lives.

The public hearings on this bill revealed research showing that, in October 2022, a new AI bot created and shared 680,000 fake nude pictures of women without their knowledge. These statistics expose a distressing truth for our country: deepfake and AI sexually explicit content is being weaponised against women. It is rooted in misogyny, and its presence is only going to expand as technology improves. Digitally created and altered sexually explicit material is a deeply damaging form of abuse used to degrade and humiliate its victims. It has the potential to cause lifelong harm, with this fabricated explicit material often indistinguishable from real life. It is also predicted that deepfakes will only become more advanced in the future, with technology able to swap people's heads, bodies and voices in photos, videos and images received on phones. This material impacts its victims in all areas of their lives, with survivors often left traumatised personally, professionally and emotionally. This is why the Albanese Labor government is strengthening the Criminal Code, imposing a penalty of six years imprisonment for those who share non-consensual deepfake sexually explicit material and an aggravated offence, with a higher penalty of seven years, for those who created the non-consensual deepfake. This new law is an extension of our government's commitment to ending family, domestic and sexual violence within a generation. We are keen to take these steps to protect women and girls, given that they make up 99 per cent of the victims of this harmful material.

As chair of the committee, I was privileged to hear from experts, stakeholders and victims on the impact that this type of material has on our community. It was clear from the very beginning of the hearing that this type of material is pervasive and unprecedented in its nature. The creation and sharing of non-consensual deepfakes has reached a phenomenal scale in terms of how many people can be targeted, how easy it is to target individuals and the sheer number of websites and apps that are dedicated to this type of abuse. Ms Noelle Martin, a global activist and Young Western Australian of the Year in 2019, spoke to these issues. Not only did she emphasise that billions of people have viewed the top 30 non-consensual deepfake websites; she also explained that this is not just an issue that affects celebrities and people in the public eye. This is an issue that continues to target everyday women and girls.

With more and more apps emerging that allow people to upload photos of women and churn out sexually explicit material, we are seeing lives destroyed overnight. Websites and apps of this nature are causing women to be bullied in their schools and their workplaces. It is destroying their ability to find work and compromising their employment. This form of abuse threatens people's ability to control their reputation and uphold their right to dignity, autonomy and agency. As one can imagine, those who have this digital material non-consensually created and distributed of themselves often experience an extreme decline in their physical and mental health. Everywhere these victims go, they are worried. They are worried that their family members, grandparents, coworkers and friends have seen the non-consensual material, and some are unable to make eye contact with others as they think, 'Have they seen it too?' Our hearing revealed stories of individuals who would have the images replay over and over in their nightmares, and stories of women experiencing an all-consuming feeling of dread or being afraid to leave their homes due to an overwhelming feeling of shame and fear.

We know that gender-based violence is all too common within Australia, with one in five women having experienced sexual violence since the age of 15. It manifests in various forms, whether physical, emotional, economic or social. But, with AI available to everyone at the click of a button, the scope, severity and type of violence against women is only going to grow if we do not act. If our country wants to address the culture of gendered violence, then it needs to be made clear that women and their bodies are not property to be non-consensually manipulated, distributed and ridiculed.

Now, of course, this work builds on the work our government is doing to improve our laws around consent and to ensure that we have the proper education and training for young people about what consent means. The Albanese government is providing $76 million to states and territories and the non-government school sector to deliver age-appropriate, expert-developed respectful relationships programs, and this funding goes hand in hand with legislation like this. This program ensures that we are facilitating healthy interactions between girls and boys. This action is also complemented by the Stop it at the Start campaign, which has been designed to combat misogynistic attitudes and influences online.

But I want to go to a very important point, which was raised by Senator Cash in her contribution. It is the notion from those opposite, which has been put many times—I think the shadow minister said it today, and we've seen other shadow ministers say it in the media—that we don't need this bill. It is clear that anyone who would say that today didn't listen to the victims who were at the hearing and hasn't been listening to the women who have been coming forward and talking about this for years. Those women said at the hearing that they spoke to the previous government and weren't listened to. Now they are being listened to, and this type of material is being criminalised.

Failing to amend the criminal framework to address new types of technology poses a real risk to our future. It also just ignores what the victims have been telling us. Those at the deepfake public hearing warned against a future without a bill such as this. Professor Asher Flynn said:

There is a real risk, without taking serious steps like criminalising deepfake abuse, that we will see backwards efforts in our attempts to address and prevent sexual harassment and abuse.

That's what I want those opposite to consider before they walk in here and talk through Senator Cash's talking points on this bill, because the evidence we received at the hearing does not support the position that Senator Cash put forward. We know this deepfake bill will ensure that the law sends a very clear message that this type of behaviour is not tolerated by our country or by our parliament, and that we recognise the harm of deepfake abuse. That's what this bill does.

This bill also recognises that we want to do as much as we can to create safe online environments for women and young people. It recognises that, as a government, we reject the idea that it is acceptable to dehumanise and degrade others by using technology. Saying that this bill is not needed is insulting to the victims who came forward, and it doesn't recognise the very real propositions put forward by the CDPP about the lack of legal protection for women when it comes to crimes like this. It's important that we set the standard; it's important that we give victims the legal right to pursue justice; and it's important that we don't mislead the Senate by suggesting that this type of bill or this type of law is not needed. I want to be very clear about that.

The Albanese Labor government is committed to making sure that our communities are safe for the current and the next generations of women—and that is what this bill is about. This is what protecting women looks like. It looks like delivering respectful relationships and consent programs across the country, to ensure that we're fostering healthy in-person and online relationships. It looks like criminalising deepfake abuse, to send a clear message that dehumanising and degrading behaviour is unacceptable in our nation. It looks like making sure that men and women feel protected in their schools, their offices, their homes and in online spaces, particularly if they experience deepfake sexual violence. This is what it looks like to protect women and girls: taking action like this and bringing forward legislation that will protect them.

I know that women are watching what this parliament does on this bill, because they are very fearful, if they have been the victims of this abuse, that they won't have legal justice. They understand that the problem is being propelled by the technology that is out there. So this is a crucial time to take action on this type of crime. The future of technology has a lot of unknowns—I think we all recognise that. But what we do know is that the use of deepfake and AI for the creation and sharing of sexual material is only going to become worse if we don't act now.

I really commend the Attorney-General for bringing forward this important piece of legislation. I also want to thank all of those who spoke at the public hearing on this bill. It really was clear from the experts, victims and members of the public who spoke that strengthening the law against deepfake sexually explicit material is an excellent step towards addressing gendered violence within our country. Those experts and victims and members of the public who spoke out during that hearing, and all of the people who've called for this type of legislation, are listening to this debate, and what they don't want to hear is some sort of word salad from those opposite about how this bill needs to be watered down. What they want to see is a parliament and a government who are willing to take action and step up and protect women and girls from what is incredibly corrosive behaviour and to set a community standard about what this parliament thinks is acceptable. That's what we're talking about today.

I really hope that the Senate sees through the attempts from those opposite to weaken this legislation, and I really hope the Senate sees that this is an urgent piece of legislation—that there are women and girls out there who need this protection right now and who will get it when we pass this legislation. We heard evidence that the law as it stands is not enough. It doesn't go far enough. It doesn't cover the field. That is quite clear from the evidence that we received. I know that the women who are listening want to see this action taken, because they understand the real threat that AI technology presents to them.

Finally, I want to say thank you not only to the Attorney-General but also to the members of our government for bringing this legislation forward as quickly as possible. The evidence received at the committee inquiry showed us how urgent it is, how important it is and the difference it will make in protecting women and girls, which this government—members on this side of the Senate—is committed to.
