House debates

Tuesday, 2 July 2024

Bills

Criminal Code Amendment (Deepfake Sexual Material) Bill 2024; Second Reading

1:24 pm

Nola Marino (Forrest, Liberal Party, Shadow Assistant Minister for Education)

As has been said, the coalition supports the intentions behind the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024, as would, I think, every member of the House as we deal with some of the multitude of risks that we see online.

There is no doubt that there will be serious and damaging consequences for people through AI-generated sexual material; it knows almost no bounds in this space. The proliferation of artificial intelligence creates extraordinary possibilities. It is, in some ways, so useful, but it comes with enormous risks, particularly when we're talking about young people. The risk we are dealing with here arises from the intersection of AI with its tremendous capability to generate material that appears to be real, and that brings an extraordinary capacity to inflict real harm by distributing sexual material online.

I've delivered hundreds of cybersafety presentations to young school children, and teenagers as well. I've listened to what they've already had to deal with, with their faces—without the AI piece, just their faces—being replaced on other bodies and used in a sexual way. The harm that has caused them and the distress it has brought to them and their families was done even without AI. So in this space, the non-consensual sharing of deepfakes and sexual content is extraordinarily dangerous in my view. It's dangerous to the individual, and dangerous more broadly.

We know that this bill will replace some of the existing laws. We certainly took these laws very seriously when in government. Technology has constantly changed and will keep changing. There was a real need for our Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Act 2018. We really needed that very effective civil take-down capability. One of the most common things for parents who ring my office—because they know of my work in this space—is that they want material removed very quickly. That's the first thing: they don't want it shared, they don't want it to go viral and they're panicking. Often, if it's a young child, that's their greatest worry—and the risk of that child being bullied as a result of that content being online, or being out there when they don't know about it.

We've seen a lot of work in this space previously, but I am concerned about the current bill and the removal of the current definition of 'consent'. Currently, the law says explicitly that consent means 'free and voluntary agreement'. But the government has actually removed that definition. There are just a few issues with that—given the real focus on the need to keep people safe and what will happen in the AI-generated world. How will the courts adjudicate the consent rule if it's not clearly defined in the legislation? That's what we're going to need, and that's what those who seek to use these laws will need: very clear definitions. The courts will need those as well.

The coalition has also announced a real plan to lift the minimum age for kids accessing social media from 13 to 16. It is a top priority for us and it's very important that this is the case. Members have recently heard Meta giving evidence at hearings, where there was a claim made that there's no damage being done to young people on their platforms or in this social media space. Well, from my years of doing this, let me tell you that I have seen a significant deterioration in the mental and emotional health and wellbeing of young people because of what's happening online and on social media. So I don't swallow that for a minute. Every one of those social media platforms creates a harmful place for young people—and for adults as well—and certainly could in the deepfake non-consensual sharing of sexual images.

Ever since the inception of online social media platforms, the creators and executives in charge of these platforms decided—apparently, because of what we've seen since—that it was perfectly fine to expose our very young children to what I see as, and what really is, a free-for-all in an online paedophiles' paradise. That's what this is. That's what they have done. Who would have thought there'd be a platform that allowed our children to be groomed online by sexual predators and exposed to extreme and violent pornography—because that's what's happening—and to be exposed randomly to billions of people of all ages on the internet? But that's what the platforms allow. It is part of these platforms' business model and is what is available to very vulnerable young people—
