House debates
Tuesday, 2 July 2024
Bills
Criminal Code Amendment (Deepfake Sexual Material) Bill 2024; Second Reading
1:13 pm
Julian Hill (Bruce, Australian Labor Party)
I said when I finished last night that I would be back soon with a different tie—and so it has come to pass! I'll perhaps precis my remarks of last night, given that they were very short. You can do a lot in two minutes, but not do justice to this bill. This new criminal law will ban non-consensual sharing of deepfake pornography. It's necessary as a consequence of the time we live in and the acceleration of technology, particularly artificial intelligence technologies. This new law will make it a criminal offence to share deepfake porn of someone without their consent. It does seem bizarre that such a law should even be necessary, if any sense of common decency could be relied upon. But here we are.
Digitally created and altered sexually explicit material is a damaging form of abuse and is accelerating. It's most often used against women and girls, but it can be weaponised against anyone. There's been an explosion and a proliferation of artificial intelligence and online technologies, which has made this bill urgent. Just to reassure you, Deputy Speaker Vasta, and the House, this is not something I have direct familiarity with, but we are told that anyone who is reasonably digitally literate, anyone with a modicum of technical ability, can now undertake these activities. It is a high-priority reform for the government and is part of a suite of reforms to tackle online harms, and I'm glad to see ongoing, increasingly urgent action.
For all the many benefits of technology—we can name them and talk about them at length—there are also, apparently, many negatives, such as scams, deepfakes and the growing recognition of the mixed benefits of social media and the harms that are occurring. Sexually explicit deepfakes, shared without the consent of the people depicted, are being used to degrade and dehumanise other Australians—especially to target women, to perpetuate harmful gender stereotypes and, so the experts tell us, to drive gender-based violence; hence action is needed.
There are a number of questions. I posted something on various social media platforms a few weeks ago when we announced this, and on the rare occasion you get a couple of minutes to have a look at the comments—there are always a bunch of silly comments and conspiracy theories, such is the world we live in—but there are also some genuine questions being posed by people who have an interest in this and are trying to work out what it means. There is a set of questions about what deepfakes are, and it's worth recording this: deepfakes, as this bill provides, are images, videos or sound files, often generated using AI, of a real person that have been edited to create realistic but false depictions of them doing or saying something they did not actually say or do. Deepfakes can create, as we're seeing across the world, entirely new content or manipulate existing content using a large number of photos or recordings of a person. The risks of abuse are growing as the tools to create deepfakes become more accessible. That's the general definition of deepfakes. But this bill doesn't cover all deepfakes; it just covers sexually explicit deepfakes.
Another question, which people with various different intents may ask and have asked, perhaps peculiarly, is: will this bill cover sexually explicit deepfakes of famous people or just of people they know? The answer to that is: it will cover both. The new offences cover where a person shares a sexually explicit deepfake of a real person, either knowing the person has not consented or being reckless as to whether they have consented or not. This is regardless of who the real person is and whether—this is really important—they're personally known to the offender, to the person sharing this deepfake material.
In a number of comments I saw, people were asking, 'Will this cover electoral fraud or other kinds of scams?' The answer is no. This bill is particularly and solely focused on the non-consensual sharing of sexually explicit deepfakes online. It will only apply if the deepfakes are sexually explicit. There is of course a set of broader and legitimate concerns in our democracy about the use of deepfake technology, and we're seeing them play out in various elections around the world now—and no doubt it will be an issue for us in the next set of elections we face in this country. It's not just politicians; those who want to undermine trust in our institutions can deploy this technology across all spheres of our society. But this is not what this bill is focused on; it is focused on sexually explicit deepfakes that are doing so much harm, particularly to young women and girls.
There is then the question: does this bill cover the mere creation of deepfakes? The answer to that, which some may be a little confused by, is no. It's not the creation but the non-consensual sharing that this bill criminalises. The offences in the bill apply to the sharing of non-consensual deepfake sexually explicit material online. Where the person also created the deepfake that is shared without consent, there's an aggravated offence which carries a higher penalty of up to seven years' imprisonment. So an aggravated offence is there; if you create and share without consent, there's a higher penalty. But, arguably, the creation alone is not something that the Commonwealth can criminalise. It's important to ensure that the new offences are legally robust and that they have a sound basis in the constitutional powers of the federal parliament. The Commonwealth's power to legislate does not extend to the mere creation of sexually explicit adult deepfakes; presumably, that's something the states and territories would have to deal with. There are state and territory laws that may well apply to the creation of deepfakes in different ways in the different states and territories, but the Commonwealth does have the constitutional power to deal with the sharing of non-consensual material over telecommunications networks and so on.
The Criminal Code currently criminalises the sharing of private sexual material online. But the definition of 'private sexual material' is potentially limiting and doesn't explicitly extend to artificially generated material, such as deepfakes; hence the legal grey zone that this bill resolves. The bill proposes to repeal the existing offence related to private sexual material and replace it with a new offence that applies where a person transmits the material using a carriage service—one of those traditional Commonwealth heads of power—and the material depicts: a person who is, or who appears to be, 18 years of age or older; a person engaging in a sexual pose or sexual activity; or sexual organs or, if we're going into great detail, the anal region of a person or a female's breast; and the person knows that the person depicted does not consent to the transmission of the material, or is reckless as to whether the other person consents.
I touched on one of these items before, but the bill introduces two aggravated offences which apply increased penalties in two circumstances: firstly, where, before the commission of the underlying offence, three or more civil penalty orders were made against the person for contraventions of relevant provisions of the Online Safety Act 2021; and, secondly, where the person was responsible for the creation or alteration of the sexual material transmitted. The government argues, and I hope that every member of the House agrees, that these amendments are essential to ensure that the offences can apply both to real material, such as unaltered images and recordings, and to fake or doctored material that has been created or altered using technology such as deepfakes. The new offences, just to be clear again, will not cover private communications between consenting adults or interfere with private sexual relationships. We don't seek to criminalise consenting adult sexuality with this; it's the non-consensual sharing of deepfake material that's of great concern to the Australian people and to the parliament—and particularly to parents, who are grappling with the impacts of this technology and its use, abuse and misuse in relation to teenagers. The offences will only apply to material depicting, or appearing to depict, adults. The Criminal Code continues to criminalise the use of carriage services for child abuse material, including child abuse material generated by AI. There are of course various definitional issues, so this complements the existing laws in relation to child abuse material.
I'm very happy to keep talking but I might check at this point, given we have six minutes to go, if we have another speaker? We do. Great—I always love hearing from the member for Forrest!
1:24 pm
Nola Marino (Forrest, Liberal Party, Shadow Assistant Minister for Education)
As has been said, the coalition supports the intentions behind the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024, as would, I think, every member of the House as we deal with some of the multitude of risks that we see online.
There is no doubt that there will be serious and damaging consequences for people through AI-generated sexual material; it knows almost no bounds in this space. The proliferation of artificial intelligence creates extraordinary possibilities. It is, in some ways, so useful, but it comes with enormous risks, particularly when we're talking about young people. The risk we are dealing with here arises from that intersection of AI, with its tremendous capability to generate material that actually appears to be real, and the extraordinary capacity that this brings to inflict real harm by distributing sexual material online.
I've delivered hundreds of cybersafety presentations to young school children, and teenagers as well. I've listened to what they've already had to deal with, with their faces—without the AI piece, just their faces—being placed on other bodies and used in a sexual way. The harm that caused them and the distress it brought to them and their families was done even without AI. So in this space, the non-consensual sharing of deepfake sexual content is, in my view, extraordinarily dangerous. It's dangerous to the individual, and dangerous more broadly.
We know that this bill will replace some of the existing laws. We certainly took these laws very seriously when in government. Technology has constantly changed and will keep changing. There was a real need for our Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Act 2018. We really needed that very effective civil take-down capability. One of the most common requests from parents who ring my office—because they know of my work in this space—is that they want material removed very quickly. That's the first thing: they don't want it shared, they don't want it to go viral and they're panicking. Often, if it's a young child, that's their greatest worry—that, and the risk of that child being bullied as a result of that content being online, or being out there when they don't know about it.
We've seen a lot of work in this space previously, but I am concerned about the current bill and the removal of the current definition of 'consent'. Currently, the law says explicitly that consent means 'free and voluntary agreement', but the government has actually removed that definition. There are a few issues with that, given the real focus on the need to keep people safe and what will happen in the AI-generated world. How will the courts adjudicate the consent rule if it's not clearly defined in the legislation? That's what we're going to need, and that's what those who seek to use these laws will need: very clear definitions. The courts will need those as well.
The coalition have also announced a real plan to lift the minimum age for kids accessing social media from 13 to 16. It is a top priority for us, and it's very important that this is the case. Members have recently heard Meta giving evidence at hearings, where the claim was made that there's no damage being done to young people on their platforms or in this social media space. Well, from my years of doing this, let me tell you that I have seen a significant deterioration in the mental and emotional health and wellbeing of young people because of what's happening online and on social media. So I don't swallow that for a minute. Every one of those social media platforms creates a harmful place for young people—and even for adults—and certainly could through the non-consensual sharing of deepfake sexual images. Ever since the inception of online social media platforms, the creators and executives in charge of these platforms decided—apparently, because of what we've seen since—that it was perfectly fine to expose our very young children to what I see as, and it really is, a free-for-all in an online paedophiles' paradise. That's what this is. That's what they have done. Who would have thought there'd be a platform that allowed our children to be groomed online by sexual predators, exposed to extreme and violent pornography—because that's what's happening—and exposed randomly to billions of people of all ages on the internet? But that's what the platforms allow. It is part of these platforms' business model and is what is available to very vulnerable young people—
Sharon Claydon (Newcastle, Australian Labor Party)
The debate is interrupted in accordance with standing order 43. The debate may be resumed at a later hour, and the member will have leave to continue speaking when the debate is resumed.