House debates
Tuesday, 2 July 2024
Bills
Criminal Code Amendment (Deepfake Sexual Material) Bill 2024; Second Reading
1:13 pm
Julian Hill (Bruce, Australian Labor Party)
I said when I finished last night that I would be back soon with a different tie—and so it has come to pass! I'll perhaps precis my remarks of last night, given that they were very short. You can do a lot in two minutes, but not do justice to this bill. This new criminal law will ban the non-consensual sharing of deepfake pornography. It's necessary as a consequence of the time we live in and the acceleration of technology, particularly artificial intelligence technologies. This new law will make it a criminal offence to share deepfake porn of someone without their consent. It does seem bizarre that such a law should be necessary at all, if any sense of common decency could be relied upon. But here we are.
Digitally created and altered sexually explicit material is a damaging form of abuse and is accelerating. It's most often used against women and girls, but it can be weaponised against anyone. There's been an explosion and a proliferation of artificial intelligence and online technologies, which has made this bill urgent. We are told—and, just to reassure you, Deputy Speaker Vasta, and the House, this is not something I have direct familiarity with—that anyone who is reasonably digitally literate, anyone with a modicum of technical ability, can now undertake these activities. It is a high-priority reform for the government and is part of a suite of reforms to tackle online harms, and I'm glad to see ongoing, increasingly urgent action.
For all the many benefits of technology—we can name them and talk about them at length—there are also, apparently, many negatives, such as scams and deepfakes, and there is growing recognition of the mixed benefits of social media and the harms that are occurring. Sexually explicit deepfakes, shared without the consent of the people depicted, are being used to degrade and dehumanise other Australians—especially to target women, to perpetuate harmful gender stereotypes and, so the experts tell us, to drive gender-based violence; hence action is needed.
There are a number of questions. I posted something on various social media platforms a few weeks ago when we announced this, and on the rare occasion you get a couple of minutes to have a look at the responses, there are always a bunch of silly comments and conspiracy theories—such is the world we live in—but there are also some genuine questions being posed by people who have an interest in this and are trying to work out what it means. There is a set of questions about what deepfakes are, and it's worth recording this: deepfakes, as this bill provides, are images, videos or sound files, often generated using AI, of a real person that have been edited to create realistic but false depictions of them doing or saying something they did not actually say or do. Deepfake technology can create, as we're seeing across the world, entirely new content or manipulate existing content using a large number of photos or recordings of a person. The risks of abuse are growing as the tools to create deepfakes become more accessible. That's the general definition of deepfakes. But this bill doesn't cover all deepfakes; it just covers sexually explicit deepfakes.
Another question, which people with various different intents may ask and have asked, perhaps peculiarly, is: will this bill cover sexually explicit deepfakes of famous people or just of people they know? The answer to that is: it will cover both. The new offences cover where a person shares a sexually explicit deepfake of a real person either knowing the person has not consented or being reckless as to whether they have consented or not. This is regardless of who the real person is and whether—this is really important—they're personally known to the offender, to the person sharing this deepfake material.
In a number of comments I saw, people were asking, 'Will this cover electoral fraud or other kinds of scams?' The answer is no. This bill is particularly and solely focused on the non-consensual sharing of sexually explicit deepfakes online. It will only apply if the deepfakes are sexually explicit. There is of course a set of broader and legitimate concerns in our democracy about the use of deepfake technology, and we're seeing them play out in various elections around the world now—and no doubt it will be an issue for us in the next set of elections we face in this country. It's not just politicians; those who want to undermine trust in our institutions can deploy this technology across all spheres of our society. But this is not what this bill is focused on; it is focused on sexually explicit deepfakes that are doing so much harm, particularly to young women and girls.
There is then the question: does this bill cover the mere creation of deepfakes? The answer to that, which some may be a little confused by, is no. It's not actually the creation; it's the non-consensual sharing that this bill criminalises. The offences in the bill apply to the sharing of non-consensual deepfake sexually explicit material online. Where the person also created the deepfake that is shared without consent, there's an aggravated offence which carries a higher penalty of up to seven years' imprisonment. So an aggravated offence is there; if you create and share without consent, there's a higher penalty. But, arguably, just the creation alone is not something that the Commonwealth can criminalise. It's important to ensure that the new offences are legally robust and that they have a sound basis in the constitutional powers of the federal parliament. The Commonwealth's power to legislate does not extend to the mere creation of sexually explicit adult deepfakes; presumably, that's something the states and territories would have to deal with. There are state and territory laws that may well apply to the creation of deepfakes in different ways in the different states and territories, but the Commonwealth does have the constitutional power to deal with the sharing of non-consensual material over telecommunications networks and so on.
The Criminal Code currently criminalises the sharing of private sexual material online. But the definition of 'private sexual material' is potentially limiting and doesn't explicitly extend to artificially generated material, such as deepfakes—hence the legal grey zone, which this bill addresses in absolutely clear terms. The bill proposes to repeal the existing offence related to private sexual material and replace it with a new offence that applies where a person transmits the material using a carriage service—one of those traditional Commonwealth heads of power—and the material depicts: a person who is, or who appears to be, 18 years of age or older; a person engaging in a sexual pose or sexual activity; or sexual organs or—if we're going into great detail—the anal region of a person or a female's breast; and where the person knows that the person depicted does not consent to the transmission of the material, or is reckless as to whether the other person consents.
I touched on one of these items before, but the bill introduces two aggravated offences, which apply increased penalties in two circumstances: firstly, where, before the commission of the underlying offence, three or more civil penalty orders were made against the person for contraventions of relevant provisions of the Online Safety Act 2021; and, secondly, where the person was responsible for the creation or alteration of the sexual material transmitted. The government argues, and I hope that every member of the House agrees, that these amendments are essential to ensure that offences can apply both to real material, such as unaltered images and recordings, and also to fake or doctored material that has been created or altered using technology such as deepfakes. The new offences, just to be clear again, will not cover private communications between consenting adults or interfere with private sexual relationships. We don't seek to criminalise consenting human sexuality with this; it's the non-consensual sharing of deepfake material that's of great concern to the Australian people and to the parliament—and particularly to parents, who are grappling with the impacts of this technology and its use, abuse and misuse in relation to teenagers. The offences will only apply to material depicting, or appearing to depict, adults. The Criminal Code continues to criminalise the use of carriage services for child abuse material, including child abuse material generated by AI. But of course there are various definitional issues, so this complements the existing laws in relation to child abuse material.
I'm very happy to keep talking but I might check at this point, given we have six minutes to go, if we have another speaker? We do. Great—I always love hearing from the member for Forrest!