House debates

Tuesday, 25 June 2024

Bills

Criminal Code Amendment (Deepfake Sexual Material) Bill 2024; Second Reading

12:53 pm

Paul Fletcher (Bradfield, Liberal Party, Shadow Minister for Government Services and the Digital Economy):

I rise to speak on the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024. At the outset, I will make it clear that the coalition supports the intention behind this bill. This is a bill that seeks to deal with the harmful consequences of sexual material generated by artificial intelligence.

In many ways the proliferation of artificial intelligence creates tremendous possibilities, but it is a technology that comes with risks. It is our role to consider those risks and adapt our laws to deal with them. The risks we are dealing with here arise when the tremendous capabilities of artificial intelligence intersect with the capacity to inflict harm online. It is a difficult issue. We know that artificial intelligence can produce extraordinary material. Many of us will have seen the photorealistic images that artificial intelligence can create showing places that don't exist or events that never happened.

If I were to go online now and give this speech into an artificial intelligence voice generator, I could choose to have it delivered in the voice of Winston Churchill or Frank Sinatra. Some would say that would not be the highest and best use of that technology, but I could. But the technology can obviously be misused, and, in the online world, the potential adverse impacts are massively amplified, so it is appropriate that, from time to time, we revisit our laws to confirm whether they are fit for purpose. This bill repeals the existing offences, which make it an offence to transmit private sexual material with intent to menace, harass or cause offence. In their place, it establishes new offences for transmitting sexual material without consent, including material created or altered using technology. So the immediate question is: what is the problem that these laws fix? To understand that issue, it is worth spending a few moments looking at the current laws.

The current laws represent the outcome of many years' work. They were passed by the previous coalition government but, importantly, reflect the position taken in this parliament, which was supported by parties across the political spectrum. The core issue that the current laws deal with is the use of a carriage service to disseminate sexual material in a way that is intrusive and damaging. Whatever term is used—'non-consensual sharing of intimate images', 'image-based abuse', 'revenge porn' or 'transmission of private sexual material'—the common thread is that our laws should not permit carriage services to be used to distribute sexual material in a way that is unacceptable to the community. That is important because, when we turn to this deepfake sexual material bill, we see that it also addresses that same core issue.

This bill repeals existing laws and replaces them with new ones that largely do the same thing, and that raises these questions: how did we get here, what has changed and why are these new laws necessary? To turn to how we got here, the Commonwealth Criminal Code has, for some 20 years, included provisions that make it an offence to use a carriage service to menace someone, to harass them or to cause offence, and you can broadly interpret 'carriage service' as meaning the internet. These provisions are in division 474 of the Criminal Code and are broad offences, designed to apply in a wide range of circumstances. Nevertheless, as technology has changed, we have updated our laws.

In 2018, when the coalition was in government, the parliament passed the Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Bill. The bill was designed to address an obvious gap in the regulatory framework that existed at that time and to provide something highly desirable to complement the criminal law—that is, an effective civil take-down regime. It was an acknowledgement that, in many cases, what a victim wants is for the offensive material to be taken down. Victims welcome criminal penalties, but they don't want to wait for a lengthy criminal trial. Overwhelmingly, what they want is to have the material taken down as quickly as possible. At the time we introduced the bill, we expressly acknowledged that it addressed a shortfall or a gap in the existing regulatory framework and that it was designed to complement the criminal law. We were also very clear that the Commonwealth was concurrently progressing work with the states and territories to develop consistent national principles concerning the non-consensual sharing of intimate images. That work in the criminal law space was methodical, careful and painstaking and was done on a cooperative basis nationally.

The Labor Party, then in opposition, tried to front-run on the issue, pushing ahead with amendments that were ultimately found to be problematic. At the time, we were upfront in saying that what the Labor Party was doing had the effect of delaying both the take-down scheme and the updates to the criminal law. But, ultimately, it is a matter of record that Labor and the coalition worked together, along with the crossbench, to produce a regime that established not just civil take-downs but also updated criminal offences. These laws were endorsed across the political spectrum.

These laws established take-down powers for intimate images. In terms of criminal offences, they established the existing offences that deal with the dissemination of private sexual material online. These are laws that the Labor member for Lyons told the parliament reflect 'Labor's clear and longstanding position'. Importantly, they are also laws that deal with fake images. It is worth reading from the explanatory materials for the offences introduced in 2018, which said:

The definition for private sexual material captures material that appears to depict a person engaged in sexual activity or a sexual pose. The use of the phrase 'appears to depict' is intended to capture material that has been altered or doctored to falsely portray a person in sexual activity or a sexual pose. For example, an image of a person's face taken from an innocuous image may be transposed onto an image of another person's body in a sexual pose.

It is clear that the existing offences deal with private sexual material that is doctored, fake or simply untrue. In other words, the existing law says it is an offence to create a fake sexual image of a person, whether you do it using artificial intelligence or through your own Photoshop skills, and then distribute it online where you would ordinarily expect the image to be private. That is the current law.

So what does this bill do? As I mentioned previously, it takes the strange approach of repealing the existing offences and replacing them with new offences that appear to do the same thing. The immediate question this raises is: why? Why is Labor departing from laws which, at the time, it described as reflecting Labor's 'clear and longstanding position'? References to deepfakes and the power of generative artificial intelligence do not provide an explanation. The current laws deal with fake images just as these proposed new laws would do.

The closest thing we can find to a stated reason for what Labor has decided to do here is a single paragraph in the explanatory memorandum. It is a technical explanation. It is based on the fact that the existing offences relate to 'private sexual material', which is a defined term. It's defined to be sexual material in which there is a reasonable expectation of privacy. The explanatory memorandum for this bill says:

The issue that arises when dealing with such material under the current framework is that because the victim is not involved in the creation of the fictional 'deepfake' version of themselves, an expectation of privacy may not attach to the depiction of the victim. This issue does not arise with the new offences, which do not rely on this definition and instead turn on whether the person depicted in the material consents to its transmission.

With respect, this is an argument that does not make sense. Firstly, it is contradicted by the explanatory materials for the existing laws. It could not have been made clearer that the existing laws deal with fake images that a victim has no role in creating. Secondly, it is not consistent with the approach taken in other Commonwealth legislation. Under the Privacy Act, we attach privacy obligations to facial images that identify a person. Why would we not do the same in the Criminal Code? Thirdly, the government has not pointed to any body of evidence that the existing laws are not working. It is not immediately apparent that these new laws cover any situation that is not already covered. Fourthly, if the technical concern about an expectation of privacy is the only justification for the new laws, why has the government not simply proposed a short provision which clarifies the existing law? Why not simply make clear that it is generally reasonable for a person to expect privacy for any sexual material that includes their likeness, whether it is deepfake material or not?

The government is rightly concerned about deepfake sexual material. So is everyone. But being concerned about an issue is not the same thing as explaining why the existing law needs to change, and the government has not done an adequate or satisfactory job of providing that explanation.

That brings me to the potential unintended consequences of this bill. It does have several strange features. For example, this bill says the new offences are based on the distribution of material without a person's consent. But this bill also repeals the definition of consent. That is a very strange thing to do. Right now, the law says, 'Consent means free and voluntary agreement.' The government is getting rid of that definition. Why? Is the government suggesting that consent to share deepfake sexual material can be implied? Can consent be understood from the circumstances? How do the offences work if a person withdraws their consent? And what does that mean when you go to prosecute under these laws? Will we have victims being cross-examined about whether they consented to their image being used?

These laws are also very broad. They apply to deepfakes but they also apply to true images and to obvious fakes like cartoons. Why? There are other questions: Do these new offences apply to historical figures? How do these laws interact with child pornography offences? Some of the interactions are unclear.

It seems this law is drafted backwards. Under the current law, a prosecutor must show the material is offensive. There is no equivalent under the new law. Under the new law, it falls to the defendant to show, on the balance of probabilities, that the material is acceptable. Why?

More seriously, for these offences the government has changed the definition of 'recklessness' when it comes to consent. 'Recklessness' has a defined meaning that applies in almost every other part of the Criminal Code but not here. Why not? What are they capturing that they could not capture previously, and what are the consequences of that change?

As I have said, the coalition supports the intention of this bill. We will not oppose the bill in this place. That's unsurprising. The underlying issue that this bill addresses is the same issue that we addressed with our 2018 legislation. The coalition agrees that the use of carriage services to distribute deepfake sexual material should be a crime. That is why we made it a crime in our 2018 legislation. But we will ask that this bill be closely scrutinised, because we all have an interest in ensuring that whatever laws we end up with are fit for purpose. I thank the House.

Debate adjourned.