Senate debates

Monday, 19 August 2024

Bills

Criminal Code Amendment (Deepfake Sexual Material) Bill 2024; Second Reading

7:01 pm

Michaelia Cash (WA, Liberal Party, Shadow Minister for Employment and Workplace Relations)

I rise to speak on the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024. At the outset, I will say that the coalition support the intention behind this bill, but, as we've looked through the bill, through the committee process and otherwise, it has become clear that there is no clear rationale for why the bill actually needs to pass.

The bill repeals existing offence laws that make it an offence to transmit private sexual material with intent to menace, harass or cause offence. In their place, it establishes new offences for transmitting sexual material without consent. This includes material created or altered using technology. This is a one-for-one replacement. The existing laws were introduced by the coalition in 2018 with strong support from Labor. They already deal with both revenge porn and fake sexual material, now known as deepfake material.

Labor now wants to repeal and replace the existing offences that deal with deepfakes and revenge porn. In going through the process, though, Labor has given no persuasive rationale for why we should now repeal laws that it previously supported. The only explanation for the change is a reference to an unexplained legal conclusion that has been put forward by the Commonwealth Director of Public Prosecutions. The issue relates to the legal definition of the term 'private sexual material'. The term refers to persons engaging in or appearing to engage in sexual activity 'in circumstances that reasonable persons would regard as giving rise to an expectation of privacy'. In parliamentary submissions—and this is very important—the CDPP have expressed concern about how that definition deals with fake images. They said:

The issue that arises here is that, with certain deepfakes, it cannot be said that any expectation of privacy attaches to the depiction of a victim. For example, if an accused were to transpose an image of a victim's face onto a publicly available pornographic video, this would, generally speaking, not be 'private sexual material'. This is because the creator of the deepfake uses, as source material, a depiction of a part of the victim (for example, their face) with respect to which it cannot be said there would be an expectation of privacy.

The government has adopted that view, but the reasoning has never been explained. Why can it not be said that an expectation of privacy arises with respect to the image of one's face being attached to a pornographic video? In many cases, perhaps the vast majority of cases, people would feel the dissemination of a picture that shows their face attached to a pornographic image is a huge intrusion on privacy. Why would courts not arrive at that commonsense outcome? So far as we can tell, the issue has not been tested in court.

The government has cited no case authority for its conclusion, which appears to be a triumph of technical reasoning over common sense. What is worse, this conclusion appears to be directly contradicted by the explanatory materials for the existing law, which make clear that the term 'private sexual material' is intended to cover fake material. This is what they say:

The definition for private sexual material captures material that appears to depict a person engaged in sexual activity or a sexual pose. The use of the phrase 'appears to depict' is intended to capture material that has been altered or doctored to falsely portray a person in sexual activity or a sexual pose. For example, an image of a person's face taken from an innocuous image may be transposed onto an image of another person's body in a sexual pose.

In other words, the very example that the CDPP uses to show why there might be a gap in the law is the same example used in 2018 to explain what the laws do cover. This direct contradiction has never been resolved, and that is a problem because, quite frankly, on a side-by-side comparison—and that is something we've done—if you look at the existing law and this particular bill, there are significant drawbacks to the model we're now considering. I will just take you through those drawbacks. It is worth reiterating that the coalition fully supports the policy intent of this bill. Our object is to improve the legislation so we get the best outcome for Australians, and we will move amendments in that regard.

There are at least five downsides, though, to the replacement of this law. First, the new offence is inconsistent with other laws. Right now, when we use the term 'private sexual material', it has the exact same meaning in both the Criminal Code and the Online Safety Act. The bill removes that consistency, so the civil regime now deals with a different thing to the criminal regime. This means there is now an obvious risk of gaps between our civil and criminal laws when it comes to revenge porn, deepfakes and other forms of image-based abuse.

Secondly, the new offence is drafted in an exceptionally broad way that risks capturing innocuous behaviour and then reverses the onus of proof, which is of serious concern. We're unable to say with clarity precisely what types of behaviour are criminal offences under the bill, because there is no reference to an objective standard. This problem is created because the new offence will no longer be based on using a carriage service to menace, harass or offend. By losing that objective reference point, we risk criminalising things like material distributed for educational purposes and material that was consensually distributed in the past, as well as restricting political and satirical speech. There is a clear risk they will be captured by this new law.

Thirdly, even though the government has framed its new offences around the concept of consent—this is something we do need to explore in the committee stage—it is repealing the definition of 'consent'. At present there is a definition of 'consent' in the Criminal Code, which is used in relation to the sharing of private sexual material. The law currently says consent means free and voluntary agreement. But, in redesigning the existing criminal offence, this bill adopts a bizarre and contradictory approach. Even though consent will be the central element of the new criminal offence, the bill repeals the definition that explains what consent means. The explanatory memorandum says consent takes its ordinary meaning. But there is only one sentence explaining what the ordinary meaning is, and that explanation is limited to a single specific circumstance. Courts will now need to decide what consent means, so the courts and not the parliament will decide the scope of the conduct that is now criminal. This introduces uncertainty into the law, which is not where we want to be. As it stands, it would appear that, based on the drafting, we can't clearly outline the circumstances in which this new offence will apply.

The fourth issue that we need to explore in committee is that the government, for some unexplained reason, has changed the definition of 'recklessness' for this offence. Recklessness is one of the fault elements of the offence. One of the core ways that this new offence will be proven in court is by showing that an accused was reckless about whether a victim consented to sharing an image. If a person meets the standard of recklessness, they will be morally culpable—in other words, they can be punished by law. But, in just about every other part of the Criminal Code, recklessness has a standard meaning. This is by design. The Criminal Code is meant to be consistent. It lists just four standard fault elements: intention, knowledge, recklessness or negligence. For some unexplained reason, however, for this offence, the government has given 'recklessness' a different meaning. We've seen now that new words have been added which change the way the fault element applies. This is untested and inadequately explained, and, once again, we're going to have to explore this because we don't know what the consequences will be.

The fifth downside is that we increase the risk of negative experiences for victims. Under the existing offences, you don't need to prove consent or a lack thereof. This means there is far less need to cross-examine a victim about the material which shows them engaging in sexual activity, whether it's real, as with revenge porn, or fake, as with AI-generated deepfakes. However, under this new legislation, the prosecution will need to prove consent as an element of the offence. The obvious consequence is that you will have the defence challenging the prosecution on the issue of consent. In practice, this will likely mean that you're going to have victims—and they'll be overwhelmingly women—being cross-examined about the ins and outs of whether and how they consented to particular material, and this risk arises both in relation to deepfakes and also in relation to revenge porn.

One of the other things that we need to actually explore is: Who was crying out for this? Who was saying that we need to replace an existing law that deals with these issues adequately with one that is now more likely to result in victims being dragged through the justice system? Again, we need to explore what the reasons are for this. In short, instead of a clear workable offence that covers both deepfakes and revenge porn, this bill gives us now, particularly for the victims, an uncertain mess.

There are two pathways forward to deal with this legislation. The first is to accept the government's legislation as it is. This means a complete redesign of laws that are already on the statute books, even though the rationale has never been clearly explained and there are clear disadvantages. The second way forward—and I would hope that this is what the Senate reflects on—is to improve the legislation. This means acknowledging that we have legislation on the statute books right now to deal with both fake material and so-called revenge porn. It involves adding clarity by directly addressing the concern that the existing laws might not apply to deepfakes. This means adding short provisions to expressly make clear that the existing laws cover both deepfakes and so-called revenge porn. It is a simple matter of removing all doubt that the current laws do what they were always intended to do. The second option is the coalition's preferred approach. It's the option that we've put on the table.

As we've said consistently, we support the intent of this legislation. We will work constructively to ensure that deepfakes can be addressed, as we have been doing since 2018. We support and will not stand in the way of the legislation, but, as I've outlined, we have some serious concerns and we will move amendments which we think will improve the bill, given the very legitimate concerns that I have raised, particularly in relation to the cross-examination of victims and the potential for them to be dragged through the courts. The obvious consequence, as I said, of the way this legislation has now been drafted is that you will have the defence challenging the prosecution on the issue of consent. In practice, this will likely mean that you are going to have victims, overwhelmingly women, being cross-examined about the ins and outs of whether and how they consented to particular material being transmitted.

As I said, there is a way forward. We will be putting amendments up to address these issues, and I certainly ask the Senate to give consideration to the coalition's approach to addressing the legislation, which will minimise the impact, in particular, on victims.
