Senate debates

Wednesday, 21 August 2024

Bills

Criminal Code Amendment (Deepfake Sexual Material) Bill 2024; Second Reading

10:14 am

Kerrynne Liddle (SA, Liberal Party, Shadow Minister for Child Protection and the Prevention of Family Violence)

One, two, three, four seconds: that's the time it takes to create a deepfake. In four seconds what could come next can be catastrophic. Deepfake imagery can be career harming for adults, but when used by criminals against our children the consequences have been and can be deadly. This bill recognises that improvement is needed to keep up with new developments in deepfake technology, but this bill does not make the existing legislation better. Instead, it introduces new legislation with significant gaps that leave vulnerable people exposed and give criminals room to exploit. It is becoming increasingly difficult to identify deepfakes, and that makes this legislation incredibly important.

There is criticism of this bill from important stakeholders. The Law Council of Australia, in its submission, complained about Labor's process for consultation, declaring:

… expert advisory committees have not been able to consider completely all the issues raised by this Bill.

You would think the Law Council was a relevant contributor with a view worth taking very seriously, but it appears not. Australia's eSafety Commissioner has estimated deepfake imagery has soared by as much as 550 per cent year on year since 2019. Pornographic videos make up 98 per cent of the deepfake material currently online; 99 per cent of that deepfake imagery is of women and girls.

As shadow minister for child protection and the prevention of family violence, I have heard from parents whose lives have been shattered by the depravity and sheer evil of these online scammers. These criminal scammers use deepfakes and nude images to extort money from young children in a manner called 'sextortion'. Sextortion works like this. Criminals, maybe online and maybe in another country, prey on children through social media platforms and demand their money. They trick children into sending a compromising photo and threaten to leak those photographs, or the extorter threatens to send a fake photo to everyone in the recipient's contacts—schools, clubs and anywhere they've found out the young person is connected to.

Male children and young people online during the school holidays, likely with more cash or gift cards than usual, are targeted most. They are more than 90 per cent of victims. The Australian Centre to Counter Child Exploitation has seen a 100-fold increase in reports of sextortion crimes in 2023 compared to the previous year. Latest data from November 2023 shows around 300 reports of sextortion targeting children each month. Globally, sextortion is reported to have doubled from 2022 to 2023. I encourage everyone—carers, parents, grandparents, teachers—to know more about protecting our children from sextortion and to check out the ACCCE website for information.

A few months ago, News Ltd Australia brought a group of parents to Canberra as part of its Let Them Be Kids campaign, calling for children under 16 to be restricted from having social media accounts. As part of the campaign, a group of courageous parents, consumed by terrible grief, shared their stories of the loss of their children as a consequence of sextortion. The harrowing stories of these parents are real. Their stories are horrific.

Susan McLean, widely regarded as Australia's cyber cop, is quoted as saying that in her 30 years of policing she has never seen a crime type that tips mentally well young people to crisis and self-harm as quickly as sextortion does. One parent who is taking action is Wayne Holdsworth, who tragically lost his 17-year-old son, Mac, in October last year after being targeted by predators. Wayne created SmackTalk, doing all he can to reduce suicide by educating people on how to be better listeners.

Research released in June found one in seven adults, or 14 per cent, has had somebody threaten to share their intimate images. A worldwide survey involving Australia found more than 70 per cent of people don't know what a deepfake is, so we need to do more to educate Australians. We need to ensure that young people understand how wrong it is and the harm it causes. We must do more to ensure children do not become victims or, indeed, perpetrators.

In July a teenager was arrested amid allegations that the faces of 50 female students in a Melbourne school were used in fake nude images generated using AI technology. No matter how deepfakes reveal themselves, whether the images shared online were doctored or are real photos unauthorised by victims, they cause harm. As a mum, my children were not allowed to use social media until their late teens. It was very unpopular to take that position, but I saw it as essential. Putting in place controls was the right thing for our family. But, on reflection, it would have been easier with another ally in the fight to protect them. This legislation goes some way to doing that but it could and should be better.

I went online and searched 'create AI-generated image'. In seconds, mere seconds, the search result was pages and pages offering easy guides to help me create deepfakes. There is a need for education on online safety for people at all stages of life. These protections include: agreeing on boundaries; reviewing controls, privacy and location settings; talking openly about the dangers and tactics of trolls and scammers; knowing where support can be found; and reporting abuse, crime and unwelcome behaviour.

Outside the personal life of Australians, deepfakes have also had an impact on the business sector. Research released in July found almost 25 per cent of Australian businesses have experienced a deepfake information security incident within the last 12 months—25 per cent of them! This is not to say that AI is all bad. When used well, it can be used for significant good, but there has always been the potential for people in the industry with malicious intent, known as bad actors, to turn it to their advantage. We need a strong deterrent for those bad actors. When opposition leader Peter Dutton was Home Affairs minister he funded and opened the $70 million Australian Centre to Counter Child Exploitation. Since then, the ACCCE has removed more than 500 children from harm. A coalition government will double the size of the Australian Centre to Counter Child Exploitation because there is evidence of its good work. In 2018, the coalition's Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Bill established crucial powers, now exercised by the eSafety Commissioner, to keep Australian children safe.

Those opposite shouldn't be introducing weak legislation that is inconsistent with the Online Safety Act, that is too broad and that risks capturing innocent victims and placing the onus of proof on them. There is no need for legislation that amends the definition of 'consent', leaving too much room for judicial interpretation. We don't want to amend the definitions of 'recklessness' and 'unnecessary exception', and we don't want to increase exponentially the risk of harm to the victim, increasing the need to cross-examine victim-survivors.

The bill is flawed. It is well intentioned but ill-informed. The Albanese government should not be allowed to keep the bill as it presents to the chamber but instead make the necessary changes to improve the legislation, to increase safety for all Australians.

10:23 am

David Shoebridge (NSW, Australian Greens)

I first of all rise to acknowledge the work of my colleague Senator Waters in this space, endorse wholeheartedly her contributions to the chamber and credit her work on the committee. This is a bill that does a very, very small thing. It extends an existing offence about the transmission of sexual material without consent, which has been on the statute books for years and years, to also include material that was produced using technology, including deepfakes. That's what it does. You would think, from the comments from the Prime Minister down in the government, that this was some extraordinarily significant achievement from the government.

The Greens have been on record as saying that they support this change and support this incremental change to the existing offence of using a carriage service to transmit sexual material without consent. We support that, but we really could have legislated for this in an afternoon with the broad support of all parties and a very rapid committee process, because the change is, to be quite frank, extremely marginal in terms of the impact it's going to have.

One of the concerns that has been repeatedly raised in this space with my office is that the existing offence under the Criminal Code of using a carriage service to transmit sexual material without consent is almost never policed, and another is that, when women make complaints to the police at a state level, they get told it's a federal matter, and at the federal level they get told it's really a state matter. The police seem largely uninterested in investigating. There don't seem to be any good guidelines for how the police should go about acquiring the relevant information needed to find out where the offence was committed. There don't seem to be the specialist resources in place to assist state, territory or federal police to undertake the necessary investigative work to work out from what device and by whom the material was sent. None of that seems to be in place. It would be good to see the government investing in those resources, asking those questions of the Australian Federal Police and, through the standing committee on police ministers and the attorneys-general, getting those questions asked of state and territory police. That's the hard work of government that needs to be done in this space, but we're not getting that. We got this bill as though it's a solution.

The concerns that I have are that we'll pass the law and the government will say they've taken this action and that, somehow, this passing of the law itself will make women safe in this space, but history would suggest that that, of itself, won't do it. The policing culture needs to change to respect women when they come and make complaints, and I would refer the chamber to the report that was released last week on missing and murdered First Nations women and children and the additional comments to that report given by my colleague Senator Cox and me, which pointed out the deep disrespect police can show women when they're reporting their missing children, the physical assaults that they have received or the violent threats that they've received. Does anyone seriously think that there's a huge gap—that there's suddenly a totally different set of police forces across the country who are respectfully responding to women when they talk about their images being shared without consent online? I don't believe there is any meaningful difference.

So, yes, let's amend this bill. Let's amend this law. Let's criminalise the non-consensual sharing of deepfakes. Let's do that, where it's involving sexual material without consent. Let's do that, but don't anybody pretend that the problem's fixed with the passing of this bill. Don't even pretend that it's fixed with passing this bill.

I also move the second reading amendment that's been circulated in my name—as I'm corrected by my august leader in this chamber, I foreshadow that I will be moving a second reading amendment as circulated in my name.

10:29 am

Murray Watt (Queensland, Australian Labor Party, Minister for Agriculture, Fisheries and Forestry)

I thank my parliamentary colleagues for their contribution to debate on the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024. This bill will modernise and strengthen criminal offences targeting the non-consensual sharing of artificially generated and real sexually explicit material online.

Digitally created and altered sexually explicit material that is shared without consent is a damaging and deeply distressing form of abuse. The rapid development of artificial intelligence has allowed perpetrators to create extremely realistic but false depictions of real people engaging in sexually explicit acts, known as deepfakes, and share these online. This insidious behaviour degrades, humiliates and dehumanises victims. Such acts are overwhelmingly targeted at women and girls and perpetuate harmful gender stereotypes and gender-based violence. This bill will strengthen the criminal law to protect vulnerable people from online harm and abuse and to hold perpetrators to account.

The government welcomes the unanimous recommendation of the Senate Legal and Constitutional Affairs Legislation Committee that the Senate urgently pass the Albanese government's bill to ban the sharing of non-consensual, deepfake sexually explicit material.

The Senate Legal and Constitutional Affairs Legislation Committee made three other recommendations. The first is:

… that the Attorney-General reviews the threshold outlined in proposed subsection 474.17AA(1) after two years of the Bill's operation.

The second is:

… that the Attorney-General continues work already underway via the Standing Council of Attorneys-General in relation to development of harmonised offences … for the:

The third is that education ministers continue to progress their work to strengthen respectful relationships in schools. There is existing work underway to implement version 9 of the Australian curriculum and there's work by schooling systems to implement age and developmentally appropriate programs on consent and online safety within the context of respectful relationships. The government is considering each of these recommendations and will provide its response in due course.

The bill amends the Criminal Code Act 1995 to modernise and strengthen criminal offences for the non-consensual sharing of simulated and real sexual material online. It will create a new criminal offence that applies where a person transmits sexual material depicting an adult person, using a carriage service, and the person knows the person depicted does not consent to the transmission of the material or is reckless as to whether the other person consents. The new offence will carry a maximum penalty of six years imprisonment.

The bill also introduces two new aggravated offences. The first aggravated offence applies where the person transmitting the material is also responsible for creating or altering the material. The second aggravated offence applies where a person has already been found liable on three occasions for similar conduct at the civil standard under the Online Safety Act 2021. These aggravated offences carry a maximum penalty of seven years imprisonment to reflect the seriousness of this offending. The bill sets out specific defences to ensure these offences are targeted and proportionate.

I might address some matters that have been raised in the debate, before summing up. The bill ensures criminal offences relating to non-consensual sharing of sexual material apply to both real and fake material, including deepfakes. The bill repeals previous offences in the Criminal Code dealing with non-consensual sharing of private sexual material. Those existing offences do not adequately cover the situation where deepfake sexual material is shared online without consent. This was the clear advice of the Commonwealth Director of Public Prosecutions to the Senate Legal and Constitutional Affairs Legislation Committee's inquiry into this bill and to the Parliamentary Joint Committee on Law Enforcement in December 2023.

The new criminal offences remedy this defect and create a strong framework to criminalise the non-consensual sharing of sexual material online. The new criminal offences are based on a consent model to better cover both artificial and real sexual material. Consent is not defined in the legislation and relies on its ordinary meaning, which is understood to be 'free and voluntary agreement'. A person is taken to have consented if the person freely and voluntarily agrees to the sharing of the material. It would not apply where consent was obtained through fear, force or deception.

I want to thank Senator David Pocock for his contribution to the debate. The second reading amendment proposed by Senator Pocock relates to electoral fraud and scams. This is outside the scope of this bill, which is focused on criminalising the non-consensual sharing of sexually explicit material online. As such, the government does not support this amendment, although we remain committed to continuing to improve the transparency and accountability of Australia's electoral system. This includes asking the Joint Standing Committee on Electoral Matters to consider how governments and the community can prevent or limit inaccurate or false information from influencing electoral outcomes, particularly with regard to artificial intelligence, foreign interference, social media and mis- and disinformation.

I also thank Senator Waters and the Greens for their contribution. This bill creates strong criminal offences for the transmission of sexually explicit deepfakes using a carriage service that are in line with the Commonwealth's power to legislate. There is an aggravated offence where the person transmitting the material is also responsible for creating the material. Criminalising the mere creation of such material is a matter for the states and territories. The Commonwealth has engaged with states and territories at National Cabinet and in the Standing Council of Attorneys-General on measures to ban such material. In light of this, the government does not support the second reading amendment proposed by Senator Waters.

In conclusion, this bill will assist to protect the community from the growing trend of emerging online harms and will send a clear message to perpetrators that serious penalties apply to those who create and distribute real or fake sexual material without consent. This is another sign that the Albanese government is committed to keeping Australians safe from technology facilitated abuse.

I commend the bill to the chamber.

Hollie Hughes (NSW, Liberal Party, Shadow Assistant Minister for Mental Health and Suicide Prevention)

The question is that the second reading amendment moved by Senator Waters be agreed to.

10:43 am

David Pocock (ACT, Independent)

I move my second reading amendment:

At the end of the motion, add ", the Senate:

(a) notes that:

(i) deepfakes present inherent risks including the spread of misinformation, threats to privacy and security, altering election outcomes and misleading and deceiving voters,

(ii) Australians have lost over $8 million to scams linked to online investment trading platforms, often using deepfakes,

(iii) the Australian Electoral Commissioner has warned that Australia is not ready for its first artificial intelligence election that is coming within the next 12 months,

(iv) deepfakes of Australian politicians have already been used, and

(v) the Commonwealth Electoral Amendment (Voter Protections in Political Advertising) Bill 2023, introduced by the Member for Warringah on 13 November 2023, would ban the use of deepfakes in political advertising; and

(b) calls on the Government to:

(i) ban the use and transmission of deepfakes in political advertising prior to the next federal election, and

(ii) further regulate the use and transmission of deepfakes to address the risks they present".

Hollie Hughes (NSW, Liberal Party, Shadow Assistant Minister for Mental Health and Suicide Prevention)

The question is that the second reading amendment moved by Senator David Pocock be agreed to.

10:45 am

David Shoebridge (NSW, Australian Greens)

I move the second reading amendment circulated in my name:

At the end of the motion, add ", but the Senate:

(a) notes that:

(i) deepfakes present serious risks to democracy including through the spread of misinformation, and misleading and deceiving voters,

(ii) the next federal election will likely see the proliferation of political deepfakes, with authorities including the AEC currently not having powers to remove these or penalise those involved,

(iii) jurisdictions overseas have begun to grapple with the challenge posed by political deepfakes including South Korea where the National Assembly prohibited AI-generated deepfake content in political campaigning within 90 days of their April 2024 election,

(iv) deepfakes of Australian politicians have already been created and circulated; and

(b) calls on the Government to work across Parliament to ban political deepfakes in the pre-election period leading up to the next federal election with an exemption for clearly labelled deepfakes used for parody, satire, criticism, news reporting or legal proceedings".

Hollie Hughes (NSW, Liberal Party, Shadow Assistant Minister for Mental Health and Suicide Prevention)

The question is that the second reading amendment on sheet 2767 moved by Senator Shoebridge be agreed to.

10:48 am

Hollie Hughes (NSW, Liberal Party, Shadow Assistant Minister for Mental Health and Suicide Prevention)

The question now is that this bill be read a second time.

Question agreed to.

Bill read a second time.