Senate debates

Wednesday, 21 August 2024

Bills

Criminal Code Amendment (Deepfake Sexual Material) Bill 2024; Second Reading

10:14 am

Kerrynne Liddle (SA, Liberal Party, Shadow Minister for Child Protection and the Prevention of Family Violence):

One, two, three, four seconds: that's the time it takes to create a deepfake. In four seconds, what comes next can be catastrophic. Deepfake imagery can be career-harming for adults, but when used by criminals against our children the consequences have been, and can be, deadly. This bill recognises that improvement is needed to keep up with new developments in deepfake technology, but it does not make the existing legislation better. Instead, it introduces new legislation with significant gaps that leave vulnerable people exposed and that criminals can exploit. It is becoming increasingly difficult to identify deepfakes, and that makes this legislation incredibly important.

There is criticism of this bill from important stakeholders. The Law Council of Australia, in its submission, complained about Labor's process for consultation, declaring:

… expert advisory committees have not been able to consider completely all the issues raised by this Bill.

You would think the Law Council was a relevant contributor with a view worth taking very seriously, but it appears not. Australia's eSafety Commissioner has estimated deepfake imagery has soared by as much as 550 per cent year on year since 2019. Pornographic videos make up 98 per cent of the deepfake material currently online; 99 per cent of that deepfake imagery is of women and girls.

As shadow minister for child protection and the prevention of family violence, I have heard from parents whose lives have been shattered by the depravity and sheer evil of these online scammers. These criminal scammers use deepfakes and nude images to extort money from young children in a practice called 'sextortion'. Sextortion works like this. Criminals, maybe online and maybe in another country, prey on children through social media platforms. They trick children into sending a compromising photo and then demand money, threatening to leak those photographs, or the extorter threatens to send a fake photo to everyone in the recipient's contacts—schools, clubs and anywhere they've found out the young person is connected to.

Male children and young people who are online during the school holidays, likely with more cash or gift cards than usual, are targeted most; they make up more than 90 per cent of victims. The Australian Centre to Counter Child Exploitation saw a 100-fold increase in reports of sextortion crimes in 2023 compared to the previous year. The latest data, from November 2023, shows around 300 reports of sextortion targeting children each month. Globally, sextortion is reported to have doubled from 2022 to 2023. I encourage everyone—carers, parents, grandparents, teachers—to know more about protecting our children from sextortion and to check out the ACCCE website for information.

A few months ago, News Ltd Australia brought a group of parents to Canberra as part of its Let Them Be Kids campaign, calling for children under 16 to be restricted from having social media accounts. As part of the campaign, a group of courageous parents, consumed by terrible grief, shared their stories of the loss of their children as a consequence of sextortion. The harrowing stories of these parents are real. Their stories are horrific.

Susan McLean, widely regarded as Australia's cyber cop, is quoted as saying that in her 30 years of policing she has never seen a crime type that tips mentally well young people into crisis and self-harm as quickly as sextortion does. One parent who is taking action is Wayne Holdsworth, who tragically lost his 17-year-old son, Mac, in October last year after Mac was targeted by predators. Wayne created SmackTalk, doing all he can to reduce suicide by educating people on how to be better listeners.

Research released in June found one in seven adults, or 14 per cent, has had somebody threaten to share their intimate images. And if, as a worldwide survey involving Australia found, more than 70 per cent of people don't know what a deepfake is, then we need to do more to educate Australians. We need to ensure that young people understand how wrong it is and the harm it causes. We must do more to ensure children do not become victims or, indeed, perpetrators.

In July a teenager was arrested amid allegations that the faces of 50 female students at a Melbourne school were used in fake nude images generated using AI technology. No matter how deepfakes reveal themselves, whether the images shared online were doctored or are real photos unauthorised by victims, they cause harm. As a mum, I did not allow my children to use social media until their late teens. It was very unpopular to take that position, but I saw it as essential. Putting in place controls was the right thing for our family. But, on reflection, it would have been easier with another ally in the fight to protect them. This legislation goes some way towards doing that, but it could and should be better.

I went online and searched 'create AI-generated image'. In seconds, mere seconds, the search result was pages and pages offering easy guides to help me create deepfakes. There is a need for education on online safety for people at all stages of life. These protections include: agreeing on boundaries; reviewing controls, privacy and location settings; talking openly about the dangers and tactics of trolls and scammers; knowing where support can be found; and reporting abuse, crime and unwelcome behaviour.

Outside the personal lives of Australians, deepfakes have also had an impact on the business sector. Research released in July found almost 25 per cent of Australian businesses have experienced a deepfake information security incident within the last 12 months: one in four of them. This is not to say that AI is all bad. When used well, it can be used for significant good, but there has always been the potential for people with malicious intent, known as bad actors, to turn it to their advantage. We need a strong deterrent for those bad actors.

When opposition leader Peter Dutton was Home Affairs minister he funded and opened the $70 million Australian Centre to Counter Child Exploitation. Since then, the ACCCE has removed more than 500 children from harm. A coalition government will double the size of the Australian Centre to Counter Child Exploitation because there is evidence of its good work. In 2018, the coalition's Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Bill established crucial powers, now exercised by the eSafety Commissioner, to keep Australian children safe.

Those opposite shouldn't be introducing weak legislation that is inconsistent with the Online Safety Act, that is too broad and that risks capturing innocent victims and placing the onus of proof on them. There is no need for legislation that amends the definition of 'consent', leaving too much room for judicial interpretation. We don't want to amend the definitions of 'recklessness' and 'unnecessary exception', and we don't want to exponentially increase the risk of harm to victims by increasing the need to cross-examine victim-survivors.

The bill is flawed. It is well intentioned but ill informed. The Albanese government should not be allowed to keep the bill as it is presented to the chamber; it should instead make the necessary changes to improve the legislation and to increase safety for all Australians.
