House debates

Tuesday, 26 November 2024

Bills

Online Safety Amendment (Social Media Minimum Age) Bill 2024; Second Reading

12:51 pm

Andrew Wallace (Fisher, Liberal National Party)

I rise to speak on the Online Safety Amendment (Social Media Minimum Age) Bill 2024 with a mix of relief and frustration. I'm relieved that this Labor government has finally legislated what the coalition, the Leader of the Opposition, the member for Flinders and I have been fighting for over a very long time, and I include in that the member for Banks, the shadow communications minister. I'm relieved on behalf of the desperate parents who have called for the government to step up on this issue, but I'm frustrated that it has taken so long, I'm frustrated with the lack of conviction from this government and I'm frustrated with the way that ill-informed proponents are seeking to conflate this legislation with other issues, such as parents' rights and privacy, which are, of course, very important.

The member for Flinders and I, along with others, have just concluded an historic inquiry into social media through the Joint Select Committee on Social Media and Australian Society. Over the course of almost a year, the committee heard testimony from over 200 families, experts, some so-called experts and victim-survivors, the majority of whom support age assurance as one tool to help keep kids safe online. The fact is that children should not be required to keep themselves safe on platforms which are inherently dangerous and which they lack the developmental capacity to navigate safely.

We know that parents are the best judges of how their children should be raised. But what parents told us throughout the course of the inquiry is that they are at a loss as to what to do and how to navigate their parenting through this journey. It is plain to see from the evidence provided to the committee that big tech cannot be trusted to self-regulate in the interests of Australian users, particularly Australian children. The eSafety Commissioner said: 'Harder edged regulation is what's necessary. I don't know that there's anyone that can credibly say self-regulation has worked.' Reset Tech Australia warned that harm happens as governments wait for self-regulation and co-regulation to fail. The simple fact of the matter is that what we have been doing is not working. We need change, we needed change and we needed it yesterday.

Now, those who would say that the status quo should continue are either living in a delusion or they are complicit. Research highlighted in the roadmap for age verification showed that nearly half of all 16-to-18-year-olds first encountered pornography before the age of 16, over a third of them through social media feeds, ads, messages and group chats. This is not the pornography of yesteryear; this is hardcore, violent, misogynistic pornography which is shaping what young people consider constitutes a normal sexual relationship. Early exposure to pornography can significantly harm a young person's sexual development and their mental health.

In a public hearing on 28 June 2024, TikTok representative Ms Woods-Joyce claimed that there is no pornography on TikTok. On the same day, Ms Antigone Davis appeared on behalf of Meta as its vice president and global head of safety. She said, 'We don't have pornography on our site.' Let me correct those statements. And it is not just pornography. Paedophiles, predators and criminal gangs are using social media to sexually exploit and abuse young Australians, especially our young boys.

In the 2022-23 financial year, the ACCCE, the Australian Centre to Counter Child Exploitation, received over 40,000—40,000!—reports of child sexual exploitation. The AFP charged 186 offenders with 925 child exploitation-related offences. In the first six months of 2024 alone, the ACCCE received 560 reports of sextortion. These are just the reported cases. What about the thousands of cases involving young men, young Australians and families who were too embarrassed to report them to the authorities? The AFP shut down over 1,800 bank accounts linked to offshore organised sextortion gangs. Evidence supplied to the committee named Meta, Snapchat, TikTok, WhatsApp, Skype, Discord, Telegram and so many other social media and digital platforms as facilitating abuse by predators. When asked how many child sexual abuse material reports were made by Australian end-users, the platforms redirected their answers, proving adept at the politicians' pivot. The obfuscation and opaque responses to questions from this parliament showed that you cannot trust big tech to keep kids safe from sexual harm.

At the same time, the use of algorithms is driving mental and physical ill-health. These social media companies push alcohol and vapes through targeted advertising. They exacerbate eating disorders and body image issues through fitspo and fad diets. They market gambling products, promote radicalisation and extremism, and facilitate foreign interference and antisemitism. We heard from the parents of eating disorder survivors and victims. We heard from recovered alcoholics and from the loved ones of those who had simply drunk themselves to death. And we heard from our intelligence agencies about the role of social media in amplifying and enabling foreign interference and social discord.

Heartbreakingly, we heard from the loved ones of those young Australians who took their own lives as a result of cyberbullying. We heard from Ali Halkic, whose son Allem tragically took his life after relentless cyberbullying by an adult perpetrator. Mr Halkic felt that, had he known more about the online danger his son was facing, he would have behaved differently in how he allowed his son access to a phone and to social media.

We heard from brave mum Emma Mason, whose daughter Tilly Rosewarne took her own life at just 15 years of age, following a relentless campaign of bullying which catastrophically escalated on social media. I spoke about Tilly during the course of the inquiry into social media, highlighting her case as an example of avoidable harm.

We've all heard the story of Dolly Everett. After a chilling and tormenting campaign of cyberbullying and physical violence, this promising 14-year-old student, who had the world at her feet, tragically took her own life. Dolly's brave parents, Tick and Kate, have worked hard through the Do it for Dolly initiative to raise awareness around cyberbullying.

After Dolly's death, I convened a meeting of the Digital Industry Group, or DIGI, the peak body for these online platforms. I left that meeting feeling like I'd just met with big tobacco last century. The platforms have consistently refused to acknowledge their role in the harm perpetrated against young Australians. I promised then that I would be a thorn in their side. Since then, I've pushed for age assurance, restrictions, transparency and liability for these big tech companies. Australians can depend on one thing: the Leader of the Opposition, the shadow communications minister, the member for Flinders, I and every member on this side of the House will hold big tech to account.

The question many have asked is: will age assurance fix these problems? Australians should be under no illusion: there is no silver bullet; there is no panacea. Keeping kids safe online will require a multipronged approach. However, we know that age restrictions and age assurance will slow down and deter some users who would otherwise have a free pass to inappropriate content and contact.

There are elected parliamentarians, corporate shills and misinformed public spokespeople who continue to spout untruths about this issue. Let me remind Australians: just because you see something on social media—even from a trusted source—doesn't mean it's true. You have been misled by vested interests at the big end of town. Age assurance is not identity assurance. It is not new. It is a tool for parents; it is not a substitute for parenting. When you prove your age to purchase alcohol, access a discount or enter an adult store, your data is not stored. You are age-verified, not ID-verified.

I want to send a huge shout-out to the member for Banks, the opposition's shadow minister for communications, who secured an incredibly important amendment to this bill. It means that, under this bill, neither the government nor any social media platform can require a user of a social media platform to provide digital ID or government ID in the form of a passport or a driver's licence. That will be expressly prohibited under this bill, and the member for Banks should receive a huge bouquet for securing that amendment. In her statement to the US Congress, Facebook whistleblower Frances Haugen said that social media companies want you to 'believe that the problems we're talking about are unsolvable'. She said:

They want you to believe in false choices. They want you to believe you must choose between connecting with those you love online and your personal privacy.

The age-assurance trial should examine all options available to address privacy concerns and protect kids online. It is a balance which can be struck without Labor's dangerous digital ID proposals. The member for Banks, who has just walked into the chamber, has done just that. He's secured that amendment, and he should be applauded.

The degree of self-interest is unbelievable: from social media platforms that want us to think this is all too hard, and from mental health groups that have been receiving money from those same platforms and that came into the social media inquiry saying, 'It's all too hard; there's nothing to see here.' It's unbelievable that mental health groups who purport to look after the welfare of Australians could be bought with 50 pieces of silver. It's an absolute disgrace. Shame on you.

Frances Haugen also said:

When we realized tobacco companies were hiding the harms it caused, the government took action. When we figured out cars were safer with seat belts, the government took action.

We know the harm caused by social media companies as a facility for foreign interference, as a platform for predators and as an amplifier for extremism and psychological distress. It is time for government to intervene, because industry has proven to be complicit, complicit in perpetrating and perpetuating harm upon our most vulnerable. As Collective Shout put so well:

We cannot allow large-scale reform to be scuttled by disagreements about the technical aspects.

I congratulate the shadow minister and I congratulate the Leader of the Opposition for their leadership on this issue. And I commend the bill to the House.
