Senate debates

Monday, 18 November 2024

Committees

Social Media and Australian Society Joint Select Committee; Report

6:16 pm

Sarah Hanson-Young (SA, Australian Greens)

On behalf of the Social Media and Australian Society Joint Select Committee, I present the committee's second interim and final reports, together with accompanying documents. I seek leave to move a motion in relation to those reports.

Leave granted.

I move:

That the Senate take note of the reports.

This inquiry was a very busy inquiry. We had a number of hearings. We heard from experts right across the breadth of the sector—from tech experts and tech companies. We heard from parents' groups. We heard from young people. We heard from mental health and other communications experts. We heard from the mainstream media representatives as well. Truth be told, the terms of reference of this special inquiry were very long and covered a large number of things. We looked at the efficiency and effectiveness of the News Media Bargaining Code. We looked at the impact that social media was having on young people. We looked at the issues of mis- and disinformation. We looked at how tech companies respond to their obligations or social responsibilities. We looked at the impacts of scams online, particularly how they impact older Australians. There was a lot packed into this inquiry.

In our final report, a lot of the attention was given to social media itself and how we make these platforms safer, less harmful and better for members of the public and users alike. I want to point out very clearly that the most interesting thing about this final report is what is not in the recommendations. What is not in the recommendations is a blunt age ban on social media. That's not what we recommended. That's not what the unanimous report from this inquiry recommended. What we did recommend was a whole raft of things that could be done to tighten the screws on the big tech companies to improve their accountability and to make sure they are legally liable and responsible here on Australian soil. They must deal with the insidious, horrible nature of their data harvesting, where they sell data off to advertisers and supercharge algorithms. We looked particularly at how that relates to young people. The report talks about the need for a levy or tax on these big tech giants. But not one recommendation in this final report says that a blunt ban on young people on social media is a good idea. Why? Because we listened to the experts, and the experts' view is that it's not. The experts' view is that this isn't how you do it.

We heard directly from parents. As a mother of a 17-year-old, I'm very well aware of the concern and anxiety that Australian parents have about how much time our kids are spending on social media, the lack of guardrails that are there and the peppering of such direct, insidious advertising that tells them what they want before they even realise that they want it. There's also the supercharging of algorithms, the suggested posts, the suggested people to follow, the sucking in of the doomscroll that happens and the rabbit warrens that we all, as users, sometimes succumb to. We know that young people are being targeted by social media companies and we need to do something about it, but the experts didn't say just kick them off. The experts said we need to force these big tech giants, who are making massive profits—billions and billions of dollars—to make platforms safer. We need to make them responsible. We need to hit these companies where it hurts, and that is their business model. That's the advertising dollar and how they use that and how they use these secret, hidden algorithms.

One of the recommendations in this report is for a duty of care on social media and digital platforms. If your business model is an online platform and you have people coming there, you have a duty of care to ensure that space is a safe place for users. That's obviously really important for young people and children, but it's actually also important for everyone. If we make a safe platform work for everyone, that is better for children as well. There is a current debate about whether the government is going to bring forward its social media ban into this parliament within the next two weeks. I just urge you all—this inquiry took hours and hours of evidence, and the evidence is really clear. Yes, we need platforms to be better; yes, they need to be regulated; and yes, we need guardrails. It's the Wild West out there, and we have to put in place some law.

But telling young people to get off YouTube, TikTok, Instagram or Snapchat isn't going to make young people safer and protect them. We know what young people are like. My 17-year-old knows how to use a VPN. They get around it. That's not my main concern and that's not what the experts' main concern was in this unanimous report. It's what happens when they turn 17. What happens if we haven't made these tech companies responsible for the environments in which they operate for everybody? What happens when your daughter turns 17 and is dropped in the deep end of the ocean with no help, no guardrails, no digital literacy and no recourse to the companies to do the right thing?

Of course mental health experts have said to us over and over again that, for particular groups of young people—those who are vulnerable, those who have mental health issues or concerns and those who feel isolated from their community and perhaps can't talk to their family about what is going on in their life—connecting online with other young people who are going through the same thing is really important, and that actually saves lives. That's an important way of ensuring they get access to information—help that otherwise wouldn't be available. We heard over and over again from the young people themselves that being able to access this information online helped them, saved them and allowed them to get the assistance that they need—or to know that a friend of theirs needed some assistance and then to go and get an adult involved and say, 'Look, my friend over here is struggling.' We've got to be better than just the kind of grab-worthy, kneejerk reactions of these blunt instruments.

I tell you what: we had these big tech giants in front of us and we drilled them. They have no qualms or care about the relationship that they have with their users. They are making big bucks. For Australian families and members of the community right across the board—and politicians hear this all the time—the social licence of these social media companies is evaporating quickly. Now is the time to strike. Now is the time for us to put in place rules that protect young people, protect all users, create a duty of care, create legal responsibilities and charge the tax that these companies should be paying, because they're sucking a lot of money out of our country and out of the pockets of families and they give very little back.

If we don't make sure that these companies have some type of onshoring of their business practices and operations, they will continue to circumvent Australian law. We heard that from the business experts. We heard that from the legal experts. We need both a duty of care and the legal foothold to hold these companies to account. We can't just beg them to do it. Do you think Elon Musk, if we ask him nicely, is going to do the right thing? No, he's not. He doesn't care two hoots. He's too busy making money and sucking up to his mate Donald Trump. He doesn't care. (Time expired)

6:27 pm

David Shoebridge (NSW, Australian Greens)

I too rise to briefly take note of the report. I really want to thank my colleague Senator Hanson-Young for her work in doing what she, the Greens and the Senate can do in holding these big platforms to account.

Let's be clear about what a social media age ban means. It's an age-monitoring plan for everyone, whether you're 13, 15, 40 or 60. It's likely to be achieved by monitoring keystrokes, scanning faces and watching your hand movements. It will result in systemic privacy invasion of everyone, and for what? The government can't even tell us what harm they're targeting here. If the harm is excessive screen time—and we know that that's harmful—then why aren't games and Netflix considered? If the harm is bullying—and we know that can be harmful online as well—then why on earth are messaging apps exempt?

The truth is this entire policy is a thought bubble from a government that's unwilling to have the hard conversations about making these online places safe; regulating big tech so that kids are safe; and educating young people and their parents on how to manage risk online, how to think critically about online content and how to set sensible boundaries based on evidence. Labor and the coalition, it seems, have given up on any kind of push towards digital literacy. No doubt it's because groups like—I don't know; let's name one at random—News Corp and the Murdoch media empire are pushing for this attack.

The government say they've spoken to parents about it, but what about the kids and teens who are actually going to cop the ban? What's their say? They have been completely silenced by both the coalition and the Albanese government. In fact, social media is where young people are getting their news. It's where they're rallying for change and building communities. They need tools to interrogate what they see and hear, and they need platforms that are designed to be safe, not just some 21st century prohibition pretending that the internet doesn't exist.

Does the Prime Minister want to explain what his government's plan is for a young person who has been banned from social media at 13, 14 or 15 and is then just launched into the same unregulated, toxic space when they turn 16? What's the plan, Prime Minister? What's your plan? There is none. It's a thought bubble.

The Albanese government might not like what young people have to say and they may want to silence their voices, but the Greens say that young people actually have a right to be seen and heard, including in this debate. Young people have a right to engage in public spaces, and, increasingly, for young people, that's online public spaces, which should be safe. If the government really wanted to look out for young people, they wouldn't say, 'Go and get a VPN and engage in an unregulated social media site as though you're doing it from San Francisco.' What they would say is that they're here to look out for young people and that they'll hold social media giants accountable for online safety. They will insist upon transparency in the algorithms, prevent kids' data being tracked and prevent them being targeted with advertisements from tech platforms that just want to commercialise the data that they rip off kids. And they would make the space safe.

And if they really wanted to make kids safe? Of course they'd take action on climate change and they'd address the housing and cost-of-living crises. If this government were serious about looking out for young people, they'd start by listening to them. I seek leave to continue my remarks later.

Leave granted; debate adjourned.