House debates

Tuesday, 26 November 2024

Bills

Online Safety Amendment (Social Media Minimum Age) Bill 2024; Second Reading

6:36 pm

Stephen Bates (Brisbane, Australian Greens)

Prohibition doesn't work. Prohibition is not how you make social media platforms safer or empower young Australians to navigate the online world in a safe way. The Online Safety Amendment (Social Media Minimum Age) Bill is not going to resolve the root causes of online harm. It's simply not going to work. All of us want to see more accountability for social media platforms and greater transparency for users, but this bill provides neither.

Let's not forget how rushed the process has been. The government allowed a consultation period of just 24 hours on the bill. The Senate hearing for the bill was only three hours, and young people were essentially locked out from having their voices heard because of this process. We absolutely need to protect young people online, especially against the targeting, harvesting and selling of their data. A blanket ban, though, is not the way to do it.

The recent parliamentary inquiry into social media heard over and over that an age ban will not make social media safer for anyone. It's complicated to implement, and it will have unintended consequences for young people. Yet here we are again, with the government and the coalition teaming up, ignoring the evidence and ramming this bill through in the final sitting week of the year.

Why won't it work? This bill inserts a new definition of 'age-restricted social media platforms' which effectively puts a blanket ban on all social media platforms. The government has indicated that some platforms will be exempt, but we don't have the detail before us that we need to properly scrutinise this legislation. Leaving exemptions to regulations puts a lot of trust in the follow-through, and I'm not confident that it's going to be done effectively. The government themselves acknowledge this is not a perfect system. The Prime Minister has said:

We know some kids will find workarounds, but we're sending a message to social media companies to clean up their act.

The reality is that many kids, not just some, will find a way to access these platforms anyway, and social media companies don't actually have to change anything about how they apply their algorithms as a result of this bill.

The bill also doesn't prescribe how platforms will monitor and enforce an age ban. We expect this will require some sort of age assurance, not just for young people but for everyone. The government already announced funding for an age-assurance trial in the budget but has only just awarded the tender. Essentially, the trial on a fundamental aspect of this bill has not even started yet. Sharing important data with these platforms, whose profit model is built on selling data profiles to advertisers, is a very popular idea not just among young people but right across my entire community!

Let's not pretend that young people aren't going to find a way to access these platforms. They are smart; they will figure it out. I actually think of a time when I was 13, and I would come home from school and spend too much time playing on my computer; I wouldn't finish my homework. My dad installed a timing program on my computer so it would switch off after an hour, and I couldn't play The Sims for too long. It took me 10 minutes to figure out how to get around that. The answer there was to work with me to do my homework, not just ban the game that I was using.

A blanket ban on social media also runs the risk of driving young people into even less safe online spaces. When that happens, it becomes impossible to know what is going on and how to help those who need help when something goes wrong.

Many young people can have positive online experiences as well. Young people get a lot of good information and valuable social networks and support from social media, especially isolated young people or young people from marginalised backgrounds who may rely on it for their social contact. Why punish young people for the inaction of publishers and tech companies? This ban will create an Australia that says young people can go to prison at the age of 10 but can't go on Instagram until they are 16.

What should we be doing? Experts are actively calling for alternative solutions that address the root causes of the problems of online safety: regulating the platforms themselves. The government has committed to legislating a duty of care, but that is not part of this legislation. If the government can rush these laws through, why can't they implement the duty of care they promised or take measures that actually make platforms safer for everyone, like banning platforms from collecting, selling and exploiting young people's data?

The EU has the Digital Services Act, which includes a number of protections for young people, such as banning the use of profiling to target ads at children and imposing obligations for children's safety that providers must abide by. There are practical solutions here that we should be looking at: a ban on the targeting, harvesting and selling of young people's data; a digital duty of care for platforms; EU-style guardrails to limit the toxicity of algorithms and extreme content; the ability for users to turn down and opt out of unwanted content; the full release of the Online Safety Act review; and investment in education for young people and their families to help develop digital literacy and online safety skills and equip them with the tools and resources they need for positive and responsible online use. These actions would help tackle the root issues of social media. This is what we should be doing, not legislating a ban on access with the apparent expectation that everything is magically okay online the second you turn 16.

An age ban alone will not make the platforms safer or age appropriate, nor will it change the culture that drives the unsafe behaviours people are targeted with on these platforms. Instead of banning young people altogether, we need to tackle the predatory business models of the tech giants. That includes the algorithms that fuel extremism and mental health issues. The government's own online safety expert, the eSafety Commissioner, has recommended a multipronged approach that encourages platforms to be safe by design. Australia's Human Rights Commissioner has voiced concerns about the restrictions in this legislation and about its rushed time frame as well. If the government wants to protect the safety of young people, they should be looking to stop platforms harvesting young people's data and targeting them with algorithms and advertising to make massive profits. All users should have the ability to switch off, reset or turn down the algorithms that push unwanted content into their feed.

Privacy reforms are long overdue. The protection of users' data is vital to keeping people safer on- and offline. There is growing concern about the unabated use of users' data by tech companies to train their AIs without consent, knowledge or compensation. In the EU, the likes of Meta have been forced to provide an opt-out option for users, at a minimum. Australia must force companies to do the same here, because prohibition doesn't work; it hasn't before, and it won't now or into the future. If we are serious about addressing the issues that have arisen because of social media, then we must tackle them at the source. Guardrails, digital duties of care, stopping the targeting and harvesting of young people's data—all of these are far better approaches than a blanket ban. Young people will find ways around a ban, and then what? We may create a situation where young people are driven into even worse online spaces, and that is something that nobody wants.

As the youngest person in this chamber and one of very, very few people in this place who grew up with this technology and with social media, I can say that change is needed, but this bill is not it. It shows a fundamental misunderstanding of how the internet works and how young people engage with it. So my message to the government is this: bin this bill, talk to young people and come back. Young people are painfully aware of how algorithms work and how they impact them and their social circles. Listen to young people. Listen to the experts. You would come back with a much better bill.

6:46 pm

Elizabeth Watson-Brown (Ryan, Australian Greens)

I know we all worry about what kids are exposed to on the internet. Extreme content is everywhere, online bullying is worsening and kids can gamble with the click of a button. I get it; I'm a grandparent. I want to protect my grandchildren too, and I understand why so many people would think a social media ban would be a good idea. We all want to protect our kids, to shield them from harm and to keep them safe from the dangers of the world for as long as we possibly can. It's natural. The internet is a wild and often unregulated place. We feel we have very little control over what our children see or do online.

But the unfortunate reality of this flawed bill—the Online Safety Amendment (Social Media Minimum Age) Bill 2024—is that it won't actually do that. It won't protect our kids. It likely won't even stop them from using these platforms or seeing the harmful content we so desperately want to shield them from. What it will do, dangerously, is create the illusion of action while not addressing the real dangers of the online world. I know many parents want something—anything—to be done, but this bill is just not the solution. This bill shifts the blame. Instead of holding billion-dollar tech giants accountable for their harmful practices, it targets kids and families. It says to parents, 'You figure it out,' while these corporations keep profiting off the harm that they do. Banning young people from these platforms doesn't stop them from accessing harmful content, it doesn't stop tech giants from exploiting their data and pushing toxic algorithms, and it certainly doesn't fix the root problem.

The government claims this legislation is world-leading. Yes—but, embarrassingly, what it is leading the world in is poorly thought-through legislation. It is leading the world in legislation uninformed by evidence. It is leading the world in legislation so riddled with privacy and safety risks that it'll likely do more harm than good. The minister herself seemed underwhelmed by her own legislation when trying to defend it in a response to a question in question time this week.

The joint select committee inquiry into social media heard months of evidence from experts, parents, young people and organisations, and their overwhelming majority recommendation was not to impose an age ban. Other countries have tried similar measures, and they've failed abysmally. South Korea, for instance, rolled back their shutdown law in 2021 because it simply did not work. The government's response? Ignore the experts and the months of evidence, and instead rely on a 16-page evaluation report by the Office of Impact Analysis—a report widely criticised as thin on evidence. It even cites a study that doesn't support banning children under 16. In fact, the study's co-author has publicly stated that the government has misunderstood the findings, which is completely unsurprising from a government that has made a name for itself comprehensively misunderstanding the real needs of everyday Australians, hanging them out to dry while corporations are continuously let off the hook. This proposed 'fix' to online harms for young people is just more of the same.

I have heard from many Ryan parents who, of course, want their kids to be safe online; we all do. But they know, as do the experts, that banning kids from social media is not the answer. Kids are smart. They're smarter than most of us in this place when it comes to navigating social media. They will find ways around these bans.

One of the most glaring issues with this bill is privacy. While the legislation prohibits platforms from using data collected for age verification for other purposes, it doesn't change the fact that it requires young people and everyone else to share sensitive information with platforms that have a long history—a bad history—of exploiting user data for their own profit. Really, who trusts social media companies with their data? Few do, and that's for very good reason. Platforms like Facebook have repeatedly shown that they will prioritise profit over privacy, often at the expense of their most vulnerable users. Yet Labor proposes we hand them more of our personal information. It's insanity!

Prohibition isn't safe. Prohibition is a bandaid that fails to address the fundamental problem, which is the toxic business models of tech giants. These platforms profit from harmful algorithms that push extreme content, exploit personal data and target young people with predatory advertising. It's their business model. Experts have made it clear that these are the root causes of online harm.

The government has promised exemptions for services like headspace, Kids Helpline and Google Classroom, but these exemptions rely on future regulations; they're not even in this bill. If those regulations aren't passed in time or if a new government chooses not to follow through, kids could lose access to online spaces designed to actually help them and educate them.

The loopholes in this bill are so huge you could drive a truck through them. Kids will always find workarounds; the Prime Minister has already admitted as much. So what are we really achieving here? We're not protecting kids. We're driving them towards less regulated, more dangerous corners of the internet.

What about the unintended consequences for young people's mental health and wellbeing? Cutting them off from social media doesn't eliminate the challenges they face. It isolates them from their peers, their communities and vital support networks. Online spaces can be dangerous, but they can also be lifelines. Rather than banning young people from these platforms, we should be empowering them with the tools to navigate these spaces safely and confidently.

We need education. The government should fund programs that teach digital literacy and online safety skills to young people and their families. Prohibition doesn't teach kids how to be safe online. It just sort of pushes the problem out of sight. Social media platforms have become breeding grounds for harmful content, predatory advertising and unsafe practices. This flawed bill doesn't solve those problems. It's being rushed through, and its consequences will be far-reaching and deeply harmful not just for young people but for everyone who uses the internet.

6:53 pm

Zali Steggall (Warringah, Independent)

I rise to speak on the government's Online Safety Amendment (Social Media Minimum Age) Bill 2024. While I accept that there is a goal of dealing with online harms and protecting young people, I think this is incredibly misguided. It is a window-dressing attempt by the government to look like it is doing something about harm without actually doing anything meaningful. I am concerned this legislation will backfire by driving the harms of social media underground, while ultimately not preventing young people from accessing social media; they will find a way. What it will do is fail to hold social media companies accountable for moderating harmful content, which will still proliferate on these platforms.

The potential dangers of social media are undeniable. We're all concerned. I think we've all, in this place, been exposed to the darkness and the black hole of negativity that can exist there, and to the anonymity of people operating through social media who think it's a platform on which they can speak and behave in the most atrocious ways. But that doesn't get solved by just banning young people under 16 from these platforms. It gets solved by imposing a duty of care on these platforms, making them responsible for harmful content. That is where the government should be stepping up to the plate and doing what may be hard but is necessary if you want to be a government that is taken seriously on protecting young people and all users from harm.

There's no doubt we need to get the balance right. Of course, protecting children is so important. But let's get real about the average age in this place. We are in a whole different age category. Young people have grown up online. For many, social media is not just a pastime; it's an important communication tool that connects them with peers and the wider community. It's been really interesting to listen to some of the justifications by members of the government, including the Prime Minister, about how they want to see kids back on the footy field and playing and all of those things. Of course we want them to be active. But the suggestion that banning social media will somehow immediately result in an uptick in physical activity or social interaction, without additional measures or any real research into how that is even going to happen, is, I think, taking Australians for fools, as is the way this legislation has been drafted and presented and the time frame in which it has been put forward.

We also know that young people find their communities through social media. It's been fascinating for me, as a mum of young adults now, to watch them being online, to be concerned about the harms and influences but also to observe the benefits, seeing how through certain platforms they have had the opportunity to find their peers, find their tribe and feel connected. Those are all things that are at risk of being lost. Young people don't just access services through the platforms the minister indicates will be exempt. They also access information and services through their peers and friends relaying their experiences. That will all be lost.

I don't dispute there is a dark side to the internet and social media that urgently needs to be addressed. But we also need to be really clear. Young people tell me about the positive ways they engage on these platforms. They build their communities, they explore creativity, they start businesses, they learn, they get an opportunity to see beyond the boundaries of just the national issues or legacy media's take on events and they can connect with young people all over the world. This is something that has been unique to their generation. We know it also allows them to stay connected with family and friends. As a grandparent, how do you stay connected with a young person? Actually, it's through social media. It gives you a window into their world that you don't have in any other way.

Sadly, the government is not respecting the voices of young people here. I think it's telling how little those voices are reflected in the government's speeches. Instead, it is people in this place imposing their will on young people. Rather than representing them, they are imposing their will on them. It's a blunt policy tool that ultimately punishes young people while letting the perpetrators of harmful content, and the platforms that host it, completely off the hook.

The bill seeks to prevent users under the age of 16 from having a social media account by relying on social media companies to take reasonable steps to assure the age of their users. In all my queries to the minister and the government as to what those reasonable steps are and how this age-assurance process will work, they've said they don't know. This is a complete unknown. This legislation is completely putting the cart before the horse. The government has no idea how this is going to work, and yet it is taking up valuable time.

The bill's intervention is well meaning, as I said, and it acknowledges the very real harms of online platforms, which I too have concerns about. But the effectiveness of this bill is very much in doubt, and there are serious concerns about its unintended consequences. I move:

That all words after "not" be omitted with a view to substituting the following words:

"desire to give the bill a second reading and notes:

(1) the Age Assurance Technology Trial, conducted by the Department of Infrastructure, Transport, Regional Development, Communications, and the Arts, has not yet been finalised and further debate on the bill should only occur when a comprehensive report has been presented to Parliament with the results of the trial;

(2) that the government is wasting the Parliament's time on a bill where the implementation is unclear and the effectiveness remains disputed by experts; and

(3) that the government is being selective on the harms they wish to protect young people from by prioritising the valuable resources and time of Parliament to pass a Bill that is not coming into effect until 2026, instead of introducing and progressing legislation that would restrict gambling advertising, despite having committed to do so, and having been unanimously recommended by a multipartisan House of Representatives committee and having overwhelming public support".

This bill should not be rushed through this parliament the way it has been. We should have a comprehensive understanding of how the measures can be implemented and the unintended consequences of the legislation. We know that the government has just awarded a tender for the age-assurance trial which will inform advice to the government and eSafety Commissioner on the implementation and enforcement of the bans. I ask the government to not call a vote on the second reading until the final report with the results of the age-assurance trial has been presented to parliament, so that parents and users can be informed.

The government is being selective about the harms it wishes to protect young people from, and it is prioritising a blunt social-media ban over protecting young people from many very real harms, such as gambling advertising and the way the algorithms work. Gambling advertising poses a significant concern for young people, and there are well-documented links to increased gambling participation. So the government is absolutely being selective about which harm it is choosing to prioritise, and I don't think it's a coincidence that it was when the government was under the pump about introducing a ban on gambling advertising that, all of a sudden, we got the posturing and the big announcement around an online age ban. The ban on gambling advertising was unanimously recommended by a multipartisan committee following an in-depth inquiry into the harms of gambling. The government has accepted that recommendation and yet, to date, has refused to take action.

On top of this, we know that polling shows that over 70 per cent of Australians want a ban on gambling advertising. If you want to win the favour of voters, act on banning gambling advertising. The message couldn't be any clearer. Instead, the government is focusing on a minimum-age bill, which was originally drafted, as I said, just in September. It's a distraction from their promise to ban gambling advertising. It ultimately won't be implemented until 2026, so why is this being rushed through without proper scrutiny, with only a couple of days in this parliament? This is bad government and bad governance.

When the bill was introduced last week, it was the first time we were able to read and understand the details of the proposed legislation. Now, in the last week of parliament, the government is expediting this bill and limiting the opportunity for parliamentary scrutiny and oversight of its effects.

We have serious concerns around privacy consequences. Ultimately, this bill will impact all users. It's not just the under-16s that will be banned. All users of social media will, to some extent, have to be assessed by those platforms as to whether or not they fall foul of it. What we know is that this bill requires much more scrutiny.

Last week, the bill was sent to the Senate Environment and Communications Legislation Committee. The committee accepted submissions for only 24 hours. If that is not a joke when it comes to public consultation, I don't know what more can be said. In that time, the inquiry was inundated with over 15,000 submissions. Clearly the public feels strongly about this issue—even more reason the bill should be subjected to proper scrutiny. Only yesterday morning were the committee able to hear from the community on the wide-ranging impacts of the bill. The committee will publish their recommendations today, but here we are debating the bill and shortly we'll be asked to vote on it. This is all only five days after the bill was sent to committee.

The government has even acknowledged that this is a short-term approach to protect young people whilst they continue to work on longer-term solutions for protecting people from harms online. The member for Goldstein has already introduced to the parliament her private member's bill that addresses many of those harms. If the government is not capable of coming up with proper legislation, then let's adopt the work that the member for Goldstein has done and get on with the real protection of people online through a duty of care.

There are no details from the government about any of their plans to legislate further protections. It's, 'Trust us; we've got this. We'll get to this.' But all we've got is a bandaid solution of just imposing a ban on under-16s.

If the government were serious about safeguarding young people from harms online, they could have introduced a temporary ban or fast-tracked work to implement a statutory duty of care for online platforms. Start dealing with bots online. Start dealing with anonymous accounts. Force some accountability and some monitoring of harm. Do something about the algorithm.

So I ask the government why it is prioritising this legislation and rushing it through parliament, knowing that it won't be implemented for some time and that it doesn't even have a proper process behind it. In contrast, acting on gambling advertising is something that needs to be done.

When we talk about harm from online platforms, we know it's ever present, and certain groups, such as children and young teenagers, are at greater risk of this harm. Young people more than ever are feeling isolated and excluded, and some teenagers who use social media platforms have regularly reported lower life satisfaction than those who don't. But we also know that others who feel isolated find their tribe and find great comfort in social media. We know that online platforms expose young people to cyberbullying, hate speech and online abuse, as well as harmful content.

All these potential harms are not to be understated, and they create a real and ongoing threat to young people's mental health. I agree that online platforms must be made to be responsible and to keep everyone safe on their platforms. They should be forced to promote respectful, honest, authentic content, a responsibility that these platforms often fail to fulfil. But nothing about this proposal does this. A blanket ban on social media is not the answer. In fact, the Butterfly Foundation, who represent Australians impacted by eating disorders and body image issues, have stated that a blanket ban takes the onus off platforms to do better and that it could negatively impact people looking for help who are socially and physically isolated. In restricting access for those under 16, the government has neglected the opportunity to regulate platforms to materially improve online safety for everyone. So I really ask the government to do better. Surely you can do better than this, than trying to just grab a headline and look like you're doing something about online harm when really you're not.

I appreciate that for young parents this is incredibly distressing, and it's easy to think that banning it will be the solution. The reality is that it's just going to push it underground. It's still going to happen. I have a modern family of five. They're all adults now, but I know how tech savvy they are and what they do. For me, what is needed is legislation that would allow them to reset their default settings and that would make sure that the algorithms are much safer and that there is a responsibility and a duty of care. These are all things that would make a difference.

Of course, we also need greater education. Young people, and all users, need to understand how to be safer and more responsible digital citizens. This is an important piece of the puzzle to keep young people safe, but the government is not focusing on that—on what could actually be done.

I think of where kids have found their tribe online, for example through School Strike 4 Climate. Young people's lives have been online, and social media has played an important role in amplifying young people's voices when they might otherwise have been excluded, with their voices not being reflected in mainstream media. For example, Lucy Flynn is a 14-year-old from my electorate who, after recovering from an eating disorder, decided to make her own petition calling on the government to provide more public hospital beds for the treatment of those with eating disorders. She harnessed the power of social media to tell her story and amplify the issues in eating disorder funding. So online platforms can provide a way for young people to reach audiences outside their geographical communities, an avenue for youth leadership on issues that matter to them, and the ability for young people to feel that they have control over their future. The government don't even want to be saddled with a duty of care to young people when it comes to the environment, and now they want to take away young people's capacity to mobilise and to have a voice when it comes to climate issues. School Strike 4 Climate is a classic example, where thousands of young people came together to march on the environment minister's office and demand real and substantive climate action. This provided an invaluable opportunity for young people to have a voice and be heard in a way— (Time expired)

Ian Goodenough (Moore, Liberal Party)

Is the amendment seconded?

Andrew Wilkie (Clark, Independent)

I second the member for Warringah's amendment and reserve my right to speak.

7:09 pm

Zoe Daniel (Goldstein, Independent)

This legislation, the Online Safety Amendment (Social Media Minimum Age) Bill 2024, is a bandaid to fix a wicked problem. It's neither the systemic nor the structural reform the Australian public needs.

I can well understand why parents like it, and I especially acknowledge those parents who've lost children, whose loss is attributed to social media harm—including those who testified, bravely and powerfully, to the Joint Select Committee on Social Media and Australian Society, which I was a member of. I also acknowledge Robb Evans, father of Liv, who I've spoken of before in this chamber. Liv took her own life due to an eating disorder, and social media was a contributor.

I convened a working group on eating disorders and social media, which, in many ways, has been the genesis of my policy work in this space and I think has also captured the government's attention in a positive way. I absolutely acknowledge the connection between various harms and social media. So I don't deny that something needs to be done.

The platforms have powerful underlying systems that actively and knowingly influence kids' social lives, play kingmaker in online public debates and run a data free-for-all in the name of advertising. And it's self-evident that these platforms are motivated by profit, not community benefit or safety.

However, my issue is that this legislation will make zero difference to the harms that are inherent in social media. It will not force the platforms to manage their algorithms, it will not force the platforms to identify and manage risk, and nor will it force transparency.

So what to do? Social media gives us many great rewards: connection; information; business opportunities. It opens up the world. For young people, it's a social space—the 2020s equivalent of sitting on the landline phone on the hallway floor for hours, as I did in the 1980s. It's a source of mental health support for many, and University of Canberra research shows that it's where the vast majority of young people get their news and current affairs.

But the sheer scale of the risk it concurrently poses to individuals and our society demands a regulatory response that is proportionate in urgency and in ambition. This is particularly the case given the power of the social media proprietors, who've operated and grown their platforms largely uncontrolled until now. Additionally, in Australia they operate under a toothless self-regulation model, and they place the bulk of their profits and management offshore, out of reach of Australian tax and legal systems.

As a member of the Joint Select Committee on Social Media and Australian Society, I've been thinking about all of this a lot of late. The committee had a wide remit but took substantial evidence regarding age assurance. Certainly there were opinions that it was a good idea, but there was no substantive, evidence-based expert testimony that established whether, or how, it would actually work.

One of the recommendations the committee did agree to in its final report was that the government should report the results of its age-assurance trial to the parliament. The trial is only just underway, and it won't be complete for at least six to 12 months. Meanwhile, we're debating—and presumably the parliament will be passing—this legislation blindly, because the idea is somewhat popular. Worse, it potentially allows big tech to tick a box, without actually doing much at all. And what that even is, is unknown.

The bill, which was introduced just last week and sent to committee for one day, despite more than 15,000 submissions, contains no provisions on how the platforms are to keep the kids out. They're expected to take 'reasonable steps'. What those are will be worked out by the platforms, with regulatory input from the minister later. It's backwards.

In many ways, this is policy that is also looking back to a bygone era, when we all played street cricket until dusk, instead of looking at our phones, and found out where our friends were by riding around the neighbourhood until we found the house where all the Malvern Stars were lying on the front lawn. I get it. We all want that life back for our kids.

But our societies have become digitised over the last 2½ decades. The entrenchment of digital communications platforms and technologies has been gradual, but it is enduring. We cannot look back to the technological challenges and solutions of last century and apply them to our contemporary context, as the Online Safety Act does and age-gating social media would. This mindset cannot continue to guide contemporary Australian policymaking. It is time to think forward, not back.

In my engagement with the Goldstein community on this legislation, the feedback has been varied. I've heard from some parents and, as a parent, I understand the uncertainty and feeling of lack of control over what our children can be exposed to online. These parents observe their children taking to the intentionally addictive qualities of social media and digital platforms, and I deeply understand the desire to solve this problem. I, like these Goldstein parents, wish that this legislation were the silver bullet needed to comprehensively protect our kids from online harm. If it were, I would be its strongest advocate. But, unfortunately, this legislation is nowhere close to what's needed to meaningfully protect our kids in their online experiences. The true objective of the legislation is not to make social media safe by design but to make parents and voters feel like the government is doing something about it.

There's a reason why the government parades this legislation as world leading. It's because no other country wants to do it, and for good reason. Modern policy problems require modern solutions, and, obviously, solutions that will actually work. Of all the many pitfalls in this legislation, at the core of the arguments put forward by those who oppose it is this: age-gating is, quite simply, not the appropriate mechanism to use in this context. In fact, there is little evidence at all to suggest that an age-based regime of this nature could be effective. Far from protecting our children from social media, it may instead expose them, and possibly Australia's adult population, to additional risk. The bill does not specify how digital platforms will be expected to verify an individual's age, leaving open the possibility that Australians of various ages may be forced to hand over sensitive personal information to keep their accounts.

I note the opposition amendments, apparently negotiated with the government, on limiting the requirements to hand over things like passport details, and this is a good step. But according to Digital Rights Watch, the information required might still range from birthdates all the way to face prints. I note that the legislation mandates the destruction of such information, but as so many Australians deeply understand, our nation doesn't have a strong track record of late in protecting the privacy of its citizens. I note the member for Mayo's consideration-in-detail amendment in relation to the provision of personal information, which I will support.

It would be remiss of me not to point out the stunning irony of the coalition's sudden reluctance to support this legislation following a tweet by one of big tech's oligarchs, Elon Musk. Misinformation spreading online sought to make the curious link between this legislation and the government's Digital ID Act as part of a conspiracy to shut down free speech on the internet. This bill—one that ostensibly claims to challenge the awesome and concentrated power of big tech—was close to the chopping block via a simple tweet sent from those who stand to be regulated by it. Go figure. Perhaps, though, this is more than simple irony. Perhaps this is testament to the disproportionate political and cultural power that big tech have been allowed to accumulate and the set of circumstances which have led us to this point—the point of debating legislation which even the Prime Minister himself describes as potentially unenforceable.

The power of the platforms and the importance of making them accountable leads me to what we could be doing—should be doing—instead: something that is actually meaningful and will stand the test of time. The government have lots of policy options to choose from here. One only needs to look overseas to see online safety regulatory regimes that are working. But—on brand for this government—once again, their chosen policy was just the lowest-hanging fruit: the path that is simple and political, not the one that will actually make social media safe for our children by design.

Yesterday I tabled, as a private member's bill, a five-pillar regime which amounts to a digital duty of care. If the government wants to make social media safe not just for children but for everybody, then let's take that to a vote right now. This model of legislation has broad support across the parliament and in Australian civil society. It is supported by organisations ranging from the Foundation for Social Health and the Foundation for Alcohol Research and Education to the Human Rights Law Centre and Reset Tech Australia, amongst others.

A fully implemented digital duty of care is what will make social media safe for all Australian kids and adults. Rigorous age verification technology may one day be a supplementary component of such a duty of care, but the age-gating model as proposed in this legislation alone will be starkly inadequate. This is why I move the second reading amendment circulated in my name:

That all words after "That" be omitted with a view to substituting the following words:

"whilst not declining to give the bill a second reading, the House:

(1) notes that:

(a) Australia was once a world leader in online safety regulation when the previous Coalition Government enacted the Online Safety Act 2021;

(b) Australia's existing content-based model of online safety regulation was inspired by what was once effective during the era of broadcast television, newspapers and radio last century, and that technology has fundamentally changed this information landscape;

(c) Australia has lost its status as a world-leader in online safety regulation following the enactment of ambitious 'systems'-based laws in the European Union and United Kingdom respectively; and

(d) whilst age assurance as a tool for regulators may have some capability to contribute to safer Australian online spaces, it does not change how algorithms operate in any fundamental way;

(2) recognises the promising recent public statements of the Prime Minister and the Minister for Communications expressing the Government's intent to implement a 'Digital Duty of Care' if elected to a second term of Government;

(3) notes that a Duty of Care:

(a) alone is insufficient to meaningfully hold digital platforms and their algorithms to account and make social media safe by design; and

(b) is just one of five necessary elements of equal importance if Australia is to meaningfully make digital spaces safe for Australians of all ages;

(4) calls on the Government to re-claim the mantle of our nation as a world-leader in online safety regulation by enacting a comprehensive 'systems-based' legislative framework, this being:

(a) a Duty of Care;

(b) risk assessments;

(c) Risk Mitigation Plans;

(d) genuine transparency measures for the Australian public and our research community; and

(e) enforcement measures proportionate to the risk algorithms pose to Australian society".

Australia was once a world leader in online safety regulation, and we can achieve the government's aim to reclaim that mantle—just not with this bill. Only with safety by design, a duty of care, risk management and mitigation, and a solid incentive to comply will we make digital platforms safe for our kids, ourselves and our communities.

Online safety regulation cannot truly be safe unless it's systemic. It is the systems that must be made to change, not the people. We need to reshape our vision of what online safety looks like and follow the models that are achieving meaningful behavioural change, namely the Digital Services Act and the Digital Markets Act in Europe, which impose a duty of care on the companies to do no harm. We won't achieve that by passing a bill that just makes us all feel better.

Ian Goodenough (Moore, Liberal Party)

Is the motion seconded?

Kate Chaney (Curtin, Independent)

I second the motion and reserve my right to speak.

Ian Goodenough (Moore, Liberal Party)

The original question was that this bill be now read a second time. To this the honourable member for North Sydney moved as an amendment that all words after 'That' be omitted with a view to substituting other words. The honourable member for Mackellar has moved, as an amendment to that amendment, that all words after 'House' be omitted with a view to substituting other words. The honourable member for Goldstein has moved as an amendment to that amendment that all words after 'House' be omitted with a view to substituting other words. The question now is that the amendment moved by the honourable member for Goldstein be agreed to.

7:22 pm

Terry Young (Longman, Liberal National Party)

I rise to speak on the Online Safety Amendment (Social Media Minimum Age) Bill 2024 before us today. Let me say from the outset I fully support the intent of this legislation, which is to protect our young people from the harms of online bullying—a scourge not only in our society, but globally. I'm sick and tired of hearing of another teenager who has taken their life due to online bullying. I'm tired of watching grieving parents plead for action on this serious issue. It is a common belief amongst those of us who have the privilege of being elected to serve that our first duty is the protection of all Australians.

My issues with this legislation are, as I said, not around intent, but I do have a problem with the mechanics and the rushed manner in which it is being rammed through. I suspect that it's for vote-grabbing reasons. Why else would you rush legislation through that doesn't take effect, if passed, until 2026? Could it be that there's an election in the first half of next year?

The Leader of the Opposition first announced this as coalition policy over 12 months ago, saying we would implement it within 100 days after the election should we have the privilege of winning government. This government have latched onto the Leader of the Opposition's initiative but, as usual, they have gone for the headline without thinking through the process. In contrast, the coalition have been methodically working through the many inevitable questions that arise from proposed legislation like this so that, should we win the election, we'll have a sensible, well-thought-out methodology that addresses all aspects of the bill: keeping children safe, clarifying the roles of parents, social media companies and device hardware suppliers in the initiative, and protecting Australian citizens' privacy.

I was pleased to see that the government agreed to the coalition's amendment so that social media companies cannot force people to upload their digital ID, driver's licence, passport or other government-issued identification to verify their age. This is definitely a step in the right direction. However, I wasn't pleased to discover that some of the proposed methodology to age-verify users relies on social media companies effectively using the data they accumulate on us through algorithms and the like, such as determining that someone is under 16 from their posts about something as simple as a 13th birthday party. The retort to this, of course, is that they already do it. By the way, I hate the fact that they collect all this data on us. But knowing about, condoning and even being complicit in encouraging their use of AI to monitor and collect data on us is something I simply cannot in good conscience support.

There is also talk of using facial recognition software for age identification. I know about this technology, but the thought of these companies capturing images with goodness knows what in the background doesn't inspire me with great confidence either. The other issue is that these social media companies can, and I suggest will, simply change their algorithms so they begin to miss people's ages because, make no mistake, they will lose revenue and profits if this legislation passes, as they will be left with a smaller market and customer base.

We haven't even spoken about parents' roles in this. Why can't we legislate that manufacturers ensure the devices of anyone under 16 are tethered to their parents' device, through Apple ID, Family Sharing, iCloud or other means, so that parents have the final say on what apps their kids can use? This would also remove the financial incentive for social media platforms, who, from what I can see, don't currently make the hardware that people use.

No solution is a silver bullet, and people always find ways around laws when they really want to. But that's no reason governments should not pass laws intended to protect people. Otherwise, we might as well have no age restrictions on alcohol, tobacco, gambling or movies. As we know, most Australians are law-abiding citizens and will comply with the laws of the day.

Having a legislated age restriction also gives parents the added ammunition to say to their kids when they inevitably ask, nag or beg to have a social media account: 'Sorry, it's against the law.' Sure, there are those who would say, 'That is weak parenting,' but not all families with children are the same for myriad reasons, and governments must try to consider all family dynamics and personalities when they are legislating, knowing full well that we will never be able to appease everyone or receive everyone's approval. But we must attempt to address the concerns and needs of as much of the population as possible.

As a father of four and a grandfather of five—I've got some boys, some girls, some under 16 and some over 16—I've seen firsthand the trauma and harm that social media does to our youth. My personal observation has been that, generally, once people reach the age of 16, they have the emotional stability and maturity to make better decisions about what they put on social media and what they read on social media. There are exceptions both ways, of course, but this age seems pretty right for most.

As I said at the beginning of my contribution, I absolutely support the intent of this bill in protecting and in some cases saving the lives of our young people, but I have reservations about the rushed manner in which it is being handled and the lack of detail around the methodology proposed for enforcing the legislation should it become law. I believe that the lives and mental health of our youth outweigh any possible implications around privacy, but I would much rather see this legislation postponed until a full and proper process has been completed and we can actually see the detail of how the ID process will be enforced. It is far too important to be rammed through willy-nilly to try and win a few votes.

7:28 pm

Keith Pitt (Hinkler, National Party)

The Online Safety Amendment (Social Media Minimum Age) Bill 2024 is a piece of legislation on which I find myself torn in taking a position. For those who know me, that is pretty unusual! I can usually form a view pretty quickly and make a decision. I want to give you an example of some of the correspondence, calls and emails that my office has been getting, and this is only a very small selection.

'Don't ban kids from the ocean; teach them to swim.' 'What is the rush?' 'This is a backdoor to the Digital ID Bill.' 'Will it stop the bullying?' Several callers have said they will close all digital accounts if they're asked to prove their identity. 'Parental rights should not be taken away. This is a decision for the parent, not the government.' 'It will force everyone onto dodgy apps that don't have any moderation.' 'Making people create a digital footprint increases the information available to hackers.' 'Many workplaces, small businesses, sporting groups and community groups have social media pages to engage with their members, and these will be reduced if these bans come in.' 'What are the consequences for parents who allow their children on social media?' 'There is not enough information about this bill or how this will work.' 'Will this put DV victims at risk if they are forced to have real-name accounts?'

This is a small sample of what is coming to my office and, I suspect, to many others, because people are concerned, and I absolutely recognise their concerns. One of our roles as members of parliament is to put those concerns on the record in regard to legislation like this.