House debates

Tuesday, 26 November 2024

Bills

Online Safety Amendment (Social Media Minimum Age) Bill 2024; Second Reading

7:09 pm

Zoe Daniel (Goldstein, Independent)

This legislation, the Online Safety Amendment (Social Media Minimum Age) Bill 2024, is a band-aid on a wicked problem. It's neither the systemic nor the structural reform the Australian public needs.

I can well understand why parents like it, and I especially acknowledge those parents who've lost children, whose loss is attributed to social media harm—including those who testified, bravely and powerfully, to the Joint Select Committee on Social Media and Australian Society, which I was a member of. I also acknowledge Robb Evans, father of Liv, who I've spoken of before in this chamber. Liv took her own life due to an eating disorder, and social media was a contributor.

I convened a working group on eating disorders and social media, which, in many ways, has been the genesis of my policy work in this space and I think has also captured the government's attention in a positive way. I absolutely acknowledge the connection between various harms and social media. So I don't deny that something needs to be done.

The platforms have powerful underlying systems that actively and knowingly influence kids' social lives, play kingmaker in online public debates and run a data free-for-all in the name of advertising. And it's self-evident that these platforms are motivated by profit, not community benefit or safety.

However, my issue is that this legislation will make zero difference to the harms that are inherent in social media. It will not force the platforms to manage their algorithms, it will not force the platforms to identify and manage risk, and nor will it force transparency.

So what to do? Social media gives us many great rewards: connection; information; business opportunities. It opens up the world. For young people, it's a social space—the 2020s equivalent of sitting on the landline phone on the hallway floor for hours, as I did in the 1980s. It's a source of mental health support for many, and University of Canberra research shows that it's where the vast majority of young people get their news and current affairs.

But the sheer scale of the risk it concurrently poses to individuals and our society demands a regulatory response that is proportionate in urgency and in ambition. This is particularly the case given the power of the social media proprietors, who've operated and grown their platforms largely uncontrolled until now. Additionally, in Australia they operate under a toothless self-regulation model, and they place the bulk of their profits and management offshore, out of reach of Australian tax and legal systems.

As a member of the Joint Select Committee on Social Media and Australian Society, I've been thinking about all of this a lot of late. The committee had a wide remit but took substantial evidence regarding age assurance. Certainly there were opinions that it was a good idea, but there was no substantive, evidence based, expert testimony that stacked up as to if, or how, it would actually work.

One of the recommendations the committee did agree to in its final report was that the government should report the results of its age-assurance trial to the parliament. The trial is only just underway, and it won't be complete for at least six to 12 months. Meanwhile, we're debating—and presumably the parliament will be passing—this legislation blindly, because the idea is somewhat popular. Worse, it potentially allows big tech to tick a box, without actually doing much at all. And what that even is, is unknown.

The bill, which was introduced just last week and sent to committee for one day, despite more than 15,000 submissions, contains no provisions on how the platforms are to keep the kids out. They're expected to take 'reasonable steps'. What those are will be worked out by the platforms, with regulatory input from the minister later. It's backwards.

In many ways, this is policy that is also looking back to a bygone era, when we all played street cricket until dusk, instead of looking at our phones, and found out where our friends were by riding around the neighbourhood until we found the house where all the Malvern Stars were lying on the front lawn. I get it. We all want that life back for our kids.

But our societies have become digitised over the last 2½ decades. The entrenchment of digital communications platforms and technologies has been gradual, but it is enduring. We cannot look back to the technological challenges and solutions of last century and apply them to our contemporary context, as the Online Safety Act does and age-gating social media would. This mindset cannot continue to guide contemporary Australian policymaking. It is time to think forward, not back.

In my engagement with the Goldstein community on this legislation, the feedback has been varied. I've heard from some parents and, as a parent, I understand the uncertainty and feeling of lack of control over what our children can be exposed to online. These parents observe their children taking to the intentionally addictive qualities of social media and digital platforms, and I deeply understand the desire to solve this problem. I, like these Goldstein parents, wish that this legislation were the silver bullet needed to comprehensively protect our kids from online harm. If it were, I would be its strongest advocate. But, unfortunately, this legislation is nowhere close to what's needed to meaningfully protect our kids in their online experiences. The true objective of the legislation is not to make social media safe by design but to make parents and voters feel like the government is doing something about it.

There's a reason why the government parades this legislation as world leading. It's because no other country wants to do it, and for good reason. Modern policy problems require modern solutions, and, obviously, solutions that will actually work. Of all the many pitfalls in this legislation, at the core of arguments put forward by those who oppose it is this: age gating is, quite simply, not the appropriate mechanism to use in this context. In fact, there is little evidence at all to suggest that an age based regime of this nature could be effective. Far from protecting our children from social media, it may instead expose them, and possibly Australia's adult population, to additional risk. The bill does not specify how digital platforms will be expected to verify an individual's age, leaving open the possibility that Australians of various ages may be forced to hand over sensitive personal information to keep their accounts.

I note the opposition amendments, apparently negotiated with the government, limiting the requirements to hand over things like passport details, and this is a good step. But according to Digital Rights Watch, the information required might still range from birthdates all the way to faceprints. I note that the legislation mandates the destruction of such information, but, as so many Australians deeply understand, our nation doesn't have a strong track record of late in protecting the privacy of its citizens. I note the member for Mayo's consideration-in-detail amendment in relation to the provision of personal information, which I will support.

It would be remiss of me not to point out the stunning irony of the coalition's sudden reluctance to support this legislation following a tweet by one of big tech's oligarchs, Elon Musk. Misinformation spreading online sought to make the curious link between this legislation and the government's Digital ID Act as part of a conspiracy to shut down free speech on the internet. This bill—one that ostensibly claims to challenge the awesome and concentrated power of big tech—was close to the chopping block via a simple tweet sent from those who stand to be regulated by it. Go figure. Perhaps, though, this is more than simple irony. Perhaps this is testament to the disproportionate political and cultural power that big tech have been allowed to accumulate and the set of circumstances which have led us to this point—the point of debating legislation which even the Prime Minister himself describes as potentially unenforceable.

The power of the platforms and the importance of making them accountable leads me to what we could be doing—should be doing—instead: something that is actually meaningful and will stand the test of time. The government have lots of policy options to choose from here. One only needs to look overseas to see online safety regulatory regimes that are working. But—on brand for this government—once again, their chosen policy was just the lowest-hanging fruit: the path that is simple and political, not the one that will actually make social media safe for our children by design.

Yesterday I tabled, as a private member's bill, a five-pillar regime which amounts to a digital duty of care. If the government wants to make social media safe not just for children but for everybody, then let's take that to a vote right now. This model of legislation has broad support across the parliament and in Australian civil society. It is supported by organisations ranging from the Foundation for Social Health and the Foundation for Alcohol Research and Education to the Human Rights Law Centre and Reset Tech Australia, amongst others.

A fully implemented digital duty of care is what will make social media safe for all Australian kids and adults. Rigorous age verification technology may one day be a supplementary component of such a duty of care, but the age-gating model as proposed in this legislation alone will be starkly inadequate. This is why I move the second reading amendment circulated in my name:

That all words after "That" be omitted with a view to substituting the following words:

"whilst not declining to give the bill a second reading, the House:

(1) notes that:

(a) Australia was once a world leader in online safety regulation when the previous Coalition Government enacted the Online Safety Act 2021;

(b) Australia's existing content-based model of online safety regulation was inspired by what was once effective during the era of broadcast television, newspapers and radio last century, and that technology has fundamentally changed this information landscape;

(c) Australia has lost its status as a world-leader in online safety regulation following the enactment of ambitious 'systems'-based laws in the European Union and United Kingdom respectively; and

(d) whilst age assurance as a tool for regulators may have some capability to contribute to safer Australian online spaces, it does not change how algorithms operate in any fundamental way;

(2) recognises the promising recent public statements of the Prime Minister and the Minister for Communications expressing the Government's intent to implement a 'Digital Duty of Care' if elected to a second term of Government;

(3) notes that a Duty of Care:

(a) alone is insufficient to meaningfully hold digital platforms and their algorithms to account and make social media safe by design; and

(b) is just one of five necessary elements of equal importance if Australia is to meaningfully make digital spaces safe for Australians of all ages;

(4) calls on the Government to reclaim the mantle of our nation as a world leader in online safety regulation by enacting a comprehensive 'systems-based' legislative framework, this being:

(a) a Duty of Care;

(b) risk assessments;

(c) Risk Mitigation Plans;

(d) genuine transparency measures for the Australian public and our research community; and

(e) enforcement measures proportionate to the risk algorithms pose to Australian society".

Australia was once a world leader in online safety regulation, and we can achieve the government's aim to reclaim that mantle—just not with this bill. Only with safety by design, a duty of care, risk management and mitigation, and a solid incentive to comply will we make digital platforms safe for our kids, ourselves and our communities.

Online safety regulation cannot truly be effective unless it's systemic. It is the systems that must be made to change, not the people. We need to reshape our vision of what online safety looks like and follow the models that are achieving meaningful behavioural change, namely the Digital Services Act and Digital Markets Act in Europe, which impose a duty on the companies to do no harm. We won't achieve that by passing a bill that just makes us all feel better.
