House debates

Monday, 25 November 2024

Bills

Online Safety Amendment (Digital Duty of Care) Bill 2024; Second Reading

10:16 am

Zoe Daniel (Goldstein, Independent)

I move:

That this bill be now read a second time.

Research from the Foundation for Social Health shows that the emergence of social media correlates with young Australians now being the loneliest cohort in our society.

Previously, it was the over-60s who topped the statistics on social isolation, but now it's apparently our most connected generation that feels most disconnected.

Ninety-one per cent of participants in the millennial and gen Z cohorts reported that social media has taken away from their real-life interactions with friends, family and community in some way.

Human society has long had a dual relationship with technology. New inventions—particularly those with the potential for transformational change—have carried an inherent but invisible framework of risk and reward.

And since the emergence of social media, we have failed to manage that risk. Now, we're faced with the challenge of retrofitting rules onto enormously powerful societal influencers—the companies that run these platforms.

And I would argue that we should do that not with bandaid measures but with an approach that manages the whole environment where the harm is taking place—and puts the responsibility on the platforms not the users.

Over the last decade or two, as social media and big tech have entrenched their practices and business models, governments, including the former coalition government, have built content-based take-down regulatory models of the kind originally designed for broadcast media.

It makes sense: if you broadcast something that doesn't meet community standards, you remove it and you don't do it again.

But of course those legacy media environments didn't have algorithms spreading content like wildfire, the way information moves on social media now.

In lieu of a better idea, it's understandable that governments returned to what used to work.

And when the former government enacted the Online Safety Act, it was world leading in its ability to combat the spread of harmful content. There was, to put it bluntly, no better idea.

But now there is.

The Online Safety Amendment (Digital Duty of Care) Bill, like the European Digital Services Act and the UK's Online Safety Act, is modelled on the groundbreaking work of Professor Lorna Woods and a four-year project by the Carnegie UK Trust.

Core to this model, now implemented in the EU and the UK, are five key interoperating elements.

And the core aim is to make the platforms make their spaces safe for us, for our kids, for our communities.

First, the bill implements a single, overarching statutory duty of care owed by digital platforms for the wellbeing of their Australian users.

A duty of care is not only just but essential. It is appropriately broad in scope to ensure that all of the systems, processes and elements of a digital service are captured, including 'dark patterns' and addictive design features.

Now I've been talking about this for some time, and I acknowledge that the government is gradually signalling its intent to move in this direction eventually.

But a duty of care isn't enough on its own. It must be given teeth.

Second and third, therefore, the bill frames a complementary scheme of risk assessments and risk mitigation.

Platforms will be regularly required to assess the risks that their algorithms and other digital systems pose in areas such as mental health, the rights of the child, gender-based violence and electoral processes.

Platforms will also be required to prepare an accompanying mitigation plan which demonstrates to the government and the people exactly how they will reprogram or change their algorithms and systems to prevent these risks in the first instance.

This is how we make social media safe by design.

Without pressuring digital platforms to make all of their algorithms and systems progressively safer over time, governments around the world will continue to resort to handbrake measures, such as age-based prohibition.

Age gating will not make platforms safer. It will maintain the situation where safety is the responsibility of the user—including parents and children—and indeed it may have the unintended consequence of further socially isolating some young people.

Far better, surely, to make the space safe than to attempt to lock young people out without even knowing if or how that can work.

Fourth, and critically, the bill imposes mandatory transparency measures to ensure the government and public understand and can monitor how algorithmic systems are functioning.

This involves mandatory data access for public-interest research, and regular transparency reporting on key metrics, such as usage statistics and quantitative data on exposure of children to harmful content.

Fifth, this regime must be backed by enforceability. Under this bill, digital platforms will be liable for civil penalties of up to 10 per cent of annual turnover for noncompliance, and their senior management will face penalties consistent with those for securities fraud.

And the bill will require platforms to enable people to reset and switch off the algorithm if they so choose and require the strongest privacy settings by default.

I acknowledge the work of my expert and lived experience Eating Disorders Working Group, which in many ways has been the starting point for my work in this space.

There is no silver bullet for online safety. But this five-element regulatory framework is demonstrably effective and has become international best practice.

I commend this bill to the House—because Australia can once again become a world leader in online safety. In the words of Professor Lorna Woods, 'We need a response that is preventative, not palliative.'

Humans designed these systems. And so, where we must, we can redesign them to serve our collective needs, rather than those of Mr Zuckerberg and Mr Musk.

The groundswell has already begun.

The Foundation for Alcohol Research and Education, the Foundation for Social Health, the Human Rights Law Centre, Reset Tech Australia, AWO Agency, Queensland University of Technology, the Butterfly Foundation and the Black Dog Institute have all joined me in Canberra in recent weeks backing the changes that would occur under this bill.

I urge the government to allow debate and adopt this legislation, and I cede the rest of my time to the member for Kooyong.

Milton Dick (Speaker)

Is the motion seconded?

10:19 am

Monique Ryan (Kooyong, Independent)

I second the motion. Social media has transformed how humans spend their time and how we communicate. The online world is, for many young people, a preferred place to access information, to build social and technical skills, to connect with our families and friends, to learn about the world, to relax and to play. These opportunities are really important in the transition to adulthood, but social media has well documented and significant risks for children and young people. These risks do need to be addressed by government.

In recent months both major political parties in Australia have rushed to ban social media access for young people. Their proposals are broad in scope but short on detail. They abdicate all responsibility to digital providers. They are not evidence based, they risk being utterly ineffectual and they risk potentially significant unintended consequences. The UN Committee on the Rights of the Child states:

National policies should be aimed at providing children with the opportunity to benefit from engaging with the digital environment and ensuring their safe access to it.

Whether bans are practically possible remains debatable. There are significant concerns regarding privacy, data and consent. Bans could well create more risk for children who still use platforms, because they will remove the incentives to ensure robust child safety features for those younger users who evade age assurance measures. Bans will not improve those products which children will still be allowed to use.

What is the problem here? The fact that we in this country allow billion-dollar companies to market unsafe digital products, or the fact that some of the people who use those products are teenagers? Experts have suggested that we should look at more targeted interventions rather than rushing to poorly considered, technically challenging blanket bans on social media access. Systemic regulation can drive up safety and privacy standards on all platforms for all children, and this approach has been supported by expert groups in mental health, digital literacy and child psychology.

For that reason, I am happy to support the member for Goldstein's proposal to amend the Online Safety Act to impose an overarching duty of care on large providers. This will mandate risk assessments and risk mitigation plans, mandatory transparency and reporting, and stringent enforcement mechanisms. It will protect our children. Something worth doing is worth doing well, in an evidence-based way. If the government accepts this bill today, we will start protecting young children immediately. I commend it to the House.

Milton Dick (Speaker)

The time allotted for this debate has expired. The debate is adjourned, and the resumption of the debate will be made an order of the day for the next sitting.