House debates
Monday, 25 November 2024
Bills
Online Safety Amendment (Digital Duty of Care) Bill 2024; Second Reading
10:16 am
Zoe Daniel (Goldstein, Independent)
I move:
That this bill be now read a second time.
Research from the Foundation for Social Health shows that the emergence of social media correlates with young Australians now being the loneliest cohort in our society.
Previously, it was the over 60s who topped the stats on social isolation, but now it's apparently our most connected generation that feels most disconnected.
Ninety-one per cent of participants in the millennial and gen Z generational cohorts reported that social media has taken away from their real-life interaction with friends, family and community in some way.
Human society has long had a dual relationship with technology. New inventions—particularly those with the potential for transformational change—have carried an inherent but invisible framework of risk and reward.
And since the emergence of social media, we have failed to manage that risk. Now, we're faced with the challenge of retrofitting rules onto enormously powerful societal influencers—the companies that run these platforms.
And I would argue that we should do that not with bandaid measures but with an approach that manages the whole environment where the harm is taking place—and puts the responsibility on the platforms not the users.
Over the last decade or two, as social media and big tech have entrenched their practices and business models, governments—like the former coalition government—have designed 'content'-based take-down regulatory models like those previously designed for broadcast media.
It makes sense: if something is broadcast that doesn't meet community standards, remove it or don't do it again.
But of course those legacy media environments had no algorithms spreading content like wildfire, the way information moves on social media now.
In lieu of a better idea, it's understandable that governments returned to what used to work.
And when the former government enacted the Online Safety Act, it was world leading in its ability to combat the spread of harmful content. There was, to put it bluntly, no better idea.
But now there is.
The Online Safety Amendment (Digital Duty of Care) Bill, like the European Digital Services Act and the UK's Online Safety Act, is modelled on the groundbreaking work of Professor Lorna Woods and a four-year project from the Carnegie UK Trust.
Core to this model, now implemented in the EU and UK, are five interoperating key elements.
And the core aim—to make the platforms make their spaces safe for us, for our kids, for our communities.
First, the bill implements a single, overarching statutory duty of care, required of digital platforms, for the wellbeing of their Australian users.
A duty of care is not only just but essential. It is appropriately broad in scope to ensure that all of the systems, processes and elements of a digital service are captured, including 'dark patterns' and addictive design features.
Now I've been talking about this for some time, and I acknowledge that the government is gradually signalling its intent to move in this direction eventually.
But a duty of care isn't enough on its own. It must be given teeth.
Second and third therefore, the bill frames a complementary scheme of risk assessments and risk mitigation.
Platforms will be required to regularly assess the risks that their algorithms and other digital systems pose to, for example, mental health, the rights of the child, gender-based violence and electoral processes.
Platforms will also be required to prepare an accompanying mitigation plan which demonstrates to the government and the people exactly how they will reprogram or change their algorithms and systems to prevent these risks in the first instance.
This is how we make social media safe by design.
Without pressuring digital platforms to make all of their algorithms and systems progressively safer over time, governments around the world will continue to resort to handbrake measures, such as age based prohibition.
Age gating will not make platforms safer. It will maintain the situation where safety is the responsibility of the user—including parents and children—and indeed it may have the unintended consequence of further socially isolating some young people.
Far better, surely, to make the space safe than to attempt to lock young people out, without even knowing if or how that can work.
Fourth, and critically, the bill imposes mandatory transparency measures to ensure the government and public understand and can monitor how algorithmic systems are functioning.
This involves mandatory data access for public-interest research, and regular transparency reporting on key metrics, such as usage statistics and quantitative data on exposure of children to harmful content.
Fifth, this regime must be backed up with enforceability. Under this bill, digital platforms will be liable for civil penalties of up to 10 per cent of annual turnover for noncompliance, and their senior management will face penalties consistent with those for securities fraud.
And the bill will require platforms to enable people to reset and switch off the algorithm if they so choose and require the strongest privacy settings by default.
I acknowledge the work of my expert and lived experience Eating Disorders Working Group, which in many ways has been the starting point for my work in this space.
There is no silver bullet for online safety. But this five-element regulatory framework is demonstrably effective, and has become the international best practice.
I commend this bill to the House—because Australia can once again become a world leader in online safety. In the words of Professor Lorna Woods, 'We need a response that is preventative, not palliative.'
Humans designed these systems. And so, where we must, we can redesign them to serve our collective needs, rather than those of Mr Zuckerberg and Mr Musk.
That groundswell has already begun.
The Foundation for Alcohol Research and Education, the Foundation for Social Health, the Human Rights Law Centre, Reset Tech Australia, AWO Agency, Queensland University of Technology, the Butterfly Foundation and the Black Dog Institute have all joined me in Canberra in recent weeks backing the changes that would occur under this bill.
I urge the government to allow debate and adopt this legislation, and I cede the rest of my time to the member for Kooyong.