House debates

Monday, 4 November 2024

Committees

Social Media and Australian Society Joint Select Committee; Report

12:17 pm

Zoe Daniel (Goldstein, Independent)

I ask leave of the House to make a short statement in connection with the report.

Leave granted.

In addition to the chair's comments, the Joint Select Committee on Social Media and Australian Society's second interim report demonstrates how outdated Australia's approach to regulating digital platforms is and begins to show where it's headed. 'Online safety' is the term often used to describe the scope of the policy space this committee was charged with examining, but many people don't appreciate how broad the risk of harm online can be. Some of these risks are immediately obvious; many are less perceptible.

Online safety encompasses mental health, democratic division, online scams, gendered violence and data privacy. But there are adjacent harms that receive less media and popular attention: loneliness and social atomisation, internet addiction and choosing to spend increasing amounts of time at home 'doomscrolling' instead of socialising and making friendships in the 'real world'. Social norms and behaviour are being transformed at an unprecedented pace because of the sheer speed at which information travels online. The breadth of these risks warrants a correspondingly broad governmental response.

Recommendation 1 of the committee's second interim report calls for the government to establish a digital affairs ministry, which would have overarching responsibility for the challenges and risks relating to digital platforms. Currently, responsibility for the risks I mentioned earlier is segmented across multiple ministerial portfolios. We need coordination across these portfolios: a ministry empowered to make decisions that encompass the full breadth of risks represented by online safety.

Recommendations 2 and 3 of the interim report propose alternative revenue mechanisms, often referred to as a 'digital platform levy' or a 'big tech tax'. A levy of this kind could raise revenue to support the long-term sustainability of Australian public interest journalism. This is worthwhile, considering the issues related to the News Media Bargaining Code at which this report is directed. I also suggest that some of this funding go towards establishing an independent body to oversee how big tech systems and processes are operating in Australia. Such a body could accredit researchers and mediate between them and the digital platforms.

There is precedent for these ideas. Models of how this could work were floated during the design of the European Union's Digital Services Act, which in many ways is the trailblazer in this space. It's critical to note, though, that a tax alone is not enough to achieve meaningful accountability from the digital platforms. Such a mechanism would raise revenue, but it would not compel big tech to change the way its algorithms, systems and processes function, which is why recommendation 9 is so important. Here, the committee calls for transparency requirements like those in the EU Digital Services Act. The DSA, enacted in 2022, represents a step change in best-practice digital platform regulation and eclipses the scope of our content-focused Online Safety Act.

In Europe, under article 40 of the DSA, digital platforms must open their algorithms to accredited public interest researchers for research purposes, and just last week Europe announced how article 40 will operate under regulation. Australia's lack of comparable legal protections has resulted in a researcher brain drain to Europe: experts are deterred from conducting research into algorithms here by the threat of legal liability. Look to the United States, for example, where Elon Musk has described bodies like the Centre for Countering Digital Hate as criminal organisations and has updated X's terms of service to reflect that sentiment. Transparency measures such as these do not have to be partisan. Indeed, the coalition's dissenting response to the second interim report demonstrates the multipartisan nature of support for algorithmic transparency.

I draw the House's attention to recommendation 11, which, in a way, goes to the heart of the Australian government's approach to regulating digital platforms to date. Australia's model of industry coregulation has been described as the 'best and quickest form of regulation'. Quick it may be, but that is meaningless if the regulation it produces has no teeth. Boiled down, coregulation amounts to 'you must show up to the negotiating table', but what you negotiate is your own business. I encourage the government to adopt this recommendation and review whether this regulatory approach will serve us in the long term.

If the Bondi stabbings and the Wakeley church attack have taught us one thing, it's that Australia's existing regime is not working. It's time to lift our ambition and align our Online Safety Act with international best practice. I would add that discourse around age prohibition is not a meaningful or long-term solution. It's time we reined in the algorithms and stood up to big tech, and the second interim report's recommendations are a start.

Finally, I agree with the comments of the chair, the member for Newcastle, that it's unfortunate the coalition elected to table a dissenting report in this instance. It would be far better if we could agree in a multipartisan fashion on what is a very important area of public policy in this country. It is very much front of mind for a lot of Australians. We have a couple of weeks before we deliver the final report, and I would hope that we can negotiate a unanimous position on it.

Debate adjourned.
