House debates
Monday, 4 November 2024
Committees
Social Media and Australian Society Joint Select Committee; Report
12:01 pm
Sharon Claydon (Newcastle, Australian Labor Party)
On behalf of the Joint Select Committee on Social Media and Australian Society I present the committee's report incorporating dissenting reports, entitled Second interim report: digital platforms and the traditional news media.
Report made a parliamentary paper in accordance with standing order 39(e).
by leave—This report follows the committee's first interim report, which was tabled in the House on 15 August. The Joint Select Committee on Social Media and Australian Society was appointed to inquire into and report on the influence and impacts of social media on Australian society, the decision of Meta to abandon deals under the News Media Bargaining Code and the important role of Australian journalism, news and public interest media in countering mis- and disinformation on digital platforms. The second interim report focuses specifically on Meta's decision, what it means for access to news in Australia, and the impact of the rise in mis- and disinformation on our democracy and society.
To date the committee has received 219 submissions plus additional information and answers to many questions on notice. We've held 10 public hearings and heard from a wide range of witnesses, including representatives of Meta, Snap, Google and TikTok; large and small news media providers, including digital-only publishers; news media peak bodies; academics; organisations and individuals; and young people, who provided their unique insights and experiences as digital natives. We've heard how social media has become an increasingly important source of access to news and public interest journalism for many Australians. For some, it is their only source. We have heard almost universally that access to accurate and timely news media is essential in maintaining a healthy democracy and in combating the rise of mis- and disinformation online.
The News Media Bargaining Code was established by the former coalition government. It was designed to be a mandatory code of conduct to govern commercial relationships between Australian news businesses and digital platforms. The code was designed to make big social media companies such as Meta, which owns both Facebook and Instagram—two of the world's most popular social media platforms—pay local news companies for the news carried on their platforms. It was designed to help support the sustainability of public interest journalism in Australia, to provide Australians with access to quality online journalism and to address the significant bargaining power imbalance between digital platforms and Australian news businesses.
The committee heard that funding from these deals, while sometimes inequitable between small and large news organisations, nevertheless supported important news media jobs around Australia, including in regional and remote areas of Australia, where it is expensive to employ local journalists. For some media organisations, deals made through this code have been a lifeline. But in February 2024 Meta announced that it would not be renewing commercial deals with Australian news media companies, deals worth millions of dollars, claiming that people don't come to its platforms for news anymore. This is despite evidence to the committee that was quite contrary to Meta's claims. We heard from multiple sources, including, most notably, the University of Canberra's 2024 Digital news report: Australia, which indicated that 49 per cent of Australians reported using social media as a source of news, and a remarkable 25 per cent reported it as their only source of news. When you look at gen Z audiences as a discrete cohort, 60 per cent of respondents said that they relied on social media as their main source of news.
You can see there's a very different story between the lived experience of people getting their news from digital platforms and the story that big tech companies might want to tell the Australian people. Under the code, one option for taking action against Meta would be to designate it—that is, to subject it to obligations under the code—but some participants in the inquiry were very concerned that this would cause Meta to sidestep the code completely and refuse to carry news at all, as it has done in Canada. This illustrates a fundamental flaw and problem with the code. Many participants, particularly those representing small or social-media-only news organisations, expressed concern that designation and a decision to carry no news would see their businesses decimated, leaving a huge void for both mis- and disinformation to flourish. Despite this, the committee found that the News Media Bargaining Code has still been effective in encouraging financial arrangements, so this report does not advocate for the code to be abolished, at least not for now.
However, evidence clearly showed that further mechanisms are needed to secure the future of public interest journalism as social media companies evolve, and the government is already acting on this. The Albanese Labor government has recently announced a $15 million News Media Relief Program, which will provide grants to eligible regional, independent, suburban, multicultural and First Nations news providers creating news content. We know local and community news outlets play a vital role in keeping Australians up to date and informed. This News Media Relief Program is needed to support public interest journalism and safeguard media diversity in Australia.
I want to go now to some of the recommendations in the report. The second interim report makes 11 considered recommendations aimed at addressing concerns raised during the inquiry about the News Media Bargaining Code and the sustainability of public interest journalism and digital media. These include exploring alternative revenue mechanisms to supplement the code. There was a recommendation around establishing a short-term transition fund to help news media businesses diversify and strengthen alternative income streams and news product offerings. There was also a recommendation around developing appropriate mechanisms and protocols to guide the fair and transparent distribution of revenue arising from any of those new revenue mechanisms.
There was another around establishing a digital media competency fund to enhance digital media literacy, not just of young Australians—although that was well noted throughout the report—but of many other vulnerable, at-risk groups in our community who are particularly susceptible to mis- and disinformation. Increasing and enhancing the digital media competency of all Australians is an ambition of this committee. There was also agreement on the need to support legislation to combat mis- and disinformation, with a legislative response to what we see as one of the greatest sources of social harm and one of the greatest threats to our democracy.
Another recommendation was about improving transparency around digital platform systems and processes. We heard from many researchers about the need to be able to look under the hood, as they put it, to see what is really happening and to have access to that data. Being able to conduct independent, critical analysis of that work is not only important to the role of researchers in Australia but also important to informing government, all of us in this place and, indeed, our communities, so that we are making informed, evidence-based decisions about ways in which we might improve our regulatory frameworks and ways in which, let's face it, big tech companies might start improving their engagement with our community too. There was a recommendation to examine options to respond to the use of algorithms and recommender systems by digital platforms with significant power to deprecate news. We've asked the government to look at what those options might be.
They're some of the main features of the recommendations from the committee, and, while the committee worked harmoniously throughout this inquiry, I am disappointed that the coalition has chosen to table a dissenting report. The News Media Bargaining Code was an initiative of the former government—credit where credit's due. It was an important piece of legislation, and it had the support of the whole parliament when it was carried. I need to be clear that this committee is not proposing to abolish that code, but the coalition's refusal to acknowledge the need for improvement, alongside the need for tighter government regulation to combat seriously harmful mis- and disinformation online, is contrary to the evidence the committee has received. We know that big tech is constantly evolving and that this means governments must also evolve and adapt. That's why this report recommends some changes to our regulatory frameworks, and the News Media Bargaining Code is just one of the measures at our disposal to hold big tech companies to account.
It's disappointing that the coalition seems wedded to a set-and-forget approach to the code. That is never going to work, and I really hope that we're able to arrive at some sensible position in our next level of reporting to the parliament on our work. It's regrettable that the committee therefore wasn't able to arrive at a consensus in this report, as was the case when the News Media Bargaining Code came through the parliament and we were able to join together to support it. That code was passed by the parliament back in 2021, as I said, with the support of the now government.
Australians are concerned about the impact that social media is having across many areas of the community, including access to accurate and reliable news content. This report highlights the urgent need for robust regulatory frameworks to protect Australians from the harmful effects of social media. We, of course, want to make sure that all Australians are safe online, with a particular focus on ensuring that children are safe. Australians have the right to access high-quality news content on social media, which is what the 11 recommendations of this report reflect. Throughout the remainder of the inquiry, the committee will maintain its strong view that we need to hold big tech to account, that we need to have a solid legislative response to combat mis- and disinformation and that we need to better protect Australians and especially young Australians from social media harm.
This inquiry's final report is due to be tabled on or before 18 November. We will consider issues relating to online safety, algorithms and recommender systems, as well as impacts on the mental health of users, the lack of accountability of social media platforms and age assurance. These are important issues that the committee is taking very seriously. Australians enjoy using social media, and we know it's not going anywhere, but we want to know that we are all safe online and that what we are reading and seeing in the news is accurate.
I want to pay special thanks to the secretariat, who continue to work hard towards tabling the final report in just a few weeks. Thanks to the committee secretary, Gerry McInally, and to Aysha Osborne, Michael Perks, Aisha Bottrill and Jamison Eddington in particular. I thank all members of the committee. I can see that I'm joined in the House by the member for Goldstein, who is a very active participant, and I thank her for her insights and camaraderie in the work of the committee. I want to give a shout-out to the deputy chair from the other house, Senator Sarah Hanson-Young, for her collaborative approach to working on some of the most significant problems facing the Australian people today. I look forward to the committee working constructively together in this final stage of our inquiry and reporting back to this House in the very near future.
12:17 pm
Zoe Daniel (Goldstein, Independent)
I ask leave of the House to make a short statement in connection with the report.
Leave granted.
In addition to the chair's comments, the Joint Select Committee on Social Media and Australian Society's second interim report demonstrates how Australia's approach to regulating digital platforms is outdated and begins to show where it's headed. 'Online safety' as a term is often used to describe the scope of the policy space this committee was charged with examining, but many don't have an appreciation of how broad the risk of harm online can be. Some of these risks are immediately obvious; many are less perceptible.
Online safety encompasses mental health, democratic division, online scams, gendered violence and data privacy. But there are adjacent harms that receive less media and popular attention: loneliness and social atomisation, internet addiction and choosing to spend increasing amounts of time at home 'doomscrolling' instead of socialising and making friendships in the 'real world'. Social norms and behaviour are being transformed at an unprecedented pace because of the sheer speed at which information travels online. The breadth of these risks warrants a governmental response that is correspondingly broad.
Recommendation 1 of the committee's second interim report calls for the government to establish a digital affairs ministry, which would have overarching responsibility for the challenges and risks relating to digital platforms. Currently, each of the risks I mentioned earlier is segmented across multiple ministerial portfolios. We need coordination across these portfolios—a ministry empowered to make decisions that can encompass the full breadth of risks represented by online safety.
Recommendations 2 and 3 of the interim report propose alternative revenue mechanisms, often referred to as a 'digital platform levy' or a 'big tech tax'. A levy of this kind could be used to allocate revenue to support the long-term sustainability of Australian public interest journalism. Considering the issues related to the News Media Bargaining Code, at which this report is directed, this is worthwhile. I also suggest that some of this funding go towards the establishment of an independent body to oversee how big tech systems and processes are operating in Australia. Such a body could be used to accredit researchers and to mediate between them and the digital platforms.
There is precedent for these ideas. Models of how this could work were floated during the design of the European Union's Digital Services Act, which in many ways is the trailblazer in this space. It's critical to note, though, that a tax alone is not enough to achieve meaningful accountability of the digital platforms. Such a mechanism would raise revenue, but it would not compel big tech to change the way algorithms, systems and processes function in any way, which is why recommendation 9 is so important. Here, the committee calls for transparency requirements like those in the EU Digital Services Act. The DSA, enacted in 2022, represents a step change in best practice digital platform regulation and eclipses the scope of our content-focused Online Safety Act.
In Europe, under article 40, digital platforms must open their algorithms to accredited public interest researchers to conduct research, and just last week Europe announced how article 40 will work under regulation. Australia's lack of legal protections has resulted in a researcher brain drain to Europe. Experts are deterred from conducting research into algorithms here by the threat of legal liability. Just look to the United States, for example: Elon Musk has described bodies like the Centre for Countering Digital Hate as criminal organisations and has updated X's terms of service to reflect that sentiment. Transparency measures such as these do not have to be partisan. Indeed, the coalition's dissenting response to the second interim report demonstrates the multipartisan nature of support for algorithmic transparency.
I draw the House's attention to recommendation 11, which, in a way, lies at the heart of the Australian government's approach to regulation of digital platforms to date. Australia's model of industry coregulation has been described as the 'best and quickest form of regulation'. Quick it may be, but that is meaningless if the regulation it produces has no teeth. Boiled down, coregulation amounts to 'you must show up to the negotiating table', but what you negotiate is your business. I encourage the government to adopt this recommendation and review whether this regulatory approach will serve us in the long term.
If the Bondi stabbings and the Wakeley church attack have taught us one thing, it's that Australia's existing regime is not working. It's time to up our ambition and align our Online Safety Act with international best practice. I would add that discourse around age prohibition is not a meaningful or long-term solution. It's time that we reined in the algorithms and stood up to big tech, and the second interim report's recommendations are a start.
Finally, I agree with the comments of the chair, the member for Newcastle, that it's unfortunate the coalition elected to table a dissenting report in this instance. It would be far better if we could agree in a multipartisan fashion on what is a very important area of public policy in this country. It is very much front of mind for a lot of Australians. We have a couple of weeks before we deliver the final report, and I would hope that we can negotiate a unanimous position on that.
Debate adjourned.