House debates
Monday, 18 November 2024
Committees
Social Media and Australian Society Joint Select Committee; Report
6:13 pm
Sharon Claydon (Newcastle, Australian Labor Party)
On behalf of the Joint Select Committee on Social Media and Australian Society, I present the committee's report, incorporating a dissenting report, entitled Social media: the good, the bad and the ugly—final report.
Report made a parliamentary paper in accordance with standing order 39(e).
by leave—This report follows the committee's second interim report that was tabled in the House in October. That report examined how the decision of Meta to abandon deals under the News Media Bargaining Code could influence the provision and consumption of public interest journalism in Australia and give rise to mis- and disinformation. It also found that, while the code was established in good faith, its implementation has revealed significant shortcomings. Accordingly, the committee made 11 considered recommendations to improve the effectiveness of the code and the sustainability of public interest journalism and digital media in Australia.
This third and final report looks closely at the influences and impacts of social media on Australian society. The committee received 220 submissions throughout the inquiry, conducted 10 public hearings and received additional written responses to many questions on notice. We heard from experts, academics, bureaucrats, big tech companies, advocates, grieving families and young Australians who have grown up with digital technology all their lives. Social media users in Australia are some of the most active in the world, with approximately 81 per cent of all Australians reporting that they were regular users of social media in 2023.
While acknowledging that social media is a huge part of everyday life for most Australians, this final report examines social media in its entirety—the good, the bad and the ugly. The committee heard that social media can be addictive and disruptive to offline activities. It can impact on sleep, lead to mis- and disinformation being shared and consumed, contribute to poor mental health and, at its worst, expose vulnerable users to online predators, unhealthy expectations around body image, bullying and sextortion. In some tragic cases, it can lead people to extremely dangerous self-harming behaviours, such as eating disorders and suicidality. We heard that much more needs to be done to protect Australians from these harms, particularly by social media companies themselves, who make money by keeping their users engaged. Social media is never free, because users pay for it with their attention. As one young witness observed, if the product is free then you are the product. This statement reflects the true nature of social media platforms. They provide a free space for community and connection, but in exchange the user is beholden to the business decisions of big tech companies, whose focus is squarely on increasing revenue without consideration for the health and wellbeing of their customers.
This report puts big tech on notice. Social media companies are not immune to the need to have a social licence to operate here in Australia. Participants in the inquiry also painted a picture of how the relationship Australians have with social media is complex and forever evolving. We were told that many Australians, particularly young Australians, enjoyed using social media and, for the most part, they didn't want their access to it to be restricted. As one young person said, social media isn't just a platform; it's a lifeline for connection, information and community for many young people. But we also heard that social media companies use opaque algorithms that keep users scrolling, feeding us what they think we want to see, even if it is harmful. Young people recognise these harms and the vulnerabilities of some users, but they wanted to better protect themselves from harms and have more control over their social media experience to improve their own health and wellbeing. They want to be able to alter, set or indeed turn off their personal algorithms and recommender systems. They want greater control over the content and paid advertising they see and when they see it, and they want to be active participants in co-designing policy to help improve the safety and accountability of online platforms.
The committee also heard from mental health organisations who told us how social media is used to support people's mental health and wellbeing, with many Australians accessing mental health resources online. But they too agreed that some harms were so extreme that interventions were required and much more should be done to protect users from online harms. Academics and experts spoke to the many different facets of the social media environment, including its harms, noting that there is no one silver bullet that is going to solve this problem. But they were adamant that this shouldn't stop us from taking immediate action to better protect Australian users. We heard that, while legislating an age limit might not be the perfect solution and should certainly not be the only solution, it would provide important breathing space for the implementation of long-term sustainable digital reforms.
Finally, the committee also heard from parents regarding the horrific online harm experienced by their children, with some parents drawing a direct link between the influence and impact of social media use and their child's mental health and wellbeing. We heard about young users suffering eating disorders who are continuously shown social media content that is harmful to their recovery, and we heard about vulnerable young people who are endlessly bullied and sextorted via social media, leading to them taking their own lives.
Parents presenting evidence to the committee pleaded for an increase in the minimum age for children to access social media, noting the current restrictions had failed. Without a legislated minimum age to access social media, parents said, they felt unsupported in their efforts to protect their kids from social harms. They wanted to be able to tell their kids, 'No. It's the law. You have to wait until you're old enough.' And they worried that social media companies would take years to implement reforms that made them responsible for their platforms and for preventing online harms.
While everyone agrees that social media is part of everyday life and will remain so, social media platforms and online services have a key responsibility for the safety of their users. This report makes recommendations for immediate and long-term government action, but it also puts the responsibility back onto big tech, who absolutely must do better. It contains 12 well-defined recommendations that go to the heart of the problem: keeping Australian users safe. The report recommendations include greater enforceability of laws to bring digital platforms under Australian jurisdiction; support for a single and overarching statutory duty of care for digital platforms to ensure Australian users, particularly children, are safe online; effective mandatory data access for independent researchers and public interest organisations, coupled with a rigorous auditing process by appropriate regulators; measures to enable users to have greater control over the content they see by having the ability to alter, reset or turn off their personal algorithms and recommender systems; greater protections for users' personal information; inclusion of young Australians in the co-design process for the regulation of social media; research and data collection provisions that enable evidence based policy development; ongoing education to improve digital competency and online safety skills; built-in safety-by-design principles for current and future platform technology; a transparent complaints mechanism that incorporates a right-of-appeal process; and adequate resourcing for the office of the eSafety Commissioner to discharge its ever-evolving functions. Taken together, these recommendations map a pathway forward for social media reforms in Australia.
The committee notes that in the past two weeks the government has announced the introduction of legislation to make 16 the minimum age of access for social media. Other recent measures include legislation to combat the rise of mis- and disinformation and a landmark scams prevention framework which calls for fines of up to $50 million and requires social media platforms, banks and telecommunications companies to protect Australians from online scams. And just last week the government announced that it will be legislating a digital duty of care to place the onus on digital platforms to proactively keep Australians safe and better prevent online harms in the first place. These actions are part of a suite of government reforms that complement each other and incentivise the design of a safer, healthier digital platform ecosystem.
The committee strongly supports the 12 recommendations in this final report, along with the recommendations of our second interim report. Collectively, these 23 recommendations map a pathway forward for social media reforms in Australia and put big tech on notice, because social media companies are not immune from the need to have a social licence to operate in Australia.
I would like to sincerely thank the secretariat, who have worked hard to meet the committee's deadlines and have been exceptional in providing support to the committee. Thank you to the committee secretary, Gerry McInally; Aysha Osborne; Natasha Rusjakovski; Michael Perks; Aisha Bottrill; and Jamison Eddington. I would also like to thank—I see her sitting at the table—the former chair of this committee, the member for Jagajaga and assistant minister, for her diligent work in the first iteration of this committee. I also want to acknowledge all of the committee members—some of whom I see sitting in the chamber this evening—who have worked productively with me throughout this inquiry. It doesn't mean we haven't had our challenges and differences, but we have landed in a place where there is a good collective will to ensure improved online safety for all Australian users.
My message is simple: the age of unregulated social media is over. Online safety is paramount, and social media platforms must take responsibility to ensure fundamental protections are in place. Social media companies have a social responsibility for the safety of their users, and this report maps out ways in which they can be held to account, ensuring social media is a safe place for all Australians to find connection, community and reliable information.
6:26 pm
Andrew Wallace (Fisher, Liberal National Party)
by leave—I begin by thanking the member for Newcastle for stepping in as chair for the committee and the member for Jagajaga for the work that she did before her elevation. This was a comprehensive inquiry with 220 submissions, 10 public hearings, 58 responses to questions on notice and three reports. The coalition members of the committee welcome the recommendations made by the committee, including a statutory duty of care and a children's online privacy code.
Having said that, I feel it would be remiss of me not to point out that, if the report were left as is, it would be seen as a missed opportunity. Coalition members felt compelled to provide not a dissenting report but additional comments, and there were some 16,000 words in that additional commentary, which we believe fills a gap in this report. Simply put, coalition members believe that government members of the committee have missed the opportunity to demonstrate the strong leadership required for the kind of comprehensive reform which social media platforms desperately need.
On 3 January 2018, Dolly Everett took her life as a result of very, very significant bullying. Most of us would remember that day, and I have to say it was a day in my life that I remember well. Shortly after Dolly's passing, I went down to Sydney, New South Wales, and had a meeting with the DIGI group, where all of the social media platforms and big tech are represented. I made the executives of the DIGI group a promise that I would be a thorn in their side for however long they continued to not look after the welfare of Australians and, in particular, the welfare of Australian children. I made them that promise, and today I feel somewhat vindicated that that promise has been, or is being, delivered on.
Big tech companies have proven utterly incapable of protecting their users from harm, and Australians demand strong leadership to hold them to account. I remember walking out of that meeting with the DIGI group thinking that I had just met with big tobacco from the 1960s and the 1970s. I was assured that everything was all under control and that platforms were doing everything humanly possible to protect Australians. We all know that that is not true.
Coalition members were concerned that, despite the significant evidence provided by witnesses, the committee failed to give enough attention to the issues of child safety, foreign interference and mental health, particularly in relation to eating disorders and addiction. A parliamentary inquiry such as this one should freely offer recommendations to government to steer policy and find practical solutions to real-world policy problems with far-reaching consequences.
Coalition members want user-control features which address persuasive design issues. That includes resetting your algorithm, stopping autoplay and infinite scrolling and having the ability to better customise one's social media feeds. Coalition members would like to see greater transparency and reporting requirements in relation to actual or suspected foreign interference or transnational commercial activities. Coalition members would like to see a centre of digital education excellence established to bolster Australia's digital technology and media literacy. Coalition members want a proactive obligation on social media companies to report actual or suspected child sexual abuse and exploitation, regardless of whether end-to-end encryption is used.
Coalition members want a proactive obligation on search engines and similar platforms to report how they are combatting the indexation and dissemination of harmful material. Coalition members want big tech to be held accountable for harmful materials published by connected third-party platforms, including link-in-bio tools. Coalition members want the government to work with experts, youth representatives and lived-experience participants to develop a strategy to improve the online safety and wellbeing of boys, who are disproportionately affected by online harms, particularly in relation to sextortion. Coalition members want the government to invest in the research and development of technology to combat child sexual exploitation.
Coalition members want social media companies to provide regular transparency reports on data collection. Coalition members want regular reports on revenue received from the advertising of regulated and restricted industries like alcohol, gambling, pornography, cigarettes, pharmaceuticals, weight-loss treatments, debt collection and more. Coalition members want the government to adequately resource the eSafety Commissioner and the Australian Centre to Counter Child Exploitation to meet increasing demand on their services. Coalition members want a new parliamentary committee—like other committees before this one—on online safety, artificial intelligence and technology, tasked with responding to and preparing Australia for the growing threats and opportunities in social media technology and AI.
It is imperative that a standing committee be established. This committee is handing down its report today, but it's likely to be out of date in a couple of months' time, and having a standing committee which can keep abreast of the constant changes in this area will stand this country in good stead. The jurisdiction of this standing committee should also include dating apps, gaming platforms, live-streaming programs, the Metaverse and more.
The coalition members have put together an alternative report to highlight the serious issues which the existing report fails to address. We've nominated 13 additional recommendations to protect Australians online. We've demonstrated, once again, that the coalition is leading the charge on social media reform and online safety. I want to acknowledge the efforts and the receptiveness of the Leader of the Opposition and the shadow minister for communications, David Coleman, and acknowledge their great work in leading social media reform. At this point, I also want to thank my partner in crime, the member for Flinders, for her outstanding work on this committee and in relation to online safety.
The reality is that only the coalition can be trusted to keep kids safe online. Only the coalition has the courage to hold big tech to account. I commend the report and additional comments to the House.
6:35 pm
Sharon Claydon (Newcastle, Australian Labor Party)
I move:
That the House take note of the report.
Mike Freelander (Macarthur, Australian Labor Party)
The debate is adjourned, and the resumption of the debate will be made an order of the day for the next sitting.