House debates

Thursday, 12 September 2024

Bills

Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024; Second Reading

9:41 am

Michelle Rowland (Greenway, Australian Labor Party, Minister for Communications)

I move:

That this bill be now read a second time.

Introduction

The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 will amend the Broadcasting Services Act 1992 to protect Australians from seriously harmful online mis- and disinformation. The bill also makes consequential amendments to the Australian Communications and Media Authority Act 2005, Telecommunications Act 1997 and the Online Safety Act 2021.

In January 2023, the Albanese government committed to providing the Australian Communications and Media Authority (ACMA) with new powers to create transparency and accountability around the efforts of digital platforms to combat mis- and disinformation on their services, while balancing the freedom of expression that is so fundamental to our democracy.

This bill delivers on that promise following an extensive public consultation process on the draft legislation to ensure the ACMA powers meet community and industry expectations.

The bill will increase transparency and accountability

Digital platforms enable end-users in Australia and around the world to connect with family, friends, community groups and businesses, regardless of geographic distance. While this has brought significant benefits, digital platforms can also serve as a vehicle for the spread of misleading or false information that is seriously harmful to Australians.

The rapid spread of seriously harmful mis- and disinformation poses a significant challenge to the functioning of societies around the world. Democratic countries like Australia rely on the free flow of information to inform public debate, and the integrity, diversity and reliability of information is fundamental to our democratic way of life.

Mis- and disinformation about the stabbing attacks in Bondi Junction and, more recently, in Southport in the UK are just two examples that illustrate the need for digital platforms to do more to prevent and respond to their spread.

There has been a sharp increase in Australians' concern about misinformation, according to the Digital News Report: Australia 2024. This research, by the News and Media Research Centre at the University of Canberra, shows that concern has risen to 75 per cent, well above the global average.

Similarly, the Australian Media Literacy Alliance report on adult media literacy released in August 2024 highlights that 80 per cent of Australians want the spread of misinformation in Australia to be addressed.

In Australia, the digital platform industry has taken an important first step to address the threats posed by the spread of harmful mis- and disinformation online through the development of the voluntary Australian Code of Practice on Disinformation and Misinformation. But this effort is not enough.

The ACMA has continuously highlighted the need for industry to improve the quality of its monitoring and reporting against the voluntary code's outcomes, noting that a robust performance measurement framework is critical to its success.

The ACMA has found that the transparency reports made under the voluntary code lack consistent, trended, Australia-specific data on the effectiveness of digital platforms' efforts to address mis- and disinformation on their services.

An independent assessment of the 2024 transparency reports noted that improvement in reporting under the code had stalled and that signatories had failed to meet their code commitment to have internally consistent key performance indicators.

In its 2023 report to government, the ACMA called on industry to take further steps to review the scope of the code and its ability to adapt quickly to technology and service changes. The code has only nine signatories, and major digital platforms like X and Telegram are not among them, meaning there are wide gaps in coverage across the digital platform industry.

Digital platforms need to step up to protect Australian users from the threat of seriously harmful mis- and disinformation online. This bill seeks to strengthen the voluntary code by providing a regulatory backstop.

The bill will empower the ACMA to review the effectiveness of digital platform systems and processes and will improve transparency about measures platforms have in place to protect Australians from mis- and disinformation on their services.

The bill will establish a proportionate, graduated and flexible regulatory framework, while at the same time safeguarding the freedom of expression that Australians hold so dear. The bill also ensures that it is digital platforms that remain responsible and accountable for the content they host and promote to Australian users.

To protect freedom of speech, the bill sets a high threshold for the type of mis- and disinformation that digital platforms must combat on their services—that is, it must be reasonably verifiable as false, misleading or deceptive and reasonably likely to cause or contribute to serious harm. The harm must have significant and far-reaching consequences for Australian society, or severe consequences for an individual in Australia.

The types of serious harms in the bill are:

Core transparency obligations

The bill will impose core transparency obligations on digital platforms, requiring them to be upfront about what they are doing on their services to combat mis- and disinformation.

Digital platforms will be required to publish their current media literacy plan setting out the measures they will take to enable users to better identify mis- and disinformation. This will empower Australian users to critically engage with the content they view on digital platforms, identify and respond to mis- and disinformation, and make more informed choices about how they engage with content.

Digital platforms will also be required to publish their current policy approach in relation to mis- and disinformation as well as the results of their risk assessments that identify and assess significant risks relating to mis- and disinformation on their services.

The bill will also allow the ACMA to create digital platform rules with additional transparency obligations, including in relation to media literacy plans, risk management plans and complaints and dispute handling processes.

Information-gathering and record-keeping powers

The ACMA will have the power to obtain information from digital platforms and make rules that require them to create and retain records relating to mis- and disinformation. This could include requiring digital platform providers to submit periodic reports to the ACMA.

Information-gathering and record-keeping powers will enhance transparency and allow the regulator to track the progress of digital platforms in addressing mis- and disinformation on their services. This will set a clear expectation that digital platforms must be transparent with the Australian public.

Importantly, the bill also provides protections for Australian end users. For example, the information-gathering powers will not require individuals to produce information or documents to the ACMA except where they are a platform employee, a content moderator, a fact checker or a person providing services to the provider of the digital platform.

Code and standard-making powers

Under the bill, the ACMA would have the power to approve codes and make standards to compel digital platform service providers to prevent and respond to mis- and disinformation.

A code or standard could include obligations to cover matters such as reporting tools, links to authoritative information, support for fact checking and demonetisation of disinformation. Approved codes and standards will be legislative instruments subject to parliamentary scrutiny and disallowance.

These powers could be used in the event that the ACMA determines that existing industry efforts to combat mis- and disinformation on digital platform services do not provide adequate protection for the Australian community.

In the event industry efforts to develop or implement an approved code have not been effective, or in urgent and exceptional circumstances, the ACMA would have the power to make an enforceable standard.

This is consistent with the proportionate and graduated nature of the bill's framework.

Protections for freedom of expression

To ensure it strikes the right balance between upholding freedom of expression and combatting mis- and disinformation, the bill has carefully calibrated definitions of serious harms that align with Australia's international human rights obligations.

The bill does not apply to professional news content or content that could be regarded as parody or satire. It also does not apply to the reasonable dissemination of content that is for academic, artistic, scientific or religious purposes.

Nothing in the bill enables the ACMA itself to take down individual pieces of content or user accounts. The bill takes a system-level approach, and digital platforms will remain responsible for managing content on their services.

Importantly, the bill will enable the ACMA to require digital platforms to be tough on disinformation involving inauthentic behaviour such as bots or troll farms. This type of manipulative behaviour has been a major vector of foreign interference and is an ongoing threat to democracies across the world.

Penalties, enforcement and review of the legislation

The bill will enable the ACMA to use a proportionate, graduated and risk-based approach to non-compliance and enforcement. This may include the ACMA issuing formal warnings, remedial directions, infringement notices and injunctions, as well as pursuing civil penalties, depending on the severity of the action.

Digital platforms may be subject to civil penalties of up to five per cent of global turnover for breaches of a standard and up to two per cent for breaches of a code. These penalties are high. However, they may be necessary in response to egregious and systematic breaches and failures to act.

The bill requires a triennial review of the operation of the legislation. A report of the review must be tabled in the parliament and must follow a period of public consultation and an assessment of the legislation's impact on freedom of expression.

In addition, the first review is also required to consider the need for a scheme requiring platforms to give accredited independent researchers access to data relating to mis- and disinformation. This is to allow time for developments in international jurisdictions to inform the appropriateness and effectiveness of this as an additional transparency measure.

The bill also requires the ACMA to prepare an annual report for tabling in the parliament on the operation of the bill's framework.

Consultation

The bill has undergone considerable consultation, with a significant breadth and depth of engagement with key stakeholders, including from the digital platform industry, legal and civil society groups, media and fact-checking organisations, and research and academic institutions.

The government thanks these stakeholders for their important contributions to the bill, which have ensured the bill strikes the right balance between protecting Australians from serious harm and upholding the freedom of expression that is so integral to our democracy.

Conclusion

Through this bill, the Australian government is acting to prevent the spread of mis- and disinformation and the damage it causes to Australian democracy and public safety. Australians clearly expect the government to act to address this growing problem, and this is what the Albanese government is doing.

The bill positions Australia at the forefront of tackling this growing international problem, one which threatens to undermine our civic discourse and democratic engagement and participation. This bill ensures that digital platforms are accountable for combatting mis- and disinformation on their services.

The top priority of the government is to keep its citizens safe. Doing nothing to protect Australians from seriously harmful mis- and disinformation online is simply not an option.

Debate adjourned.
