House debates
Wednesday, 6 November 2024
Bills
Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024; Second Reading
7:59 pm
Monique Ryan (Kooyong, Independent)
The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 demands careful consideration. While the government's desire to tackle the harmful effects of misinformation and disinformation in our media is both important and worthy, its somewhat piecemeal approach to the issue raises concerns both about the potential limits of its effectiveness and about its potential to limit freedom of expression in this country. Misinformation and disinformation can have devastating effects on human rights, on social cohesion and on democratic processes. Australians are increasingly concerned about them.
The rise of internet technologies has amplified diverse voices within our society. These technologies have enabled new economic opportunities for many. They have democratised access to information and education. They have given all of us, globally, new ways to connect with one another. However, internet technologies have also enabled the rapid spread of abhorrent and illegal materials. Hate speech, misinformation, disinformation and child exploitation content have contributed to an erosion of trust in our media and democracy. Vulnerable groups, including children, women, Indigenous women and men, people of colour and LGBTQIA+ individuals, are particularly targeted by harmful content, harassment and hate speech.
The digital providers have failed to voluntarily address the features of their platforms which enable the spread of harmful content—recommender systems, dark patterns, invasive data harvesting, ineffective and vague content moderation policies, addictive design features and opaque mechanisms for reporting misconduct or abuse online. There have been no commercial incentives for the platforms to regulate misinformation and disinformation; in fact, it's the opposite. Inflammatory content drives engagement, which in turn drives profits. The digital platforms are the ones best placed to reduce harmful content, and they can do so most easily. They have to be made responsible for mitigating the harms that they actively enable.
Effective citizenship requires access to reliable information. Democracy cannot flourish without a diversity of media sources and a regulatory regime that protects consumers against the spread of misinformation, which ensures trust in what we hear, read and see. The erosion of trust in our media is a serious concern, and we do need to address it, but we also need to be conscious of the need to ensure that any measures taken to address misinformation and disinformation do not, paradoxically, exacerbate the toxic erosion of trust and the feelings of disenfranchisement expressed by so many in the population.
I'm particularly concerned about this bill's definitions of 'misinformation' and 'disinformation'. It's very important to differentiate between the unintentional and deliberate spread of false information, but the current wording of this legislation lacks clarity. Drawing a clear line between truth and falsehood is not always simple. There may be legitimate differences in opinion as to how content should be characterised. We have to ensure that these definitions are robust enough to capture harmful content and significant forms of bad action by bad actors but not so broad as to capture legitimate political expression, robust and open expressions and discussions of differences of opinion and scientific debate. There are legitimate fears that the bill could be used by politicians or government officials to suppress dissent and control public narratives.
My constituents are concerned that the government may not be a reliable judge of truth, that it might exempt itself from the same scrutiny that applies to others. Misleading content is published in print and online every second of every day in every region of the world. The terms misinformation and disinformation are vastly overused in public discourse. Too often they are employed by both the far left and the far right to discount opposing viewpoints without engaging in reasoned debate. We all know that, we recognise it and we try to address it, but many people struggle with giving our government the ability to decide what is and what is not true. Many Australians lack faith in the ability of digital platforms to adjudicate on such matters with sensitivity and rigour.
I've also heard from my constituents concerns that this bill grants excessive powers to the Australian Communications and Media Authority and to the eSafety Commissioner, allowing them to regulate and censor online content. There are valid concerns that these authorities, were they to act in an overzealous fashion, could silence legitimate criticism or opposition. In response to such criticism and other feedback from the Human Rights Commissioner and other legal experts, the government has refined its definition of 'serious harm' such that it must now involve significant and far-reaching consequences. Health-related content is only within the scope of this legislation where it relates to public health or preventive health measures. While the ACMA retains broad powers under this bill, it cannot require that accounts be blocked or posts removed where users are simply posting misinformation unintentionally, unless the content comes from bots.
The bill does not deal with the dissemination of content which may be incorrect or false, only that which is regarded as seriously harmful and reasonably verifiable as false, misleading or deceptive. In that, the law remains piecemeal. It doesn't cover professional news, it doesn't cover government communications, art, satire or religious content. Hate directed at specific groups has to meet the legal bar of vilification to fall under the remit of this legislation.
We need to consider the complexities involved in platforms' content moderation systems as part of these oversight strategies. Recent experience has shown that automated oversight processes, such as those seen in the robodebt scheme, are prone to error when tasked with interpreting complex human behaviour, intention and meaning. The outcomes can be both unintended and devastating. As we saw with robodebt, automation can allow organisations to cut costs, at the expense of accountability and transparency, prioritising efficiency over justice and oversight.
Applying this to the media context, automation in content moderation raises a severe risk of algorithmic overreach. Imagine AI algorithms trained to detect what is deemed harmful or misleading content across Australia's diverse media landscape. In practice, this could well result in uneven, opaque and arbitrary systems of content regulation. The inherent subjectivity in defining misinformation and disinformation makes it especially unsuited to the blunt instrument of algorithmic enforcement. Equally, the proposed dependence on automated systems could well foster a sense of faithless moderation. When individuals are subject to opaque and unaccountable decision-making processes, they are left feeling powerless to challenge or understand the rationale behind any content removal. Moreover, as we've seen with the gradual expansion of surveillance technologies, there's a real risk that these automated content moderation systems, which are initially presented as narrowly targeted solutions, could incrementally expand their reach over time, ultimately encroaching upon a wider spectrum of online expression.
I'm glad that the legislation has now been amended to require that platforms grant access to approved independent researchers, such as academic institutions, not-for-profit entities and other non-commercial researchers. Those researchers should be given immunities to protect them from the sorts of attacks that have recently occurred with X Corp's litigation against the Center for Countering Digital Hate and Meta's actions against CrowdTangle and Reset Tech Australia.
To my mind, the greatest weakness of this piece of legislation is its carve-out for the dissemination of professional news content, which it defines as that produced by or for newspapers, magazines, television, radio and websites. We expect with this legislation that Meta and Instagram will police their content, but we as a country remain happy to accept the frequent and deliberate misleading of Australians by professional media sources, who, in my mind, are worse because they operate on a much larger scale in a mainstream format, and they rarely receive anything other than the mildest admonishment from the media regulator.
Watch Sky, read the West Australian, listen to the dross on 2GB or KIIS FM, then go to Instagram, TikTok or Facebook. On which platforms are women more often subjected to overt misogyny? In which formats are Muslims and Indigenous Australians more often subjected to racism and to bigotry? Who is more likely to attack trans people and kids in the Australian youth justice system? Russian bots or Sky presenters? Who has published the text messages of alleged rape victims? Who has outed women who've made confidential sexual harassment claims? Who has falsely named an innocent individual in Sydney as a mass murderer? Which media outlets in this country most often exhibit clear, intentional political bias reflecting the editorial influence of their owners? The answer, sadly, is the mainstream media not only in the press but also on digital platforms. This is not to say that the purely digital platforms are not also guilty of these sorts of behaviours. It's just that, in this shameful race to the bottom, regulation of all forms of media in this country is deeply problematic. We also have the recent example of the referendum debate, which, both online and in the mainstream media, became contaminated by falsehoods and by lies, disguised as truth and repeated and amplified by the mainstream media just as much as they were by the digital platforms.
In 2020 more than half a million Australians signed a petition calling for the establishment of a royal commission into the Australian media landscape. A subsequent Senate inquiry concluded that the Australian regulatory environment for all forms of news media is weak, fragmented and inconsistent in its governance arrangements and in standards across platforms. That inquiry found that focusing on the internet platforms alone would not resolve the grave problems in Australia's media sector. It recommended consideration of a platform-neutral, single news regulator. In 2021 the Google-owned platform YouTube suspended mainstream broadcaster Sky News for broadcasting medical misinformation about the COVID-19 pandemic. In that instance, a private company was able to act swiftly and effectively to protect the public from misinformation, but ACMA, the purported media regulator, was not.
When she introduced this legislation, the minister referenced the false accusations made against an innocent individual after the Bondi Junction stabbings. Those accusations were made and amplified by Seven Network every bit as much as they were by the digital platforms. For years we have seen that ACMA is ineffectual and toothless. A statement of ministerial expectations regarding its actions might well give strength to those who are unsure whether or not to support this bill.
If we pass this law but do not act on the manifest deficiencies of mainstream media in this country, we are kidding ourselves that we're making any difference to public trust in this country's media landscape. We still need a judicial inquiry into media regulation in this country—an inquiry which should consider a single, independent media regulator to harmonise news media standards and to oversee a process for remedying complaints from all parts of our media landscape, a process likely to be far more effective than what would result from the legislation we're debating now, and one which would do away with duplication and inadequacy.
In opposition this government was supportive of the concept of meaningful media reform. Now that it's in power, it's happy to act on digital platforms to limit their forays into misinformation and disinformation, while also threatening to impose financial levies on those platforms to prop up mainstream media. This is at a time when it is also actively protecting the advertising streams of mainstream media through its refusal to ban gambling advertising, which it knows is harming Australians and which it knows is allowing the mainstream media to behave in ways at odds with the standards that we are proposing in this legislation.
In its current form, with these limitations, this legislation will not do enough. It won't protect democracy in this country, as we so desperately need. It will be just another half-effective half-measure; an inch when we need a mile; a gesture when we need a stand.
8:13 pm
Jenny Ware (Hughes, Liberal Party)
I rise to speak on the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024. In the 2½ years since I was elected, I've had a lot—hundreds and hundreds; probably thousands—of emails from constituents, but I have been completely overwhelmed by the volume of emails, meetings and correspondence that I've had in relation to misinformation and disinformation.
This bill is a revised version of the exposure draft misinformation bill, which was published and then withdrawn last year in what we say was a humiliating backdown by the Minister for Communications, following overwhelming opposition from a wide range of groups, including human rights groups, civil liberties groups, leading lawyers and proponents of free speech. More than 20,000 Australians made submissions and comments opposing that original bill. While this bill has the stated aim of reducing the spread of seriously harmful misinformation and disinformation on digital communications platforms, there are many flaws in it. It has not been well thought out. We've heard from a broad cross-section of sitting members in this place about their issues with it, and I think that the communications minister—with all due respect—needs to go back to the drawing board.
It's unclear, for example, whether the bill will operate in a manner compatible with Australia's international human rights obligations related to freedom of expression. The definitions of 'misinformation' and 'disinformation' create some uncertainty as to the breadth of content that is captured. The bill introduces supposed transparency requirements for certain digital communication platforms related to misinformation and disinformation, including obligations to publish information on risk management actions, media literacy plans and complaints processes. It also provides the Australian Communications and Media Authority, or ACMA, with new powers to create digital platform rules that require digital communication platforms to report and keep records on certain matters related to misinformation and disinformation, and it gives ACMA a graduated set of powers in relation to the development and registration of industry misinformation codes and misinformation standards. What we have here is a lot of power being put into a bureaucracy.
The bill has been referred to the Senate Environment and Communications Legislation Committee for inquiry, with a reporting date of 25 November 2024. Both the Senate Scrutiny of Bills Committee and the Parliamentary Joint Committee on Human Rights have raised concerns with this bill. While the stated purpose is to amend the Broadcasting Services Act 1992 to facilitate the introduction of rules, requirements, codes and standards for digital communication platform providers, aiming to reduce the spread of seriously harmful misinformation and disinformation, as always with legislation, the devil is very much in the detail. The bill is supposed to respond to recommendations made in a 2021 review of the voluntary industry Australian Code of Practice on Disinformation and Misinformation. As I've said before, it's supposed to be a refined version of an earlier exposure draft bill that was released by the Labor government last year.
There is significant concern in my electorate, as I've highlighted, as well as throughout our country and internationally, about misinformation. The University of Canberra's Digital News Report, published earlier this year, found that 75 per cent of Australians are concerned about online misinformation. That was up from 65 per cent in 2016 and well above the global average of 54 per cent. The United Nations Human Rights Council has adopted a resolution on the role of states in countering the negative impact of disinformation on the enjoyment and realisation of human rights.
We live in a world now that is very different to that of previous generations, when people received their news through hardcopy newspapers and magazines. These days, most of the under-30s that I know, for example, have never looked at a broadsheet newspaper, unless they have been down to their grandparents' house. Similarly, they don't even follow those news platforms on their phone. They are far more inclined to pick up all of their news through various social noticeboards or through Instagram, TikTok, Snapchat and various other platforms. That means that we are living in a very different world. As policymakers we have to look at what is and is not acceptable on these platforms, having regard to the overwhelming desire that we have for freedom of speech. This is an issue that the former coalition government dealt with. The Australian Competition and Consumer Commission's digital platforms inquiry in 2019, for example, highlighted the significant risks posed by the increasing prevalence of misinformation and disinformation shared on digital platforms.
Turning to contemporary events, I think we particularly saw a level of fear through the pandemic. There were certainly significant levels of misinformation and disinformation put out during that period. Like me, Deputy Speaker Freelander, you no doubt receive emails from constituents and from people who aren't constituents about a range of concerns that they have. They can be referred to as conspiracy theories. They can relate to everything from 5G to chemtrails and things like that. This is a serious issue, and it certainly needs to be addressed.
On our side we say that there are probably three or four key issues that we have with the government's proposed legislation. First of all, it acts as an incentive for digital platforms to regulate free speech. We say this because digital platforms risk big fines if they do not comply with digital platform rules or industry codes and misinformation standards. ACMA will determine whether or not a digital platform is compliant with the legislation. That, I think, causes concern when it is just one bureaucracy that is looking at whether or not something is misinformation or disinformation. In doing so, it is likely that the platforms themselves will self-censor the legitimately held views of Australians.
There is also a very broad definition of 'misinformation'. The bill captures opinions that are held in good faith. There is no requirement that the maker of the statement is acting maliciously or setting out to deceive people. If something is considered to be 'reasonably verifiable' as misleading and 'reasonably likely to cause or contribute to serious harm', it can be captured as misinformation. So I think there is an issue when there does not need to be any intent proven in regard to the misinformation.
There is also concern around the broad definition of 'serious harm'. A large amount of material could be captured as serious harm because the topics on which misinformation can constitute serious harm include elections, referendums, public health and preventive health measures, and imminent harm to the economy or financial markets. So material on a broad range of topics could be captured and then, in fact, shut down by social media platforms.
Another concern is the level of power that this gives the minister. The communications minister has existing powers which could now be used under this bill to order ACMA to conduct specific investigations under the misinformation bill. The minister could also order public hearings. The only constraint on that power is that it:
… cannot relate to particular content posted on a digital communications platform by a single end-user identifiable by the ACMA.
So the power seems to be very wide and potentially very open to political abuse. We say that this is completely inconsistent with basic democratic values. The minister under this bill also has power to exempt digital platforms. A digital site that has politics perhaps less favoured by whichever government is in power at the time could be excluded simply on the basis that the government of the day does not like its particular politics.
I turn to some of the comments that have come through to me from my constituents. As I said, I was quite overwhelmed by the volume of correspondence on this one issue. It's probably the most correspondence I've had on any one piece of government legislation in 2½ years. I will read some of them. Robert from Illawong said:
The threat of severe penalties for digital platform providers who do not comply is alarming and is likely to result in over-censorship.
Damian from Engadine said:
The … Bill compromises the integrity of social media by granting the eSafety Commissioner increased authority, which has been used to further censor content in response to X Corp's principled stance.
David from Engadine said:
Another concern is the definitions of the terms "Serious Harm", "Misinformation", "Disinformation" and "Truth". Who decides what these terms apply to, and indeed, whether they can even be adequately defined in practice?
These definitions are highly subjective, and who decides what is or is not truth in these times?
Lance from Alfords Point said:
This, in my mind and most of my friends, is the most dangerous and Orwellian Bill put forward in Australia. This is a direct assault on our freedom of speech and by default on our freedom.
Tracey from Wattle Grove said:
This Bill would further constrain free speech, going against the principles of the International Covenant on Civil and Political Rights.
Paul from Illawong said:
The Bill could suppress conservative, religious or minority viewpoints on various issues.
Andrew from Bonnet Bay said:
This bill appears to have the sole purpose of suppressing criticism of government.
Ross from Bangor said:
Civil penalties against digital platforms may stifle public discourse and disproportionately affect certain viewpoints.
Tanya from Woronora said:
The government needs to stop attacking our liberties and freedoms and trying to "protect" us by introducing legislation that threatens the very heart of our democracy that our ancestors fought so hard for us to enjoy.
Sarah from Jannali said:
I am concerned about the Government and mainstream media being exempt—
from this bill.
Jackie from Sutherland said:
This bill represents complete government overreach. More than that, it is a path to authoritarianism.
Marjon from Como said:
Freedom of speech is the bedrock of a free and fair society. It is a fundamental value of Australia. Despite denials to the contrary, the proposed bill will erode this fundamental human right.
Steve from Bangor said:
The draft exposure says it is about keeping Australians safe from "harm". But the government will cause harm by silencing Australians! The Commonwealth is responsible for protecting the rights of the individual, not suppressing them.
Cameron from Holsworthy said:
This legislation if passed will also jeopardize any future on-line discussions on any subject the government of the day does not want the general population to discuss contrary to the government of the day's beliefs.
Annetta from Bundeena said:
Even social media companies such as Meta and Twitter have voiced their concerns over the proposed law because it's easy to see how the power given to the Australian Communications and Media Authority could be abused.
Thomas from Heathcote said:
To quote John Stuart Mill: "All silencing of discussion is an assumption of infallibility."
It is not just the coalition and it's not just members of the crossbench that have issues with this legislation. These are just some of the examples from my electorate. In all of the circumstances, I join with other members of the coalition in saying that I cannot support the legislation as it's currently drafted.
8:29 pm
Kate Chaney (Curtin, Independent)
The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 addresses misinformation and disinformation that causes serious harm to public health, groups of people or elections. It says that digital platforms have to disclose what they're doing about mis- and disinformation and provide information to prove it, and, if they're not doing enough, it gives ACMA the power to require them to do more. A range of limits have been put in place to reduce the potential for infringing freedom of expression. Thanks to some amendments to which the government's agreed, I think this is a step in the right direction in meeting my community's desire to see digital platforms held to account for the harm that they can enable.
Seriously harmful online misinformation and disinformation pose a safety and wellbeing threat to Australians, as well as undermining trust in democracy. A number of people in my Curtin community have expressed real concern about online safety and harm generally. This was reinforced to me by a strong response to my survey on a review of the Online Safety Act. My community welcomed better regulation of current and emerging harms and technologies, and supported greater responsibility being placed on digital platforms through a statutory duty of care. This is consistent with research by the independent Reset Tech Australia, which commissioned a YouGov poll in April 2024 and found that 93 per cent of people agreed that social media companies should have a duty of care to take reasonable care of their users. While it doesn't create a duty of care, I understand that this bill is one of a range of reforms the government intends to propose in the broad online safety and big tech space.
Voluntary self-regulation has not worked. The fact is, we need to hold to account the companies that allow misinformation and disinformation to be distributed. We need strong processes to ensure our communities are protected. The concept of better regulating digital platforms to protect against seriously harmful misinformation and disinformation is not novel, and it has broad in-principle support. The government is attempting to do it here, and the coalition's 2022 election platform included a commitment to increase the information-gathering and enforcement powers of the media regulator to combat harmful online misinformation and disinformation. I agree it needs to be done too, with transparency and accountability at the forefront. This is consistent with my fair and transparent elections private member's bill, where one of my priorities was seeking to ban lies in political advertising.
The need to regulate harmful, false communications is particularly pertinent today, of all days, as we watch the US election in a febrile atmosphere of a divided populace and online echo chambers. But how we regulate misinformation is a complex challenge. How do we make sure the communications we receive are truthful, while also making sure people can continue to communicate easily? How do we strike the right balance between combating misinformation and disinformation on the one hand, whilst sufficiently protecting freedom of expression on the other?
The broad aim of this bill is to incentivise digital communications platform providers to have robust systems and measures in place to combat misinformation and disinformation. The bill does three things. Firstly, it imposes core transparency obligations on digital platforms, which will be required to publish a report on a risk assessment of their platform, a current media literacy plan and their policies for tackling misinformation and disinformation. Secondly, it provides ACMA with information-gathering and record-keeping powers to hold digital platforms to account. ACMA can require platforms to maintain and keep records, it can obtain information on an as-needed basis when investigating and it can publish information that's not commercially sensitive. Thirdly, it enables ACMA to approve codes and make standards if voluntary platform efforts provide inadequate protection to prevent and respond to misinformation and disinformation on their services. These codes are industry led and can be approved and registered by ACMA, whilst standards can be set by ACMA as a last resort. A code or standard could include obligations on platforms to have robust systems and processes such as reporting tools, links to authoritative information, support for fact-checkers, and demonetisation of disinformation. Approved codes and standards will be legislative instruments subject to parliamentary scrutiny and disallowable.
I have some concerns about ACMA's ability to fulfil these new roles. This is beyond ACMA's current role. But I understand that additional funding has been allocated and the evolving role can be addressed in an updated statement of expectations from the minister, which is a public document. The bill doesn't apply to all misinformation and disinformation; it only applies if the information causes serious harm. 'Serious harm' can mean harm to public health, vilification of a group or harm to the operation or integrity of an election. Given that bills have been proposed to improve the integrity of our electoral system, particularly relating to truth in political advertising, I'm pleased to see this on the list.
In Australian federal elections, the AEC is operating in an information environment increasingly impacted by misinformation and disinformation, which has serious potential to affect elections, as well as eroding voters' trust in the legitimacy of results. As parliamentarians, we must do everything possible to restore voters' trust at a time of record disengagement and distrust in our democracy. The Australian public has a high level of trust in the AEC, and the AEC sees the value in considering legislation to increase the transparency and accountability of major digital platforms and their responses to misinformation and disinformation. So that's a good start.
But the Law Council of Australia and other civil society groups are concerned that this bill will amount to a restriction on the freedom of expression, and I agree that any restriction on freedom of expression should not be made lightly. We're back to the balancing act I mentioned before.
I acknowledge that, in drafting, the government has attempted to limit restrictions on freedom of expression. Firstly, ACMA's powers are directed to digital communications platform providers and not to individual end users. These digital communications platform providers will be incentivised to have appropriate systems and measures in place to combat misinformation and disinformation. The bill does not empower ACMA to require digital platforms to take down or remove online content, except for disinformation involving inauthentic behaviour such as bots or troll farms. It does not give ACMA any direct take-down power or the ability to fine or regulate individual end users. Also, various content is excluded from the bill—things like parody or satire content or content for academic, scientific or religious purposes. Professional news content is also excluded, on the justification that it's subject to existing industry oversight.
Changes have also been made since the 2023 exposure draft bill, including narrowing the definition of misinformation and disinformation and the scope of serious harms—both done to reinforce protections to safeguard freedom of expression. These changes have been designed to better align definitions with our international human rights obligations.
I'm encouraged that the government has agreed to make some amendments after listening to concerns raised by the crossbench and other stakeholders. Firstly, the government will now put up an amendment to ensure the three-yearly review is undertaken by an independent person. This was an amendment proposed by the member for North Sydney, and I'm glad the government is accepting it. Given how quickly this area is changing, it will be important that we have a credible and clear-eyed review of whether it's working.
The second amendment that the government is now proposing relates to access to data for researchers so policy can be informed by emerging insights in this rapidly changing space. There's no better disinfectant than sunlight, and giving researchers access to data for appropriate research purposes, along with legal protection, means digital platforms will no longer be a black box. I recognise that the Data Access Framework proposed is based on the EU model, which is still a work in progress, so I'm glad to see that data access for independent researchers will be reviewed in 12 months.
I would like to see more public reporting. As the New South Wales Council for Civil Liberties says in its submission to the legislation committee:
Regulatory transparency ensures that the government can hold platforms accountable, but public transparency allows for a multi-stakeholder approach, where civil society, journalists, advocacy groups, and individuals can play an active role in monitoring misinformation efforts. Without public access to critical data and reports the regulatory process risks becoming opaque, which could diminish public trust in both the government and digital platforms.
But I'm yet to work through the detail of the amendments being proposed to determine how much public reporting is now included directly or indirectly by providing access to researchers.
I note that there have been concerns raised about exempting traditional media outlets from the bill on the basis that there are other regulatory frameworks in place. I accept this, but issues with misinformation and disinformation should be addressed appropriately under those existing frameworks. Given that this is a relatively new area of regulation and a rapidly changing space, I'm glad this legislation will be reviewed every three years to ensure it keeps up with the structural evolution in how people obtain information.
It's worth noting that, while the crossbench has been engaging constructively with experts and the government to improve the legislation, those in opposition are yet again playing a pointscoring game and refusing to even consider how we might work to address this issue, which they have recognised is a problem. Despite previously announcing their intention to introduce legislation to combat harmful misinformation and disinformation online, members of the opposition are refusing to even engage in a constructive way in this debate, and then they have the gall to criticise the crossbench for voting with the government when the government introduces the very amendments that they've negotiated. The pointscoring remains more important to the opposition than the substance.
With these limitations on power and the agreed amendments, I think that there's an acceptable balance between the right to freedom of expression against other rights—although I would be open to supporting additional amendments that put further safeguards in place for freedom of expression, and I look forward to seeing how the bill operates at its first review. In the interests of combating online misinformation and disinformation that causes serious harm, and consistent with the stated desire from my community in our online safety survey, I intend to support the bill in its amended form. I commend this bill to the House.
8:40 pm
Aaron Violi (Casey, Liberal Party)
At a time when Australians expect their government to be doing everything in its power to address the cost-of-living crisis that they face, Labor is instead trying to control the public debate. This is one of the most egregious pieces of legislation I've seen in my 2½ years in this place, and the coalition will not be supporting this bill, the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024. There has undoubtedly been a rise in online harm, whether it's bullying, deepfakes or the abhorrent spreading of footage after violent attacks. As lawmakers, we have a duty to keep Australians safe from online harm, but this must always be balanced with the value of free speech that has underpinned our democracy. Democracy is moved forward when we can debate issues and have opinions heard. No great nation has ever been built by a government that considers itself the arbiter of all truth.
But the challenges we face today in modern society aren't new. They are challenges that have been faced by civilisations for centuries. Let's use one example from many, many years ago: the trial of Galileo. Galileo was ordered to turn himself in to the Holy Office to begin trial. What was his crime? His egregious crime was holding the belief that the earth revolves around the sun. This was deemed incorrect by the church at the time. This was the second time that Galileo was interrogated for his strongly held personal beliefs, despite the fact that it had been known for centuries that the earth was not the centre of the universe. Under this misinformation bill, if Galileo were posting his views online, his views would be deemed misinformation, despite the fact that they were his reasonably held beliefs. Even if he didn't intend any harm by his belief, it could still be labelled misinformation and shut out of the public debate. This highlights the delicate balance between combating harmful misinformation and ensuring the freedom to explore and discuss new ideas.
If you try looking to other countries for similar legislation, you won't find it. Countries that are comparable to Australia, including New Zealand, the US and the UK, do not have laws that remotely resemble what the Albanese government is proposing. This is the government's second attempt at ramming through this bill, following overwhelming opposition to its first bill from human rights groups, civil liberties organisations, lawyers and over 20,000 Australians, with submissions overwhelmingly saying that this bill was an attack on free speech and would censor the legitimately-held opinions of Australians.
Those opposite went back to the drawing board. They went through a 10-month consultation period and they've come back into this House with a bill that still attacks free speech and will radically alter the landscape of political communication in this country. To make things worse, the government has limited consultation on this second bill. They knew how bad it was after the Human Rights Commission, lawyers, barristers and Australians had their say on the first round. This time they limited the consultation period to just seven days. Let's think about that.
This is a bill where they're asking the Australian people to trust them. They're asking the Australian people to trust the government and to build trust in our society, and they give the community seven days to respond, because they didn't like the response in the first, 10-month consultation period. That tells you everything you need to know about this government. They don't really want to listen to the Australian people. It's all about beach houses and flights. This is by far one of the biggest changes to Australian democracy. It's one of the most complex and wide-reaching bills we have seen, with an explanatory memorandum of over 100 pages, and Australians were given seven days to make submissions. Submissions have now closed, which is shameful. It shows that the government knows this is a bad bill, and it shows the contempt the government has for the Australian people.
The lack of consultation on this bill was slammed by the New South Wales Council for Civil Liberties, which said:
… we strongly assert that the decision to allow only seven working days for public submissions on such a critical and complex piece of legislation is incompatible with the principles of transparent governance.
It reminds me of when the Prime Minister talked about a kinder, gentler parliament and then went on to insult every Australian with Tourette syndrome by directly attacking the shadow Treasurer—another example of the hypocrisy of the Prime Minister, who said one thing before the election and has done completely different things since the election.
This bill gives the Australian Communications and Media Authority, ACMA, the power to determine what is and is not an acceptable statement online. ACMA will assess whether digital platforms are taking adequate steps to prevent and respond to misinformation and disinformation. If they are not, the digital platforms will face fines of up to five per cent of their annual turnover. The digital platforms will want to avoid these fines, so they will censor a large amount of the free speech of everyday Australians who want to have their say online. When it comes to digital platforms, the government isn't just talking about what you post on Facebook or Instagram. It extends beyond that to include websites, podcasts, news sites, message boards and search engines—in reality, most of the internet in Australia. ACMA is going to be telling all of these digital platforms how to comply with these new laws. Let's have a look at how they are going to do that.
The explanatory memorandum makes clear:
Digital communications platform providers could be required to use automated processes and technology to detect and act appropriately on misinformation and disinformation … For example, they could be required to use technology or algorithms to 'down rank' or reduce the spread of misinformation …
Algorithms are one of the biggest challenges we face with social media. When I talk to people who aren't very tech savvy about how algorithms work, the best example I can come up with is Netflix. I don't get much time to watch Netflix, but, without looking at who's signed in, you can tell pretty quickly whether it's my profile or my kids' or my wife's, depending on what pops up. That's the reality of an algorithm. The Australian people will not even know that they are not being shown the content they want to see. It will be hidden from them before it even becomes possible for them to read that information, get a wide variety of views and make an educated decision.
Clearly, to avoid hefty fines, these platforms are going to begin censoring all kinds of material. It's going to happen on a large scale, with algorithms trained to censor posts and content that fall into the government's loose definition of 'misinformation'. This definition includes statements that are held in good faith. People have opinions and beliefs, but, under this bill, even if you believe something with all your heart, that belief can be labelled 'misinformation'. It doesn't have to be a malicious post. Even if it was not intended to deceive, it can be misinformation under this bill and the platforms will censor the post.
Digital platforms are required to sift through content and determine what may be misleading, and the government has told them, through this bill, that the test they should apply is whether the post has been fact-checked by a third-party organisation and what the experts say. We have all experienced fact-check sites that we don't agree with, fact checks that have quite a bit of bias in them, but we apparently have to accept them—although this creates an awkward situation for the government, the Prime Minister and the Treasurer. We hear the Treasurer talk about—I'll quote him because I don't agree with it—the $1 trillion of debt that they were left with. Now, the problem the Treasurer, the Prime Minister and those opposite have when they use that statement is that ABC Fact Check found that that was not true. The budget papers showed that it was between $517 billion and $535 billion when you look at net debt, and about 35 per cent of that was actually accumulated under the Rudd and Gillard governments.
So we have again the hypocrisy of the Treasurer standing up in question time and blatantly creating misinformation—by the definition in this bill—for the Australian people. I have no doubt that the Treasurer would have seen that fact check from the ABC, but he continues to spread that misinformation—by his own government's definition—to the Australian people today.
Dan Tehan (Wannon, Liberal Party, Shadow Minister for Immigration and Citizenship)
Practise what you preach.
Aaron Violi (Casey, Liberal Party)
Practise what you preach, Member for Wannon. It's probably because he's got a PhD in political spin that he's so good at it.
To make things worse, academics, scientists and artists are exempt from the bill, but not everyday Australians. So anything an academic or scientist says cannot be censored, but the legitimately held view or opinion of an Australian on that same topic can be censored, which really brings into question who determines who is an academic and who is a scientist. What are the requirements to determine whether you meet those criteria?
An honourable member: Dr Chalmers!
Dr Chalmers—well, yes. And this is a very confusing part and a very concerning part of this bill. The Minister for Communications could personally order misinformation investigations and hearings on terms of her choosing. The minister can also exempt certain platforms, so a digital site that has politics favoured by a government could be excluded from complying with these laws.
But let's use another real-life example of misinformation and disinformation, and how this could play out based on, again, the government's own words. When the now Minister for Home Affairs was the minister for industrial relations, quite a lot of legislation was rammed through this House, and a lot of it was disagreed with by those in the industry. The minister stood at that dispatch box in question time and labelled that criticism misinformation and disinformation. The minister stood there and said the arguments used against his own legislation were misinformation and disinformation. Under this legislation, if the minister for industrial relations made that accusation, the Minister for Communications within his own government could personally order a misinformation investigation and hearing because Master Builders Australia and other industry bodies, the Minerals Council of Australia, the BCA and others dared to question the all-powerful minister for industrial relations.
That's the reality of what we're dealing with. That situation could play out because the minister was prepared to use the terms 'misinformation' and 'disinformation' at that dispatch box multiple times in question time when those pieces of legislation were being discussed. Those terms were used at the same time they were looking to ram through the legislation the first time. That is an egregious breach of democracy. Every government should be held to account. Industry bodies have the right to argue their case, but under this government, under this bill, that would be in question—and that's one case study that is a live example.
From public health to politics, the economy and ideology, this bill will impact what Australians are allowed to talk about online. It will limit public discourse and debate and it will diminish democracy. Look out, Australia. First, Labor sent your cost of living through the roof and broke their promises, and now they're trying to limit what you can say about it online. The coalition stands firmly opposed to Labor's misinformation bill because we believe in free speech and always defend that right. In the words of Voltaire, 'I disapprove of what you say, but I will defend to the death your right to say it.'
8:55 pm
Zali Steggall (Warringah, Independent)
The scourge of misinformation and disinformation on the internet presents one of the most pervasive and complex challenges of our time. The issue threatens public safety, social cohesion and even our democracy, by enabling mass manipulation and undermining public trust in outcomes. The scale of the problem is staggering, and it is scary, because we know a democracy is only as strong as the information available to it. The fact is that misinformation and disinformation are currently spreading in an unchecked environment. That's simply not right.
The World Economic Forum has classified misinformation and disinformation as one of the top global risks for the next two years. That really is an incredibly sobering warning. We only need to look at what's happening daily on social media platforms and, unfortunately, all too often in our legacy media, where misinformation and disinformation is platformed on traditional media—and I will get to that and the issue that this Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 does not address in a moment.
Unfortunately, you only have to see the outcome today of the US election, where the amount of false facts—misinformation—that has been part of that debate is staggering and incredibly scary for what that means and how it influences outcomes. With a federal election looming in Australia by March or May next year, I am incredibly concerned that Australia is woefully unprepared to counter AI-generated and AI-amplified misinformation and disinformation.
There's ample evidence that misinformation and disinformation is a critical issue facing Australians. In 2022, a Roy Morgan survey showed that over two-thirds of Australian adults have encountered deceptive news misinformation. In a speech to the National Press Club this year, the Australian Federal Police Commissioner—I hope all members of this place can at least take into account the warning from the Australian Federal Police Commissioner—stated that social media companies are 'pouring accelerant' on the flames of misinformation and extremism. If that isn't a sobering warning, I don't know what would get you to pay attention.
You only have to look at some of the practical examples. During the Bondi stabbing attacks and another stabbing incident in Sydney, the spread of graphic and false information and posts, and the speed at which that inflamed the community, was really concerning and demonstrated the impact of misinformation on safety and community cohesion. That's why it's so urgent that we do address this issue.
The government's bill builds on the draft released last year, and there has been a fair amount of consultation. I've been critical of the timing and the ability to provide submissions. I was able to provide a submission to the exposure draft early, and I'll address some of the concerns that I had.
In that submission, I expressed concern about the extensive powers being granted to the Australian Communications and Media Authority and concern that there is not sufficient oversight of that body. I'm disappointed that this still has not really been addressed. ACMA's capacity to handle these new responsibilities is still unproven, yet it has a crucial role in how we are going to fight misinformation and disinformation.
There's no independent external review mechanism to ensure accountability of ACMA. I've expressed my concerns that ACMA ultimately reports to the minister and therefore to the government. I know that is the ground on which the opposition has chosen to criticise this bill, raising concerns over how it could be manipulated for political gain. I don't disagree. If there is a change of government, I'm assuming that these loopholes and this fault in the legislation would be taken advantage of by all sides of government, and that is a huge concern. We need that independent external review of ACMA, and we need to have it at arm's length from the minister and the government. That is an area I'll continue to discuss with the minister. We know that there has been some movement since the exposure draft. Under it, the powers afforded to the minister were capable of being misused, should the political and social environment enable this to happen, and could be used in a way that threatens freedom of speech and democracy.
The bill does provide for an impact assessment on freedom of expression, yet this is only set to occur three years after commencement. The review provisions are that there will be an assessment after three years of the impact on freedom of speech and the operation of ACMA. My concern is that three years is a long time and that more regular review should be required. I feel the review should take place annually, certainly in the early years of operation of this bill, to ensure that it is meeting the purpose intended and that it is doing so in the most effective way. We need to ensure transparency and responsiveness to any unintended consequences. That's why an earlier review process, even if seen as more burdensome, can provide that extra, protective layer of certainty. I note from briefings with the minister that there are provisions for ACMA to publish a yearly account of its engagement, actions and use of the powers. Whilst that provides some accountability, I would still urge the government to consider a more frequent review than the three years provided at the moment.
The definitions of 'misinformation' and 'disinformation' in the original draft were broad and untested, and that posed a huge risk of misuse. The bill now contains more refined definitions. There's been much discussion between the crossbench and a number of members with the minister. Those changes have been welcome. The Australian Human Rights Commission has called it largely positive but is concerned about significant issues remaining that could impact freedom of expression. I will come back to that in a moment.
I also welcome the adjustments the minister and the department have made in relation to excluded content provisions, now renamed as excluded dissemination, which clarify that these powers don't apply to professional news content. This is a step in the right direction, but further discussion is needed to fully understand and manage the implications of these changes. I should note that, whilst it's essential to have exceptions for news content and we do so under defamation law, the Privacy Act and so many other areas of legislation, those exceptions for news content come with a responsibility to act with ethical diligence in accordance with the codes, and ACMA, which is there as the body to oversee how media and news actually operate, must be an effective cop on the beat.
It must be an effective check against, for example, news and media platforms, including legacy platforms, using those exceptions to become the very conveyors of misinformation and disinformation. We know that's a risk because we saw it in the practical example—unfortunately, the real-life situation—of the Bondi stabbing, where a legacy media company named the wrong person. Nothing in this act would prevent that from happening. Of course, defamation proceedings kicked in and that was resolved, but there is great concern. The hope is that this legislation will stop the misnaming of a person from gaining traction and spreading in online content. That, hopefully, would stop legacy media from jumping onto the misinformation and conveying it further, but it does highlight that nothing in this bill addresses misinformation and disinformation as published and pushed by legacy media companies. That is a step that still needs much work.
Another key amendment that's been introduced concerns access rights for independent researchers to review platform data. This is essential and I do welcome the government moving on this issue. I know many on the crossbench and many other groups have pushed for these amendments. It's incredibly important that online platforms be compelled to produce their data so it can be analysed and used for research and we can actually assess the risks of what is happening online. From eating disorders to online bullying and mental health, there are so many areas where legitimate, independent researchers need to have access rights for platform data. We know the platforms hold that data very closely. They do not share it willingly. Again, this will be managed by ACMA, with an initial review after a year to see how it's going. This is a lot of responsibility to be putting on ACMA, which again raises concern that we need a review of the operations of ACMA to make sure that they are robust, diligent and fit for purpose. This change in relation to independent researchers helps to address the opacity of social media platforms and to promote better research into the very impact of misinformation.
One of the fundamental considerations of this bill is the balance between free speech and public safety. The Human Rights Commission has raised concern that this bill could infringe on free speech rights. But, at the same time, we have to acknowledge that freedom of speech—and I should say that the High Court has acknowledged this—is not the right to spread misinformation and disinformation. Those are two very different things. The human right to free speech does not override human rights in relation to racial vilification, discrimination and all the other aspects of human rights.
I find it interesting that so many in the opposition are here rallying to the call for human rights, yet no-one is prepared to stand up for a human rights act. Other than the crossbench putting it forward, there has been objection to codifying human rights here and having a specific law to address that. It is always really difficult to find a balance between different rights—the right to be safe, the right to not be racially vilified, the right to not be vilified for sexual orientation. Free speech is not an unlimited right, and that is the balance that has to be struck. I'd say, in the context of the campaign around this legislation, that there have been a lot of conspiracy theories, and we certainly have received a lot of information both ways, but, overwhelmingly, we have to take into account the widespread damage caused by misinformation and disinformation. Ultimately, whilst I have some concerns, for me it highlights the need to implement this legislation.
The bill may not address every aspect of this complex issue. Nevertheless, its immediate implementation could provide a solid foundation to begin tackling these challenges, with improvements to be made through ongoing review. Essentially, I see this as the start of putting up some guardrails against misinformation and disinformation, but I think that much work is going to need to be done and we are going to need to be incredibly diligent to make sure that those guardrails are working effectively.
We can't talk about misinformation and disinformation without talking about something that I have been trying for four years to get all members in this place to sign up to, and that is a standard in political advertising similar to that in consumer advertising. It is absolutely unacceptable that we have yet to legislate against misleading and deceptive political advertising. I have put forward the Commonwealth Electoral Amendment (Stop the Lies) Bill 2022. I have put forward voter protections. Despite the government saying it will act on this, we have not yet seen it, and we are running out of time again, with another election coming. What I say to people is: 'The only way to protect yourself is to inoculate yourself against misinformation and misleading and deceptive content. If it doesn't sound right, if it seems a little bit like an odd claim, it probably is. Check your sources. Make sure you know where to go for fact-checked information.'
The role and authority of ACMA will require careful monitoring in this situation. Historically, ACMA has been partly funded by the media sector it oversees and it has not demonstrated robust accountability. We know that. We've seen that in too many examples.
I've had many a discussion with the minister, who has indicated that a clear statement of ministerial expectation will be made that will set the bar high for ACMA to act. The minister said there will be care taken to make sure that the expanded role meets public expectations of integrity and transparency. So while I have concerns, I commend the bill to the House as an important guardrail against misinformation and disinformation.
9:11 pm
Henry Pike (Bowman, Liberal National Party) Share this | Link to this | Hansard source
It is a late hour and there certainly aren't a lot of government members here to listen to our contributions on the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024. It's disappointing that we haven't got more government members willing to defend their own bill. It's certainly an important bill. It's an issue that I've probably had more correspondence on than any other that has been before this parliament. It's something that the government should have given a lot more thought to before they introduced it for a second time. It's something that a lot of voters across this country are very concerned about. The feedback I've heard from my constituents, and from members across the aisle, is that this has been a hot issue and one that is worthy of a good, lengthy debate from both sides.
The conventional wisdom once held that the world was flat, that smoking was healthy and that slavery was part of the natural order. People with opposing views were denounced as heretics, radicals and charlatans. Heretical views are often silenced by authority, only to later become accepted facts. History shows us you should never trust government to be the sole arbiter of truth, but I have complete faith in the Australian people to sort fact from fiction. I think that's the significant ideological difference that I would have with the previous speaker, the member for Warringah. I have faith that the Australian people are best placed to sort fact from fiction.
We live in an era of infinite information. It's all available at our fingertips: the good, the bad, the inaccurate, the absurd, the funny AI-generated images, the ones where we can't quite tell whether they're real or not. Our freedom to access and analyse this knowledge improves accountability, drives innovation and advances the betterment of society. But I bristle at the thought of a big government information directorate or big tech predigesting this information and spoonfeeding it to the public. There's no algorithm, no regulation and no team of cyber bureaucrats that can sort fact from fiction as effectively as the inquiring and sceptical mind of the average Australian. But this Labor government has a different view on the role of the state and big corporates in controlling information.
This bill seeks to create special powers to tackle online misinformation. The bill will place new censorship burdens on digital platforms and empower the government to obtain information in relation to any breaches. Fines for non-compliance could be as high as five per cent of a digital platform's global turnover. But the problems begin when you look at what would constitute misinformation under this bill. For example, the definition includes any views that might harm public confidence in Australia's banking system or financial markets. My mind goes to those who were blowing the whistle about issues to do with those very sectors of our economy in the lead-up to the banking royal commission. It also includes anything that might vilify someone based on their gender identity, and I think about the public outcry over certain participants in the Olympics earlier this year. These are debates that we should be having in this country, and I don't think we should be shying away from them.
It also applies to anything that calls into question the efficiency or the efficacy of public health measures. Deputy Speaker Buchholz, you would appreciate that there were certainly a lot of questions to be asked during the recent pandemic, particularly when I think of the Queensland government, which tried to impose a mask mandate for people driving alone within their vehicle. The public should be critical of that. There should be open debate on that, and calling into question matters like that should be part of a healthy society. Certainly, I don't think there should be any role for government in trying to curtail criticism; they should be open to the debate.
It doesn't require a lot of imagination to see how easily this broad and subjective definition could be politicised and used to silence dissent. If you recall Orwell's 1984, this feels like a modern twist on 'newspeak'—the language crafted with a limited vocabulary to crush critical thought. I trust Australians to assess the value of information rather than leaving it to faceless bureaucrats or Silicon Valley algorithms. This bill feels like a dangerous step towards Big Brother style control with significant risks of ideological overreach, and, of course, the coalition will not be supporting it.
I take the point made strongly by the member for Casey earlier about the seven days of consultation on this latest version of the bill—seven days for one of the most important pieces of legislation, with huge ramifications, not just for those working within this building but across Australia's society more broadly. The government, obviously, didn't like the massive response that they had to the first round of consultation on this and wanted to limit the days. But I think seven days is wholly inadequate when you consider the size of this bill, the size of the explanatory memorandum and the scale of public interest.
This misinformation bill is one of the most dangerous bills that has come before our parliament in recent memory. Claiming to protect Australians online, it goes way too far and is a shocking attack on free speech. Government should never be the sole arbiter of truth, yet, under this legislation, that is what the government is hoping to achieve. There is a list of topics under which misinformation can be considered serious harm under the bill. It includes public health and preventative health measures, imminent harm to the economy or financial markets, referendums and elections. I note some of the commentary that the previous speaker made around misinformation when it comes to elections, and I think about the recent Queensland election that we have just been through. I think of an example from the Labor member for Redlands, who put up a huge billboard claiming that the incoming David Crisafulli government would sell a new satellite hospital that has been built within my electorate—a total fabrication, totally made up, with no truth in it at all, but still something that was put forward. We were able to use that and counteract that message. I don't think this bill provides anything that would have been able to deal with that, but it goes to show that, for all the moral high ground that some on the other side of the chamber try to take, there was certainly a lot of misinformation in the course of that campaign.
I have a number of serious concerns with the bill, particularly the unequal treatment of content. One of the previous speakers outlined that academics, scientists, artists, and anything done with parody or satire in mind, are exempt from the bill, or at least from the definitions of 'misinformation'. But the same exemption doesn't apply to everyday Australians. The exemptions cover anything that is distributed for an artistic or scientific purpose, or things that are said for the purpose of parody or satire, but I wonder about those everyday Australians—many Australians—who are using social media to get involved politically and offer their honestly held opinions, and what this bill will mean for those individuals getting active within our democracy in that context. There is also the unequal treatment of content in relation to professional news content: if something appears in professional news content—by the definition, effectively mainstream media—it cannot be seen as misinformation. But if that same view, or a contrary view, was put outside of professional news content, it could, under this bill, be seen as misinformation. This applies to journalists. While their content would not be misinformation if it were within their professional news content, it could be if it were expressed on their personal Facebook page or on Twitter, for instance. I think that's a concerning lack of consistency within the bill.
I'm also concerned that the minister may exempt certain digital platforms. We can certainly foresee a situation where some digital platform may be more favourable to one side of politics or the other, and the fact that the minister, under this bill, may exclude any platform from the operation of the bill opens it up to abuse. This would open up an opportunity for a minister to say, 'One platform, which is favourable to my side of politics, can be excluded, while other ones will be put right through the wringer of this bill.'
It is certainly concerning that the minister, under this bill, would have the ability to personally order misinformation investigations and misinformation hearings. Some good examples were given by previous speakers, so I won't dwell on that. But, when I consider the powers that that would invest in the minister to be able to target dissenting views in relation to any number of policies, I think the ability to personally order that is open to massive abuse and is certainly something we don't want to see in Australia.
The bill would also impose huge fines on digital platforms if the government decides that they have not done enough to prevent or respond to misinformation. This is really my primary concern with this bill. The digital platforms are going to err on the side of taking down anything that they think might even come close to infringing on these provisions. As is totally understandable from their perspective, they are going to err strongly on the side of limiting the free speech of Australians, and that worries me significantly. We're talking about a scale where individuals can't be making individual calls about whether this or that is misinformation and what's accurate and what isn't; the platforms are going to be relying on algorithms to make those decisions for them, and, to stay on the safe side, they're going to have to err as much as possible on the side of caution.
The communications minister is reportedly warning that there would be devastating consequences should this legislation not be passed by the end of the year. I find this totally alarmist and desperate language. The government has had more than two years to do this already, and I think that—
Henry Pike (Bowman, Liberal National Party) Share this | Link to this | Hansard source
Absolutely. We believe in freedom of speech on this side, and we certainly want to defend rights and we want to put this through the proper process. It certainly shouldn't be rushed through this place or the other.
Let me turn to some of the comments that have been made by stakeholders. There were many thousands of submissions made to the Senate inquiry, and only a fraction of those have been uploaded for public view. We know, of course, that this second take on the bill has only been open for consultation for a week, but we have had some interesting stakeholder feedback on many aspects of this bill. The Victorian Bar has made a scathing submission, which warned the bill would undermine free speech by encouraging 'chilling self-censorship' and stifling discussions of 'sensitive or controversial views'. The Australian Human Rights Commission has warned that the proposed law 'does not strike the appropriate balance' with respect to free speech, and it said:
… it also needs to be recognised that information may be opportunistically labelled as 'misinformation' or 'disinformation' to delegitimise alternative opinions, and limit open discussion about issues of public importance.
I think that's incredibly sound advice from the Australian Human Rights Commission on this front. A submission from combined faith leaders said:
… digital providers will be assessing whether the content of a religious belief is reasonable in determining whether or not it is misinformation. This is the same as saying that providers are empowered to determine whether the teaching is reasonable in itself.
Christian Schools Australia are another group that have outlined in their submission their opposition to this bill. They've said:
… social media companies are incentivised to broadly interpret the definition of 'misinformation' and narrowly interpret content that is reasonably disseminated for a religious purpose.
Christian Schools Australia also note:
… social media companies will be able to exercise discretion about how to interpret their obligations and whether content by faith-based schools is reasonable dissemination for a religious purpose.
Disputes under the misinformation bill are ultimately subject to court rulings. This creates the disturbing scenario where a court may determine whether or not a religious belief is reasonable. As outlined by the Australian Catholic Bishops Conference, the bishops have said in relation to this bill:
It also leaves open to a judicial authority to decide what is and is not "reasonable" when it comes to expressing a religious belief, and whether the expression of a religious belief is always for a "religious purpose."
They have gone on to extensively rip apart this bill. There have been a number of other stakeholders who've made similar comments in relation to the religious aspects of this bill and the concerns about what digital providers may or may not interpret as misinformation.
Ultimately, I think this comes down to a significant ideological difference between us and the government. We have faith in the Australian people to sort fact from fiction. We believe the Australian people should be the arbiters of what is truth when it comes to any public discourse in this country. We have faith that they've got the capacity to do that. We do not support this bill. This is a massive overreach. We ask the government to reconsider this. Go back to the drawing board.
9:26 pm
Rebekha Sharkie (Mayo, Centre Alliance) Share this | Link to this | Hansard source
According to a survey on trends in Australian political opinion in 2022, 43 per cent of respondents indicated a high interest in politics in Australia. This level of interest has been consistent for the last 60 years. I mention this as it's a good baseline measure when comparing the number of emails and phone calls my office receives on a particular bill. Non-contentious bills attract low interest commensurate with the non-contentious nature of the bill. Similarly, contentious bills attract a high level of interest. The bill we have before us, the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024, created a high level of interest beyond what we would normally expect, even for contentious bills.
The response from my community is clear: they do not like what they've read about the bill, and they are concerned about the implications for their freedom to express views and for democracy itself. This, of course, reflects the broader community's view on the bill. The exposure draft received around 20,000 comments and 2,418 public submissions during the consultation period in 2023. This is well above the normal engagement on exposure drafts. Disappointingly, this bill, while amended to address some of the concerns identified in the exposure draft, was limited to a mere seven-day consultation period. For a bill that seeks to impose significant and unprecedented restrictions on our freedom of speech, one would have thought a more extensive consultation period would have been prudent.
The genesis of this bill lies in the rapidly changing nature of how Australians create, obtain, engage with and distribute news and information. Social media has opened immediate access to sources of information globally, 24 hours a day. However, the digital revolution that has enabled the dissemination of material on platforms accessible on watches, phones and personal computers has created opportunity for the spread of misinformation and disinformation, often facilitated by complex algorithms that target and corral users. Australians are rightly concerned about the spread of misinformation, and they are deeply worried about the potential misuse of social media platforms in elections. But is it possible to legislate a solution? I think that is the question we need to be debating today.
This bill seeks to add a new schedule 9 to the Broadcasting Services Act, the BSA. There are three broad elements. Firstly, it imposes core obligations on digital communications platform providers. Secondly, it empowers the Australian Communications and Media Authority, the ACMA. Thirdly, it defines serious harm as a consequence of misinformation and disinformation, albeit in very broad and subjective terms.
A fundamental tenet of democracy is the right to freedom of expression, acknowledging that we have existing and appropriate laws that put some constraints on this freedom to prevent individuals from inciting violence or discrimination. However, this bill puts an obligation on social media providers who are presented with a complaint to consider the falsity of the information and any serious harm that may arise. The provider must then either risk civil penalties, should it decide not to remove the post and ACMA subsequently determine it to be misinformation or disinformation, or take the easy path and simply remove the offending post. Inevitably, social media providers will choose the easy path and remove the post. The consequence is an immediate reduction in the freedom of expression.
There are inherent problems with any attempt to adjudicate what is true information. In the context of ideas, who is right and who is wrong? Many of our great advancements in scientific understanding were considered false before their acceptance. Charles Darwin's On the Origin of Species was considered heretical at the time, only to become the foundation of evolutionary biology theory. Do we want an environment where ideas that don't conform to an established orthodoxy are removed from discussion? An editorial by Michael Sexton in the Australian put this most succinctly. He said:
No doubt some of the statements made on social media stretch credulity but, as American jurist William O. Douglas said in the early 1950s: "When ideas compete in the market for acceptance, full and free discussion exposes the false and they gain few adherents." To similar effect, another American jurist, Oliver Wendell Holmes, said in a judgment of the US Supreme Court in 1919: "The best test of truth is the power of the thought to get itself accepted in the competition of the market." It might be thought that Australians have generally had a history of scepticism for political views and that the optimism of Douglas and Holmes would be borne out in most cases.
The obligations on social media providers and the empowering of ACMA are directed at the prevention of 'serious harm', which this bill defines as:
A reasonable person would agree that these harms are, indeed, serious. But the practical capture of these harms in the real world poses considerable problems. The Australian Human Rights Commission addressed this in their submission to the Senate Environment and Communications Legislation Committee, arguing that they are too broad, providing the following example:
… legitimate discussion of interest rates may harm any number of Australians' confidence in financial markets, especially during times of economic hardship. However, this isn't information that should be captured as causing or contributing to serious harm.
The commission made one recommendation: the bill should not be passed in its current form.
No-one wants to see the spread of false or misleading information, but I don't think people want us to entertain the censorship of ideas either. When governments become the arbiter of truth, we start down a perilous path, and that must be avoided. I most certainly will not be supporting this bill.
9:33 pm
Melissa Price (Durack, Liberal Party) Share this | Link to this | Hansard source
I rise today to speak against the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024. As many of my esteemed colleagues have identified throughout this debate, this legislation presents a serious threat to Australians' freedom of speech. At a time when Australians are living in a cost-of-living crisis, it beggars belief that the Albanese government would think that this proposed bill would be a priority. If you ever need evidence of how out of touch this government is—and of their warped sense of priorities—you need look no further than this bill before us this evening.
The bill before us provides the Australian Communications and Media Authority, known as ACMA, with powers to require digital platforms to take specific steps to reduce misinformation and disinformation. If ACMA determines that a platform is not taking adequate steps, it can impose fines equivalent to five per cent of a company's global turnover.
Under Labor's plan, something can be misinformation even if it is the honestly held opinion of an Australian. Such a statement doesn't have to be malicious or designed to deceive. This can include unintentionally misleading statements about elections, referendums, the economy or the stock market. There are also some exceptions in this legislation. Exceptions apply for academics, scientists, artists and even comedians, although I'm not entirely sure how we define what a 'comedian' is. But the views of everyday Australians—well, they are captured under the government's planned legislation, and they receive no such exemption. The practical effect of this bill will be widespread censorship. This should be obvious, as the digital platforms will want to avoid these fines, which could be, as I've said, up to five per cent of their global turnover.
Another concern of mine is that the communications minister can personally order investigations and hearings into what the government decides is misinformation. This is quite clearly open to abuse and, in my opinion, will see diverse voices censored. So, what would the minister have done with that power last year? I ask you to consider that.
As I've just mentioned, referendums are one of the topics identified in the bill whereby misinformation can constitute serious harm. Time and time again, those opposite labelled questions or criticisms about Labor's Voice to Parliament model as 'misinformation'. The brains trust behind the referendum—we know the crew: the Prime Minister, the Treasurer, the Attorney-General and the then Minister for Indigenous Australians—all came into this place and labelled any dissent to the Voice as 'misinformation'. If you don't believe me, it's in Hansard. A month before the referendum, the Treasurer said:
When you strip away all of the conspiracy theories and all the mistruths and you look through the fog of misinformation and manipulation, this opportunity is really clear and really important to our country and to all of its people.
There are still people out there who believe that the Voice referendum was rejected in every state because of misinformation. They refuse to accept that everyday Australians came to their own conclusions and voted no simply because the Voice was a bad and divisive proposal.
I believe this is a very big motivator for why combating so-called misinformation is a priority for Labor. They believe that uncensored debate is harmful, as it stands in the way of what they deem to be progress. We know that it's not because Labor are committed to the truth. Let's not forget Labor's guilty track record of spreading untruths. Who could forget their 2016 'Mediscare' campaign or their 2022 campaign scaring pensioners into thinking they would be placed on the cashless debit card? In both of those instances, Labor didn't believe the mistruths that they were spreading. If Labor are seriously worried about the harm caused by misinformation, why do they consistently run on it? We know they plan to do it again. We've already seen the memes of three-eyed fish unleashed in response to the coalition's proposed nuclear policy. It's a classic case of 'do as I say, not as I do'.
While I mentioned the Voice, it's clearly not just the Voice, because misinformation has become a left-wing catch-all phrase for everything they don't like to hear. Put simply, if you agree with Labor, well, it's all good. If you have a different opinion, then guess what? That must be misinformation! In July, the WA Labor member for Hasluck said in a speech to the House in reference to the live sheep trade debate:
Unfortunately, there has been a lot of rhetoric and a lot of misinformation, not just from those opposite, but by leaders within the farming bodies such as the National Farmers' Federation …
This is just another example of dissent being branded as 'misinformation'. See where we're going here? I'm very concerned that this is a very slippery slope, and I'm sure a lot of Australians would be shocked to know that this is even possible. Unlike the US, we don't have a constitutional provision enshrining the right to freedom of speech. The debate around a bill of rights here will no doubt continue. But I will just say that the Americans got it right in providing a free speech protection in their First Amendment. It makes sense that it would be first, as, without free speech, the other rights are in ever-present danger of being taken away.
I'm not suggesting for one minute that there isn't troubling content out there, particularly in the social media world. Personally, I think we would all be better off as a society if we spent less time online and more time in the real world—I would say less time online and more time out in the great electorate of Durack, to be more specific. However, when we're talking about troubling speech, the traditional approach has always been that the way to combat bad speech is with more speech. Labor is abandoning this approach in favour of mass censorship. So mark my words: this will lead to a further decline in trust in government and, of course, leave many of those censored feeling vindicated and only further committed to their cause. Just think of the old saying, 'If they're coming after you, you must be doing something right.'
These are issues that really should have been considered by the minister, given that this bill had a long holiday. As we know, there was a 2023 bill, which was even more extreme than the one before us this evening. In response to that version of the bill, the government received more than 20,000 submissions and other responses strongly opposing it. Groups including the Human Rights Commission, civil liberty bodies, the Australian Law Commission and religious institutions deeply criticised the draft legislation. Unfortunately, after binning the previous bill, they've now brought it back. Mr Speaker, I can tell you I have heard from many, many constituents right across Durack that they deeply oppose this bill. They share my concerns that this radical bill represents an unacceptable attack on the freedom of speech. Constituents have raised with me that they are concerned by the vague definitions of 'misinformation' and 'serious harm' and fear this could easily lead to overcensorship, which will stifle legitimate discussion and debate. I'm certain that this isn't just Liberal and National members receiving this feedback from our communities; I'm quite sure that those sitting opposite have also been receiving the same lack of appreciation for the bill before us this evening.
Given the former version of this bill received tens of thousands of submissions and the considerable feedback we, as members of parliament, are receiving, you might have thought that the government would establish a proper community consultation process. Unfortunately not. The government provided just seven days for submissions to be made to the committee looking at this bill—just seven days for a bill that is more than 70 pages long and has an explanatory memorandum that is over 140 pages long. Really? What is this government running from? Just seven days? What's it hiding? Despite such a short time for public submissions, to make it clear, the idea remains friendless. Already, we've seen the Victorian Bar, the New South Wales Solicitor-General and the Queensland Council for Civil Liberties slam this bill, to name just a few.
I'll use this opportunity to give attention to some of this criticism. New South Wales Solicitor-General Michael Sexton was quoted in the Australian saying that this bill:
… targets contestable political opinions on social media and is based on the patronising assumption that members of the community cannot make a judgment about those opinions but must be protected from the obvious inadequacies of their judgment.
The Australian Christian Lobby has said:
There is no excuse for what's proposed in this bill.
… … …
Where the government should be safeguarding the free speech of Australians, it will instead require social media to control our public discourse. From public health to politics to the economy and ideology, how this bill defines harm will determine what you are allowed to say online.
One of the areas open to censorship under this bill is public health. Professor Nick Coatsworth, former deputy chief health officer, was one of the lucky few that got an opportunity to make a submission. Professor Coatsworth's comments included the following:
The terms "misinformation" and "disinformation" have become overused in public discourse, often employed as a way to dismiss opposing viewpoints without engaging in debate. In an era where limited attention spans hinder reasoned discussion, these terms have become shortcuts to shutting down conversation
… … …
Rather than seeking to impose the truth upon the public through legislation, we must focus on equipping our communities with the tools to critically assess and judge information for themselves.
Quotes such as these speak for themselves and clearly identify that this is just another antidemocratic, nanny-state action taken by those opposite.
In wrapping up, I will not be supporting this bill. Freedom of speech is fundamental to our democratic society, and providing for widespread censorship is, quite frankly, un-Australian and dangerous. As the saying often attributed to the French writer and philosopher Voltaire goes, 'I disapprove of what you say, but I will defend to the death your right to say it.' I urge everyone across the chamber to vote against this legislation.
9:46 pm
Allegra Spender (Wentworth, Independent) Share this | Link to this | Hansard source
Misinformation and disinformation come at a cost to our social cohesion and our democracy, and there is clearly a need to address them. Earlier this year in my electorate of Wentworth a lone attacker entered Bondi Junction Westfield and indiscriminately attacked and brutally killed six people. In the wake of that attack, a student from the University of Technology Sydney was incorrectly identified as the perpetrator on platform X, which precipitated abuse, threats of violence, hate speech and racial slurs against him and against the Jewish community, of which he is a member. This August in the United Kingdom, anti-immigration riots erupted when misinformation was spread following the tragic death of three young girls at a dance school in north-west England.
There is clearly a cost to allowing harmful misinformation and disinformation to propagate and spread unchecked on social media. It can have significant and sometimes tragic consequences. But fighting misinformation and disinformation can also have real costs. We are in a time of low trust in government, low trust in media. We have only recently emerged from a pandemic which imposed restrictions on our community at levels never seen before. We are facing, too, deep divisions and a battle of narratives and facts regarding a war in the Middle East. In this context, real and perceived restrictions to freedom of expression and ideas, and restrictions on contesting ideas and facts, have the potential to undermine trust in government, trust in our institutions and trust in our society. In doing so, it perhaps makes people even more vulnerable to true misinformation and disinformation. This is the fraught context in which this bill is being debated.
This bill will empower ACMA to require and enforce industry-developed codes relating to the treatment of misinformation and disinformation on digital media platforms. ACMA will be responsible for approving codes and standards and will be able to determine if codes are suitable. Where industry codes are not sufficient, ACMA will be able to impose codes on companies. This bill provides guidance for assessing the threat of misinformation and disinformation, and misinformation content will need to meet four criteria. These criteria include, most critically, that the content can be reasonably verified as false and that it is reasonably likely to cause or contribute to serious harm. Disinformation is differentiated from misinformation by the additional condition of inauthentic behaviour, which is widely understood to mean dissemination by bots. ACMA will not be able to remove specific content, nor will it be able to take action against specific individuals for producing content. It will, however, be able to enforce civil penalties on media companies for noncompliance with the codes to reasonably prevent misinformation and disinformation.
Measures to prevent the spread of information at the source will clearly impact freedom of speech and expression, and, while I acknowledge freedom of speech is not absolute, as it stands I'm not yet convinced that this bill is the correct approach. There are substantive issues that have been raised about the bill, particularly around the potential for this bill to limit freedom of expression, and this is of great concern to me.
The first of these issues is the definition of 'verifiably false'. While this may be clear-cut in the majority of content, it ignores the nuance that what is considered true and false may vary over time and also by the interpreter of information. As a special rapporteur noted, it is difficult to classify all types of information into the binary analysis of true and false. As one of my constituents observed in one of their emails to me, experts sometimes get it wrong. Even members of my constituency who have written to me about this bill have noted, for instance, that, while they agree with many of the public health notices that came out during COVID, they did believe that it was important there was a debate about what was true and false and important for that debate to be public and allowed to flourish.
Secondly, while there are definitions of 'misinformation' and 'disinformation', there is no clear definition of the types of information that will be considered. This leaves open ambiguity or, at least, a presumption that all information posted by an individual, regardless of purpose, such as commentary or opinion, may be determined to be misinformation. This bill addresses, through exemptions, some of the most important examples, such as professional news, satire and academia, but it does not clarify the more fundamental question of content posted by ordinary Australians.
Thirdly, perhaps the most controversial is the harm threshold. For content to qualify as misinformation or disinformation it must be reasonably likely to cause harm. Some stakeholders, including the Human Rights Commission, believe this is too low a threshold for determining misinformation. However, I'm more concerned by the six discrete categories of harm that will be treated with greater scrutiny, including election interference, public health, vilification, physical harm to an individual, damage to infrastructure and harm to the economy. While some of these may not be controversial, I do have concerns—and so do many of my constituents—with the restrictions on the discussion and dissemination of matters relating to public health and the economy in particular. Misinformation was certainly a problem during the pandemic, but this part of the bill raises more concerns from my constituents, as I mentioned before, even those who actually agreed with the public health information coming from the government. Ignazio and Luke, two of my constituents, are concerned that these powers could silence genuine and valid critics of public health measures, including whistleblowers, and prevent proper scrutiny of corporations, such as banks, that have a significant impact on the economy.
The other area that I think is really important to explore is the unintended consequences of this bill. The most concerning unintended consequence of this bill is that the penalties that can be imposed on digital media companies for breaches of the code could result in an overly cautious approach being taken to publishing content. If this approach is taken, this may mean that they overly censor their work to avoid falling afoul of the regulations put forward in this bill.
I understand that the government has tried to thread the needle here in terms of effectiveness and proportionality. I note that the community of digital, legal and human rights experts is divided on the bill. For example, the Human Rights Law Centre believes that the restrictions on freedom of expression are proportionate to the intentions of this bill. That may be true, but it is not simply a question of whether these restrictions are proportionate. I also note that the Human Rights Commission itself is not a supporter of the bill. This is a very contested piece of legislation.
So what are the alternatives here? I think that we, perhaps, in this situation, may be putting the cart before the horse. Can we really say with confidence that there are no other ways of moving forward in our objective in terms of suppressing and stopping the spread of misinformation and disinformation while, at the same time, preserving trust in the system? Are there better ways that we could approach this here?
First and foremost, I think transparency is key. I know that many people have observed that, in this space, social media companies are already taking actions against misinformation and disinformation. This is correct—and that is already a threat, I think, to our democracy and our ability to exchange ideas. I welcome, particularly, the amendments put forward by the member for Goldstein that seek to create greater transparency over what is being done through the algorithms and the actions of the social media companies in this space, and particularly to give researchers access to this information from digital media companies. If we start with, frankly, that greater scrutiny on what actions are already being taken—how misinformation and disinformation are already being addressed—that will be a really important part.
In addition to this transparency, which could be the first building block of better addressing misinformation and disinformation, there are some other safeguards being employed, perhaps, in other jurisdictions that I think could be potentially useful in this space. There is scope to bring in content warnings that could indicate misinformation or identify where claims cannot be verified. Emerging literature is showing that these types of warning labels can be effective, and they certainly have been used in the past in public areas, and certainly some of the students I spoke to and consulted in preparing my response to this bill highlighted how useful some of those warning labels were—effectively, labels that say, 'Hey, check the facts on this one.' This could be legislated for media companies without discouraging the publishing of material or limiting the ability of a consumer to view the material and make up their own mind. Similarly, several jurisdictions, including the US and the UK, are exploring the use of watermarks on AI content or, at the very least, provisions on AI providers to enable detection and tracing of AI content. Again, I think these are some really positive steps. It will require some coordination, but I believe that these could be quite effective measures to deal with misinformation and disinformation without limiting the power of AI, but also without some of the potential restrictions and some of the concerns that the community will have—and do have and have certainly expressed to me—if they feel that legitimate debates are not being allowed to be prosecuted on the basis of misinformation and disinformation.
When I spoke to members of my community about this, I spoke particularly to young people: one group of high school students and one group of people between high school and university and starting work, and all of these students identified the challenges of misinformation and disinformation. However, they all came back to me and said that perhaps the strongest and best approach in this area is actually education and critical thinking—'How do we better equip people to critically assess the information that is in front of them?'—as well as having some of the warnings on content and those pieces. That was certainly what they suggested should be our first area of action.
So I come back to where I started on this bill, which is to say: I recognise that misinformation and disinformation are significant threats to our country, to our society and to our social cohesion, and I take those threats extremely seriously. However, I believe that restrictions on content—real but also perceived restrictions on content—are also a threat to our ability as a society to trust our government, our institutions and the media. I do think that poses a very significant threat to our country as well. This is a situation where we have to get the balance right and, I'm afraid, given the contested perspectives on this bill, I don't think the bill in its current form has got that balance right.
Milton Dick (Speaker) Share this | Link to this | Hansard source
The question is that the bill be now read a second time. I give the call to the honourable member for Hinkler—
Milton Dick (Speaker) Share this | Link to this | Hansard source
A point of order from the member for Hinkler.
Keith Pitt (Hinkler, National Party) Share this | Link to this | Hansard source
Given that we're very close to the 10 o'clock cut off, Mr Speaker, I wonder whether you'd consider the issues surrounding the staff, who I'm sure are due to go home, and perhaps we could recommence in the morning. I know this is a significant matter that needs to be debated in the very near future, but, as we approach 10 o'clock, I think that's probably appropriate, given that the lighting also needs to be considered at this time of night. Mr Speaker, I'm sure I can leave it in your able hands to deal with the matter.
Milton Dick (Speaker) Share this | Link to this | Hansard source
I can always rely on the member for Hinkler for wise advice. In accordance with the resolution agreed to earlier, debate is adjourned and the resumption of the debate will be made an order of the day for the next sitting. The House stands adjourned until 9 am tomorrow.
House adjourned at 22:00