House debates
Monday, 11 September 2023
Private Members' Business
Freedom of Speech
5:41 pm
Kate Thwaites (Jagajaga, Australian Labor Party)
Misinformation and disinformation are real, and they can have serious consequences for the safety and wellbeing of Australians. They can be seriously harmful. They can be designed to sow division within our communities. They can be designed to undermine our trust in each other, in our public institutions, in our organisations and even in our democracy. Misinformation and disinformation can be designed to threaten public health and safety. So it is something that I am concerned about. It is something that this government is, rightly, concerned about. And this is why we plan to tackle this threat of misinformation and disinformation with responsible measures that will make a difference.
Australians do know that there is misinformation and disinformation out there being spread by algorithms, largely through social media, and we have seen that in the context of the pandemic and other events over recent years. A recent University of Canberra study found that 66 per cent of people said they encountered misinformation on social media about COVID-19. Twenty-three per cent encountered what they called a lot of misinformation, 36 per cent encountered some misinformation and 30 per cent of people were forwarded misinformation from someone they know—false and misleading information about treatments and how to prevent exposure, information that was designed to sow discord and to mislead people.
We know that misinformation is also a problem in the area of national security, where malicious actors are using digital disinformation to infiltrate and influence public discourse. And this has been found in our select committee on foreign interference, which tabled its report on what risks Australia faces in this area. It highlighted that regimes continue to pose an unacceptable risk through targeted online misinformation campaigns that leverage social media platforms to skew public debate, undermine trust in our democratic institutions and establish narratives that favour the interests of authoritarian states. The report went on to note that the growth in technologies, including artificial intelligence, is making it easier and easier to conduct misinformation campaigns.
Social media platforms themselves have recognised that there is a threat from misinformation and disinformation. In Australia, platforms have signed up to a voluntary Australian code of practice on disinformation and misinformation. The code commits its signatories to implement safeguards that limit the spread of misinformation and disinformation on their platforms and to report annually on their commitment to this work. This is a good start, and it has been recognised as such. Industry bodies have also said that they want more from government; they do want government to regulate in this important area.
Essentially, we have a choice. We can have legislation from government—involvement from a democratically elected government in this space—or we can leave this space entirely to tech billionaires and foreign malicious actors. That is essentially what we're talking about here. Our government has identified that we need to do more. The spread and influence of misinformation and disinformation is something that we need to act on now. Australians do need to be protected from the seriously harmful content that can be spread online, and we can do this as a federal government.
The exposure draft of this bill builds on the voluntary code already in place by boosting ACMA's ability to hold digital platforms to account. ACMA will have new information gathering powers to improve transparency around what platforms are doing to combat misinformation and disinformation—content that is false, misleading or deceptive that causes or can contribute to serious harm to Australians.
ACMA would be able to register enforceable industry codes with penalties for noncompliance. ACMA would have the power to require the industry to lift the bar where systemic issues amongst platforms are identified, and the standards used could include measures like stronger tools to support users to report misinformation and disinformation, more robust complaints handling and enabling the more extensive use of fact checkers.
It is important to note that the legislation does not give ACMA the power to remove or deal with individual pieces of information. This is about transparency and systems.
We saw that the former government previously committed to empowering ACMA to take action on misinformation and disinformation, yet now they are opposing it. They have decided, it seems, that because they are in opposition they no longer need to take these very serious threats seriously. The Liberals and Nationals now appear to be unfazed by the need to protect Australians from harmful information.
Industry knows we need to take action. The Australian community is looking to us for action. It is incredibly important for our democracy, for our institutions and for the conversations we have as a country that we do address the threat presented by misinformation and disinformation, and that is what the government is attempting to do.