Senate debates

Tuesday, 1 August 2023

Committees

Foreign Interference through Social Media — Select Committee; Report

4:46 pm

James Paterson (Victoria, Liberal Party, Shadow Minister for Cyber Security)

I present the final report of the Select Committee on Foreign Interference through Social Media, together with the accompanying documents. I move:

That the Senate take note of the report.

I am pleased to present the report from the Senate Select Committee on Foreign Interference through Social Media examining the risk posed to Australia's democracy and values by foreign interference through social media, including via the spread of disinformation. At the outset, I want to thank my fellow committee members, in particular the deputy chair, Senator Walsh, for their constructive and bipartisan collaboration on what is one of Australia's most pressing security challenges. I'd also like to acknowledge Senator McAllister and our late colleague Senator Molan, who first led the inquiry into foreign interference through social media in the 46th Parliament.

Foreign interference is now Australia's principal national security concern. It is pervasive, insidious and subtle and has the potential to undermine our values, freedoms and way of life. Australia has led the world in combating foreign interference and has moved quickly to counter the threat through reforms, including the introduction of the Foreign Influence Transparency Scheme in 2018 and the Foreign Arrangements Scheme in 2020, and the decision to exclude Huawei from our 5G network in 2018. However, foreign interference tactics have continued to evolve along with technology, and the threat has only increased in the worsening strategic environment we find ourselves in.

Authoritarian regimes continue to threaten democratic societies through targeted disinformation campaigns that use social media to advance their strategic interests. Perpetrating states use these platforms to skew public debate, undermine trust in public institutions and peddle false narratives. The proliferation of emerging technologies like artificial intelligence has made their jobs easier, dramatically increasing the scale and reach of foreign interference campaigns. This calls for urgent action to ensure Australia stays ahead of the threat.

Social media platforms are the new town square in liberal democracies. It was estimated in February last year that some 21.45 million Australians are active social media users, and more than half of all Australians use social media as a news source. Social media is a place where news is reported, contentious issues are debated, consensus is formed and public policy decisions are shaped. The health of these forums directly affects the health of our nation. Foreign authoritarian states know this. They do not permit free debate on their own social media platforms. They use ours as vectors for information operations to shape our decision-making in their national interests at the expense of our own. As ASIO assessed, social media itself is not the threat, but it serves as a vector for foreign interference.

Not all social media platforms are the same. The ability for a social media platform to be weaponised varies according to the laws of the country where it is headquartered.

The committee was particularly concerned by the unique national security risk posed by companies like TikTok and WeChat, whose parent companies, ByteDance and Tencent respectively, are headquartered in China. China's 2017 national intelligence law means the Chinese government can compel these companies to secretly cooperate with Chinese intelligence agencies. The committee heard that TikTok's China-based employees can and have accessed Australian user data and could even manipulate the algorithms dictating what Australians see on the platform. But TikTok cannot tell us how often Australian data is accessed, despite suggesting that this information was logged. Nor was TikTok able to provide a legal basis on which employees could refuse to comply with Chinese law. The short answer is that they can't.

Throughout the inquiry companies headquartered in authoritarian countries were consistently reluctant to cooperate with Australian parliamentary processes. TikTok were hesitant to provide witnesses sought by the committee and were evasive in their answers when they finally did agree to appear. WeChat showed contempt for the parliament by refusing to appear at all and through disingenuous answers it provided to questions in writing. The representations made by the Chinese embassy to the Department of Foreign Affairs and Trade about our inquiry on WeChat's behalf only served to prove the point about the close relationship these platforms have with the Chinese government.

This stood in contrast to the more constructive engagement the committee had with platforms based in Western countries, which at least recognised the fundamental importance of the checks and balances inherent in democratic systems, despite the impost this can create. These companies are facing novel challenges in combating foreign interference as authoritarian regimes continue to pump disinformation onto their platforms. Between 2017 and 2022 Facebook's parent company, Meta, disabled more than 200 covert influence operations originating from more than 60 countries that targeted domestic debate in another country. In the first quarter of 2023 YouTube terminated more than 900 channels linked to Russia and more than 18,000 linked to China.

In the case of both authoritarian and Western-based platforms, the committee explored concerns that platforms are being used to both pull and push information: to gather intelligence on individuals that will enable them to be targeted; to gather behavioural data by population or cohort; to refine interference campaigns; to harass and intimidate Australia's diaspora communities; and to undermine societal trust, spread disunity and influence decision-making. Countering this has become more complex as authoritarian regimes evolve their methods. Artificial intelligence and the commercialisation of disinformation services, where state actors engage companies to orchestrate disinformation campaigns, threaten to exponentially increase the scale and reach of foreign interference through social media.

It is crucial that Australia develops a real-time capability to counter this malign activity. For two reasons, this approach should be underpinned by a guiding principle of transparency rather than censorship. The first is to expose disinformation activity, which thrives off secrecy. The second is to empower Australians so that they can evaluate both the content they see on platforms and the conduct of the platforms themselves. For example, state-affiliated media entities should be proactively labelled on all platforms; any content censored at the direction of a government should be disclosed to users; platforms should be open to independent external researchers who can investigate and attribute coordinated inauthentic behaviour; and access to user data, especially by employees based in authoritarian countries, must be disclosed.

WeChat comprehensively failed the transparency test by refusing to participate in public hearings on the basis that, despite its significant digital presence, it does not have a legal presence in Australia. If social media companies want to operate in Australia, they should be required to establish a presence within Australia's legal jurisdiction so they can be held accountable more effectively under our laws. The committee found that TikTok engaged in a determined effort to avoid answering basic questions about its platform, its parent company, ByteDance, and its relationship with the Chinese Communist Party. We recommend that companies that repeatedly fail to meet the minimum transparency requirements be subject to fines and, as a last resort, may be banned by the Minister for Home Affairs with appropriate oversight mechanisms in place. Should the US government force ByteDance to divest ownership of TikTok to another company that is not beholden to the Chinese Communist Party, the Australian government should consider similar action.

The April 2023 TikTok ban on government-issued devices, imposed due to serious espionage and data security risks, should also apply to government contractors and entities designated as systems of national significance. We must move beyond the whack-a-mole approach and assess and mitigate the next TikTok before it is widely deployed on government devices.

Amended Magnitsky-style sanctions, greater enforcement of espionage and foreign interference offences, and support for diaspora communities targeted by transnational repression all need to be part of a package of reforms to make Australia a harder target.

Despite Australia's world-leading efforts to counter foreign interference, evolutions in technology and the threat environment demonstrate that there is more work to be done to protect Australia from the sophisticated disinformation campaigns of authoritarian regimes. With a concerted effort by government, the private sector and civil society, we can ensure that Australia's way of life prevails and preserve the extensive benefits that social media platforms provide, while managing the accompanying risks. I commend the report to the Senate.

Question agreed to.
