Senate debates
Wednesday, 24 February 2021
Bills
Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2021; In Committee
12:35 pm
Rex Patrick (SA, Independent)
by leave—I move amendments (1) to (3) on sheet 1197 revised, circulated in my name, together:
(1) Schedule 1, item 1, page 24 (after line 3), after Division 5, insert:
Division 5A—Audits and reviews of designated digital platform services
52ZCA Audits of algorithms and practices of designated digital platform services
Carrying out audits
(1) The Commission may carry out an audit of the operation of the algorithms and internal practices of a designated digital platform service in relation to:
(a) the impacts of their operation on access to, and availability of, covered news content; and
(b) whether their operation causes, or is likely to cause, differentiation of a kind referred to in Division 5 or other anti-competitive or unfair results in relation to the accessibility and availability of covered news content.
(2) Without limiting subsection (1), the audit may include the operation of the designated digital platform service in relation to:
(a) crawling, indexing, making available and distributing news businesses' covered news content; and
(b) referral traffic from the service to news businesses' covered news content; and
(c) the distribution of advertising directly associated with news businesses' covered news content made available by the service.
(3) The Commission must carry out at least one audit of each designated digital platform service during each financial year.
(4) An audit must be carried out by members of the staff of the Commission.
Requirement to produce documents, records or things and provide assistance
(5) A person (the auditor) who is carrying out an audit under this section may require the responsible digital platform corporation for the designated digital platform service:
(a) to produce any documents, records or things that the auditor is satisfied are relevant to the audit; or
(b) to provide the auditor with all reasonable facilities and assistance for the effective exercise of powers under this Division.
(6) A responsible digital platform corporation must comply with a requirement under subsection (5).
Other matters
(7) If there is more than one responsible digital platform corporation for the designated digital platform service, the obligations in this section apply to each of those responsible digital platform corporations separately.
(8) This section does not limit section 155 (which is about the general information-gathering powers of the Commission) or other powers of the Commission under this Act.
52ZCB Annual reviews and public reporting on algorithms and practices of designated digital platform services
(1) The Commission must review, and report each financial year on, the operation of the algorithms and internal practices of designated digital platform services, and their impact on the accessibility and availability of covered news content.
(2) Without limiting subsection (1), a report under that subsection must include:
(a) a summary of the audits the Commission has undertaken under section 52ZCA during the financial year, including the findings of those audits; and
(b) the Commission's assessment of whether the Commission's auditing and review activities have indicated any evidence in relation to digital platform services of anti-competitive activity or practices that unfairly limit or distort access to covered news content.
Report must not include trade secrets
(3) The Commission must not include in a report under subsection (1) information the disclosure of which would reveal a trade secret of a responsible digital platform corporation.
(4) Reports under subsection (1) must also comply with such requirements in relation to the protection of confidential information as are specified in a determination made by the Minister for the purposes of this subsection. For this purpose, information is confidential information if, and only if, the publication of the information could reasonably be expected to prejudice substantially the commercial interests of a person.
Opportunity to comment on proposed report before finalising
(5) After preparing a proposed report under subsection (1), the Commission must give to the responsible digital platform corporation for each designated digital platform service covered by the report either:
(a) a copy of the proposed report; or
(b) extracts of the parts of the proposed report that relate to the responsible digital platform corporation.
(6) If the recipient of the proposed report, or extracts from the proposed report, gives written comments to the Commission within 28 days after receiving the proposed report or extract, the Commission must:
(a) consider those comments before preparing the final report; and
(b) include in the final report the written comments received.
Timing and publication of final report
(7) The Commission must:
(a) give a report under subsection (1) to the Minister as soon as practicable and no later than 6 months after the end of the financial year concerned; and
(b) publish the report on its website.
(8) The Minister must cause a copy of a report under subsection (1) to be laid before each House of the Parliament within 15 sitting days of that House after receiving the report.
(2) Schedule 1, item 8, page 52 (line 19), after "52ZC,", insert "52ZCA,".
(3) Schedule 1, item 10, page 53 (after line 12), after paragraph (4A)(d), insert:
(da) section 52ZCA;
It remains to be seen whether this legislation will remedy the imbalance of market power between the big digital platforms that dominate our information landscape and the media organisations that provide us with the news and reporting that are a vital underpinning of our democracy. In some senses I think the government's amendments this morning have introduced greater uncertainty. It's clear that the bill addresses only a small part of a range of measures that are needed to improve media diversity in Australia and to further support public interest journalism. There is also a need to address other major issues, including the international tax practices of the digital giants that operate across the globe.
These amendments deal with yet another aspect of the situation—the opaque nature of the algorithms and business practices of big tech. A fairly new area of engineering, which I know might be foreign to those present in the chamber, is one where independent auditors examine algorithms—whether algorithms used in commercial applications or algorithms used by government—to make sure that those algorithms perform the functions they ought to perform and do not introduce unintended consequences into what are automated processes. Not much needs to be said, for example, about robodebt and the way in which that algorithm did not perform properly. Had there been algorithm auditing, the government may have been in a substantially better financial position than it is in now, after having to make payouts to the people whom, it turns out, robodebt had robbed of money.
We've also seen in recent events that the digital platforms are operating with impunity under a shroud of near-total secrecy. Their anticompetitive practices and how their algorithms operate are shrouded from any oversight or accountability. This secrecy has deep implications for our democracy. These platforms have unquestionably amplified the spread of misinformation and falsehood aimed at undermining public confidence in, for example, the COVID-19 health response. They have undermined trust in democratic institutions and supercharged political polarisation. Even during debate on this bill, Google's public campaigning involved actively limiting the information that 250,000 Australians could access, through their so-called experiments. Similarly, Facebook's arbitrary decision to ban news sites extended to government, political and community organisation pages, amply demonstrating the outsized impact that the company's algorithms have on the way we do business here. This has implications far beyond those of the news bargaining code, and in order to deal with them we do need to have a level of scrutiny.
The ACCC actually has the power to go and look at the activities and, indeed, the algorithms of companies—that was confirmed by Mr Sims at the committee inquiry—and the intent of this amendment is to make sure that the ACCC does so. Whilst they have the power now, there is no requirement for them to do algorithm audits. There is no requirement to go in and have a look at what Google and Facebook might be doing, what their algorithms are actually doing, and make sure there is no anticompetitive behaviour being implemented through those algorithms, either intentionally—and one would hope that doesn't occur—or even unintentionally. That would be consistent with the role of the ACCC.
The problem we have here, and that this parliament will need to come to grips with moving forward—it is a relatively new field in engineering—is how to audit algorithms independently to make sure they are doing what they are promoted as doing. It's something we will have to take notice of and deal with in the future, not necessarily just for this particular bill. We need to open our minds to the fact that auditing is going to be required right across society wherever algorithms are employed. It's no different to the auditing of books. In order to establish confidence in our corporations, we require them to be audited. That gives confidence to the public and to shareholders. This is no different. It's simply saying: let's audit the algorithms.
It's very clear in my amendment that if the ACCC were to go and do an audit they would not be allowed to reveal trade secrets. They'd simply examine the code, make sure there was nothing in there that was untoward and report back to the public that everything was in good order. If everything were not in good order, they could act in relation to that using their existing regulatory powers. So the amendment doesn't seek to in any way compromise Google and Facebook in their operations. It doesn't seek to require anything to be revealed about how they do their business. It seeks simply that the ACCC regularly audit these algorithms and deal with anything untoward within them.