House debates

Wednesday, 27 November 2024

Bills

Online Safety Amendment (Social Media Minimum Age) Bill 2024; Consideration in Detail

9:39 am

Zoe Daniel (Goldstein, Independent)

by leave—I move amendments (1) to (7), as circulated in my name, together:

(1) Schedule 1, heading to Part 1, page 3 (line 2), at the end of the heading, add "relating to social media minimum age".

(2) Schedule 1, item 1, page 3 (lines 6 and 7), after "A provider", insert "(other than an exempt provider)".

(3) Schedule 1, page 3 (after line 18), after item 3, insert:

3A Section 5

Insert:

exempt provider has the meaning given by section 63CA.

(4) Schedule 1, item 7, page 4 (line 18), after "Providers", insert "(other than exempt providers)".

(5) Schedule 1, item 7, page 6 (after line 22), after section 63C, insert:

63CA Exempt providers

(1) For the purposes of this Act, a provider of an age-restricted social media platform is an exempt provider at a particular time if, at that time:

(a) the provider has complied with:

(i) its risk assessment obligations under section 63CB; and

(ii) its risk mitigation obligations under section 63CC;

in relation to the age-restricted social media platform; and

(b) a determination under subsection (2) is not in force in relation to the provider and the age-restricted social media platform.

(2) If the Commissioner believes on reasonable grounds that a provider has not complied with its obligations mentioned in paragraph (1)(a) in relation to an age-restricted social media platform, the Commissioner may make a determination, in writing, that the provider is not an exempt provider in relation to the age-restricted social media platform.

63CB Risk assessment obligations of providers

(1) A provider of an age-restricted social media platform must undertake an assessment (a risk assessment) that identifies and assesses the risks associated with providing the service to age-restricted users.

(2) Without limiting subsection (1), the provider must have regard to the following matters in undertaking the risk assessment:

(a) the dissemination of illegal and harmful materials to children;

(b) the dissemination of online scams to children;

(c) negative effects on children's best interests;

(d) serious negative consequences to children, including their physical and mental wellbeing;

(e) the matters (if any) specified in the legislative rules.

(3) The provider must exercise due diligence in undertaking the risk assessment.

(4) The provider must, as soon as practicable after the end of each financial year:

(a) give the Commissioner a copy of its risk assessment as prepared under subsection (1); and

(b) publish that risk assessment on a publicly available website.

(5) A risk assessment given or published under subsection (4) must be accompanied by a report including the following:

(a) details of the risks identified;

(b) indications of the severity of the risks;

(c) measures of the scale of the risks in Australia;

(d) a risk mitigation plan about managing and mitigating the risks in accordance with subsection 63CC(2).

63CC Risk mitigation obligations of providers

(1) A provider of an age-restricted social media platform must have policies, procedures and systems to monitor, manage and mitigate risks associated with providing the service to age-restricted users.

(2) A provider of an age-restricted social media platform must prepare a risk mitigation plan in relation to risks identified in a risk assessment prepared by the provider under section 63CB.

(3) The plan must identify measures to manage and mitigate those risks.

(4) Without limiting subsection (3), such measures may include any one or more of the following:

(a) changing advertising systems, including the way advertisements are targeted at or presented to children;

(b) improving internal business processes to maximise safety;

(c) taking targeted measures to improve child safety, such as age assurance or parental control tools;

(d) taking into account the best interests of children when making decisions.

(5) The provider must take reasonable steps to implement those measures.

(6) Schedule 1, item 7, page 6 (line 26), after "A provider", insert "(other than an exempt provider)".

(7) Schedule 1, page 12 (after line 4), after Part 2, insert:

Part 2A — Amendment of the Online Safety Act 2021 relating to data access schemes

Online Safety Act 2021

17A Section 5

Insert:

data access scheme means a scheme prescribed by rules made for the purposes of section 160B.

17B After Part 9

Insert:

Part 9A — Data access schemes

160A Simplified outline of this Part

The Commissioner may make rules to provide for one or more data access schemes. The rules may require a provider of a social media service, relevant electronic service or designated internet service to give access to data to approved independent researchers in relation to the safety of end-users of the service.

160B Commissioner may make rules for data access schemes

(1) The Commissioner may, by legislative instrument, determine rules providing for one or more data access schemes that require a provider of a social media service, relevant electronic service or designated internet service to give access to data for the purposes of independent research about the identification, assessment and mitigation of risks in relation to the safety of end-users of the service.

(2) Without limiting subsection (1), rules made for the purposes of that subsection may:

(a) provide for different means by which approved independent researchers are given access to data under the rules; and

(b) provide for procedures relating to requests for access to data by approved independent researchers under the rules including, but not limited to, time limits within which providers must deal with requests for such access; and

(c) make provision in relation to fees that may be charged by providers before access to data is given to an approved independent researcher, under the rules, which must not exceed the reasonable costs arising from providing the access; and

(d) provide for the giving of information to the Commissioner by providers in relation to the following:

(i) requests for access to data under the rules;

(ii) actions taken in relation to such requests; and

(e) provide for approved independent researchers to give information to the Commissioner on a voluntary basis, being information that the approved independent researcher obtained under the rules.

(3) If rules are made for the purposes of subsection (1), the rules must provide for criteria to be satisfied before the Commissioner approves a researcher to be given access to data under those rules, in relation to one or more of the following:

(a) the research project in respect of which the researcher is to be given access to data;

(b) the qualifications and experience of the researcher;

(c) how conflicts of interest will be dealt with;

(d) the organisation (if any) on behalf of which the researcher will conduct the research project concerned;

(e) the proposed sources of funding for the research project concerned;

(f) the technical and organisational capacity to fulfil data security and confidentiality requirements of:

(i) the researcher; and

(ii) the organisation (if any) on behalf of which the researcher will conduct the research project concerned.

(4) Rules made for the purposes of subsection (3) must also provide that a researcher must not be approved to be given access to data under those rules unless:

(a) the researcher certifies that data to which access is given under the rules will be used by the researcher only in conducting the research project concerned, and that the results of the research project will be made widely available; and

(b) the researcher certifies that the researcher will not use the results of the research project concerned, or data to which access is given under the rules, for the purpose of obtaining commercial benefit or advantage.

(5) Rules made for the purposes of subsection (3) must also provide that a researcher must not be approved to be given access to data under those rules unless the researcher is willing to enter into a standard form confidentiality agreement (see subsection (6)) with each provider that will give the researcher access to data.

(6) Rules made for the purposes of subsection (3) must also prescribe a standard form confidentiality agreement, which:

(a) must be directed towards protecting the security of services and data; and

(b) without limiting paragraph (a), must specify that, if data to which access is provided includes information that is:

(i) protected information; or

(ii) personal information (within the meaning of the Privacy Act 1988); or

(iii) any information the disclosure of which the provider reasonably considers might cause a significant security vulnerability for the service or provider;

the information will not be further disclosed except:

(iv) in the case of information other than personal information (within the meaning of the Privacy Act 1988)—with the consent of the provider that provided access to the data; or

(v) in any case—otherwise as required or permitted by law.

160C Compliance with rules regarding data access

A provider must not contravene rules made for the purposes of section 160B.

Civil penalty: 500 penalty units.

160D Remedial directions — contravention of rules regarding data access

(1) This section applies if the Commissioner is satisfied that a provider has contravened, or is contravening, rules made for the purposes of section 160B.

(2) The Commissioner may give the provider a written direction requiring the provider to take specified action directed towards ensuring that the provider does not contravene rules made for the purposes of section 160B, or is unlikely to contravene those rules, in the future.

(3) A provider must not contravene a direction under subsection (2).

Civil penalty: 500 penalty units.

17C At the end of subsection 163(1) (before the note)

Add:

; (q) section 160C;

(r) section 160D.

17D At the end of subsection 164(1)

Add:

; (r) section 160C;

(s) section 160D.

17E After paragraph 165(1)(r)

Insert:

; (ra) section 160C;

(rb) section 160D.

I thank the minister's office for the deep engagement on this bill. I spoke extensively yesterday on my reservations about it, which I won't go into in detail again now, other than to say that what I've attempted to do with these amendments is to create an exemption framework so that social media platforms can bridge the gap between being required to ban young people from their platforms and actually doing something about the way they run those platforms.

I have a really strong position that it's far too simplistic and it's a bandaid measure to simply lock young people out of the platforms, and that where we need to get to—and I'm not entirely convinced that the minister disagrees with me on this, because I know that the minister has spoken about a duty of care—is getting the platforms to take responsibility for what is happening. I strongly believe that simply locking kids out is not going to achieve that. It's not going to make the platforms safer. It's not going to make them more transparent. It's not going to make them manage their algorithms. It's not going to make them either identify or manage risk.

One way of achieving that within this bill, to some degree, would be to create provisions within this amendment to the Online Safety Act which would incentivise platforms to actually do something about those things. So, in effect, what these amendments do is progressively make the platforms safer over time and allow the minister or regulators to home in on specific risks. It would mean that the platforms would have to assess their services for various harms—things like the dissemination of scams, and negative effects on children's best interests, including their physical and mental wellbeing. It would mean that they'd be required to prepare a mitigation plan which would demonstrate to the government and to the public exactly how they'd reprogram or alter their algorithms and their systems. It would mean that they would have to undertake risk assessments and mitigation measures together—and this has been demonstrated, internationally, to be an effective tool to increase platform safety, and I don't think there's any reason we can't begin implementing it here. The point is that, if platforms want to actually move forward and be exempt from the ban, these amendments give them the opportunity to step into the idea of safety-by-design, which I think is really where we do need to get to.

I know that the minister has flagged a duty of care down the track. I tabled a private member's bill—in effect providing a framework for that—on Monday, but I think a very productive step towards that right now would be to insert these amendments into this bill, in order to start moving us towards a systemic approach to online safety rather than this unfortunately simplistic measure of age bans. So I commend the amendments to the House.

9:42 am

Michelle Rowland (Greenway, Australian Labor Party, Minister for Communications)

I thank the member. These issues have been well ventilated, and I acknowledge the sentiments contained in her amendments. The Senate is expecting this legislation today, so we need to pass this bill to allow that debate to commence. I thank all members for their contributions. While the government will not be supporting amendments in the House, I call on all members to support the passage of this bill.

9:43 am

Ms Catherine King (Ballarat, Australian Labor Party, Minister for Infrastructure, Transport, Regional Development and Local Government)

I move:

That the question be now put.

Ian Goodenough (Moore, Liberal Party)

The question is that the question be now put.

9:55 am

Sharon Claydon (Newcastle, Australian Labor Party)

The question is that the amendments be agreed to.

9:56 am

Allegra Spender (Wentworth, Independent)

I move:

(1) Schedule 1, item 16, page 11 (after line 21), at the end of section 239B, add:

(4) If the report of the review sets out one or more recommendations to the Commonwealth Government, the Minister must, within 6 months after receiving the report:

(a) cause to be prepared a statement setting out:

(i) the Commonwealth Government's response to each of the recommendations; and

(ii) if the Commonwealth Government has not accepted a recommendation—the reasons for not accepting the recommendation; and

(b) cause copies of the statement to be tabled in each House of the Parliament.

The evidence is clear that social media platforms are not safe for our kids, and the community wants action. While I have great concerns about the process around the Online Safety Amendment (Social Media Minimum Age) Bill 2024, I will be supporting this bill. In Wentworth, more than nine in 10 people who responded to my community survey said they were concerned about the impact of social media on young people, highlighting issues like online bullying, screen addiction and exposure to harmful content. Parents in my community want their kids growing up playing outside with their friends, building real relationships with real people, rather than being stuck inside staring at a screen. As a parent myself, that is what I want for my children too. But, even if it is done right, imposing a minimum age of 16 for social media is a blunt instrument and an unprecedented intervention into the lives of our young people.

For such a profound piece of legislation, the haste with which the government has rushed this through the parliament, the lack of engagement with certain affected stakeholders and the unanswered questions that remain for many parliamentarians are concerning, including what happened in the House just now, where the minister did not even deign to reply to the issues raised in the consideration-in-detail amendments by the member for Goldstein. This is absolutely riding roughshod over parliament in a completely unacceptable and unaccountable way.

I want this legislation to work, but the jury is out on whether it will actually have the impact that people want it to and whether it can practically be put in place. That is why I believe it's absolutely critical that, after the ban has been implemented, we have a robust process in place to assess its effectiveness and adjust our approach accordingly. We need to be guided by the evidence, not by obstinacy. Clause 239B of the bill requires the government to commission an independent review of the social media minimum age ban two years after it comes into effect. The independent review is welcome and important. It gives the government of the day the chance to review whether the plan is working and to adapt its approach if not. It will have the benefit of evidence from the government's age-assurance trial, the duty of care for social media companies hopefully having been legislated, and two years' worth of data on the effectiveness of the ban. However, as the review provision is currently drafted, my understanding is that there is no requirement for the government to respond to the review's recommendations. This creates the potential for a scenario like we have with gambling ads, where a detailed inquiry has made clear recommendations to the government and yet we are still waiting for a proper response.

My amendment would ensure that this does not happen by requiring the government to formally respond to any recommendations made by the independent review within six months of receiving it. It is a modest and commonsense amendment that reflects similar requirements in places like New South Wales, where the government of the day is required to provide a formal response to certain inquiries within six months. I understand that the government does not intend to support any amendments in the House, but I would request that the minister commit today to providing a formal response to the independent review as and when it is presented in a few years' time.

10:00 am

Ms Catherine King (Ballarat, Australian Labor Party, Minister for Infrastructure, Transport, Regional Development and Local Government)

I move:

That the question be now put.

Sharon Claydon (Newcastle, Australian Labor Party)

The question is that the question be now put.

A division having been called and the bells being rung—

Adam Bandt (Melbourne, Australian Greens)

Deputy Speaker, on a point of order, my understanding was that you called 'ring the bells for one minute'. It's well past one minute.

Sharon Claydon (Newcastle, Australian Labor Party)

I corrected myself and said four minutes. The question is that the question be now put.

10:09 am

Sharon Claydon (Newcastle, Australian Labor Party)

The question now is that the amendment be agreed to.

10:13 am

Rebekha Sharkie (Mayo, Centre Alliance)

by leave—I move amendments (1) to (5) as circulated in my name:

(1) Schedule 1, item 7, page 4 (after line 21), after the paragraph beginning "Providers of certain kinds" in section 63A, insert:

(2) Schedule 1, item 7, page 6 (line 23), omit "penalty", substitute "penalties".

(3) Schedule 1, item 7, page 7 (after line 13), after section 63E, insert:

63EA Civil penalty for requiring identification documents

(1) A provider of an age-restricted social media platform must not require an individual to produce an identification document for the purposes of verifying the age of the individual.

Civil penalty: 500 penalty units.

(2) In this section:

identification document means a document or other thing that:

(a) contains identification information; and

(b) can be used to identify an individual; and

(c) is issued by or on behalf of a government authority.

(4) Schedule 1, item 7, page 10 (after line 2), after paragraph 63J(a), insert:

(aa) has contravened section 63EA (requiring identification documents); or

(5) Schedule 1, item 13, page 11 (after line 3), after paragraph (da), insert:

(daa) section 63EA;

I will not hold the House for long. The amendments seek to address age verification, and this bill explicitly defers the commencement of age verification to provide the industry and the eSafety Commissioner with sufficient time to develop and implement appropriate systems. My amendment would prevent social media platforms and other tech giants from requiring personal ID documentation for the purposes of age verification. Personal data is highly valued by corporations and particularly by social media companies, and so I think it's incredibly important that we do not provide an environment that greenlights the harvesting of personal data under the guise of protecting under-16s. I therefore call on the House to support these amendments to provide some safety around this bill for the personal data of millions of Australians with social media accounts.

Sharon Claydon (Newcastle, Australian Labor Party)

The question is that the amendments be agreed to. All those of that opinion—do you wish to have the call, Assistant Minister?

10:15 am

Kate Thwaites (Jagajaga, Australian Labor Party, Assistant Minister for Women)

I move:

That the question be now put.

Sharon Claydon (Newcastle, Australian Labor Party)

The question is that the question be now put.

A division having been called and the bells being rung—

What is the member for Warringah after?

Zali Steggall (Warringah, Independent)

I just want a clarification. The member for Mayo moved her amendments. I understood the Deputy Speaker to have put the question on the amendments. The assistant minister—

Sharon Claydon (Newcastle, Australian Labor Party)

No, I stated the question; I didn't put the question, and the assistant minister has moved that the question be put now. If it helps the House, the question before us is that the question be put.

10:22 am

Sharon Claydon (Newcastle, Australian Labor Party)

The question now is that the amendments be agreed to.

10:33 am

Milton Dick (Speaker)

The question is that the bill be agreed to.