House debates
Wednesday, 27 November 2024
Bills
Online Safety Amendment (Social Media Minimum Age) Bill 2024; Consideration in Detail
9:39 am
Zoe Daniel (Goldstein, Independent)
by leave—I move amendments (1) to (7), as circulated in my name, together:
(1) Schedule 1, heading to Part 1, page 3 (line 2), at the end of the heading, add "relating to social media minimum age".
(2) Schedule 1, item 1, page 3 (lines 6 and 7), after "A provider", insert "(other than an exempt provider)".
(3) Schedule 1, page 3 (after line 18), after item 3, insert:
3A Section 5
Insert:
exempt provider has the meaning given by section 63CA.
(4) Schedule 1, item 7, page 4 (line 18), after "Providers", insert "(other than exempt providers)".
(5) Schedule 1, item 7, page 6 (after line 22), after section 63C, insert:
63CA Exempt providers
(1) For the purposes of this Act, a provider of an age-restricted social media platform is an exempt provider at a particular time if, at that time:
(a) the provider has complied with:
(i) its risk assessment obligations under section 63CB; and
(ii) its risk mitigation obligations under section 63CC;
in relation to the age-restricted social media platform; and
(b) a determination under subsection (2) is not in force in relation to the provider and the age-restricted social media platform.
(2) If the Commissioner believes on reasonable grounds that a provider has not complied with its obligations mentioned in paragraph (1)(a) in relation to an age-restricted social media platform, the Commissioner may make a determination, in writing, that the provider is not an exempt provider in relation to the age-restricted social media platform.
63CB Risk assessment obligations of providers
(1) A provider of an age-restricted social media platform must undertake an assessment (a risk assessment) that identifies and assesses the risks associated with providing the service to age-restricted users.
(2) Without limiting subsection (1), the provider must have regard to the following matters in undertaking the risk assessment:
(a) the dissemination of illegal and harmful materials to children;
(b) the dissemination of online scams to children;
(c) negative effects on children's best interests;
(d) serious negative consequences to children, including their physical and mental wellbeing;
(e) the matters (if any) specified in the legislative rules.
(3) The provider must exercise due diligence in undertaking the risk assessment.
(4) The provider must, as soon as practicable after the end of each financial year:
(a) give the Commissioner a copy of its risk assessment as prepared under subsection (1); and
(b) publish that risk assessment on a publicly available website.
(5) A risk assessment given or published under subsection (4) must be accompanied by a report including the following:
(a) details of the risks identified;
(b) indications of the severity of the risks;
(c) measures of the scale of the risks in Australia;
(d) a risk mitigation plan about managing and mitigating the risks in accordance with subsection 63CC(2).
63CC Risk mitigation obligations of providers
(1) A provider of an age-restricted social media platform must have policies, procedures and systems to monitor, manage and mitigate risks associated with providing the service to age-restricted users.
(2) A provider of an age-restricted social media platform must prepare a risk mitigation plan in relation to risks identified in a risk assessment prepared by the provider under section 63CB.
(3) The plan must identify measures to manage and mitigate those risks.
(4) Without limiting subsection (3), such measures may include any one or more of the following:
(a) changing advertising systems, including the way advertisements are targeted at or presented to children;
(b) improving internal business processes to maximise safety;
(c) taking targeted measures to improve child safety, such as age assurance or parental control tools;
(d) taking into account the best interests of children when making decisions.
(5) The provider must take reasonable steps to implement those measures.
(6) Schedule 1, item 7, page 6 (line 26), after "A provider", insert "(other than an exempt provider)".
(7) Schedule 1, page 12 (after line 4), after Part 2, insert:
Part 2A — Amendment of the Online Safety Act 2021 relating to data access schemes
Online Safety Act 2021
17A Section 5
Insert:
data access scheme means a scheme prescribed by rules made for the purposes of section 160B.
17B After Part 9
Insert:
Part 9A — Data access schemes
160A Simplified outline of this Part
The Commissioner may make rules to provide for one or more data access schemes. The rules may require a provider of a social media service, relevant electronic service or designated internet service to give access to data to approved independent researchers in relation to the safety of end-users of the service.
160B Commissioner may make rules for data access schemes
(1) The Commissioner may, by legislative instrument, determine rules providing for one or more data access schemes that require a provider of a social media service, relevant electronic service or designated internet service to give access to data for the purposes of independent research about the identification, assessment and mitigation of risks in relation to the safety of end-users of the service.
(2) Without limiting subsection (1), rules made for the purposes of that subsection may:
(a) provide for different means by which approved independent researchers are given access to data under the rules; and
(b) provide for procedures relating to requests for access to data by approved independent researchers under the rules including, but not limited to, time limits within which providers must deal with requests for such access; and
(c) make provision in relation to fees that may be charged by providers before access to data is given to an approved independent researcher, under the rules, which must not exceed the reasonable costs arising from providing the access; and
(d) provide for the giving of information to the Commissioner by providers in relation to the following:
(i) requests for access to data under the rules;
(ii) actions taken in relation to such requests; and
(e) provide for approved independent researchers to give information to the Commissioner on a voluntary basis, being information that the approved independent researcher obtained under the rules.
(3) If rules are made for the purposes of subsection (1), the rules must provide for criteria to be satisfied before the Commissioner approves a researcher to be given access to data under those rules, in relation to one or more of the following:
(a) the research project in respect of which the researcher is to be given access to data;
(b) the qualifications and experience of the researcher;
(c) how conflicts of interest will be dealt with;
(d) the organisation (if any) on behalf of which the researcher will conduct the research project concerned;
(e) the proposed sources of funding for the research project concerned;
(f) the technical and organisational capacity to fulfil data security and confidentiality requirements of:
(i) the researcher; and
(ii) the organisation (if any) on behalf of which the researcher will conduct the research project concerned.
(4) Rules made for the purposes of subsection (3) must also provide that a researcher must not be approved to be given access to data under those rules unless:
(a) the researcher certifies that data to which access is given under the rules will be used by the researcher only in conducting the research project concerned, and that the results of the research project will be made widely available; and
(b) the researcher certifies that the researcher will not use the results of the research project concerned, or data to which access is given under the rules, for the purpose of obtaining commercial benefit or advantage.
(5) Rules made for the purposes of subsection (3) must also provide that a researcher must not be approved to be given access to data under those rules unless the researcher is willing to enter into a standard form confidentiality agreement (see subsection (6)) with each provider that will give the researcher access to data.
(6) Rules made for the purposes of subsection (3) must also prescribe a standard form confidentiality agreement, which:
(a) must be directed towards protecting the security of services and data; and
(b) without limiting paragraph (a), must specify that, if data to which access is provided includes information that is:
(i) protected information; or
(ii) personal information (within the meaning of the Privacy Act 1988); or
(iii) any information the disclosure of which the provider reasonably considers might cause a significant security vulnerability for the service or provider;
the information will not be further disclosed except:
(iv) in the case of information other than personal information (within the meaning of the Privacy Act 1988)—with the consent of the provider that provided access to the data; or
(v) in any case—otherwise as required or permitted by law.
160C Compliance with rules regarding data access
A provider must not contravene rules made for the purposes of section 160B.
Civil penalty: 500 penalty units.
160D Remedial directions — contravention of rules regarding data access
(1) This section applies if the Commissioner is satisfied that a provider has contravened, or is contravening, rules made for the purposes of section 160B.
(2) The Commissioner may give the provider a written direction requiring the provider to take specified action directed towards ensuring that the provider does not contravene rules made for the purposes of section 160B, or is unlikely to contravene those rules, in the future.
(3) A provider must not contravene a direction under subsection (2).
Civil penalty: 500 penalty units.
17C At the end of subsection 163(1) (before the note)
Add:
; (q) section 160C;
(r) section 160D.
17D At the end of subsection 164(1)
Add:
; (r) section 160C;
(s) section 160D.
17E After paragraph 165(1)(r)
Insert:
; (ra) section 160C;
(rb) section 160D.
I thank the minister's office for the deep engagement on this bill. I spoke extensively yesterday on my reservations about it, on which I won't go into detail again now, other than to say that what I've attempted to do with these amendments is to create an exemption framework in order that social media platforms may be able to bridge the gap between having to ban young people from being inside a platform and actually doing something about the way that they run their platforms.
I have a really strong position that it's far too simplistic and it's a bandaid measure to simply lock young people out of the platforms, and that where we need to get to—and I'm not entirely convinced that the minister disagrees with me on this, because I know that the minister has spoken about a duty of care—is getting the platforms to take responsibility for what is happening. I strongly believe that simply locking kids out is not going to achieve that. It's not going to make the platforms safer. It's not going to make them more transparent. It's not going to make them manage their algorithms. It's not going to make them either identify or manage risk.
One way of achieving that within this bill, to some degree, would be to create some provisions within this amendment to the Online Safety Act which would incentivise platforms to actually do something about those things. So, in effect, what these amendments do is to progressively make the platforms safer over time and allow the minister or regulators to home in on specific risks. It would mean that the platforms would have to assess their services for various harms—things like dissemination of scams, and negative effects on children's best interests, including their physical and mental wellbeing. It would mean that they'd be required to prepare a mitigation plan which would demonstrate to the government and to the public exactly how they'd reprogram or alter their algorithms and their systems. It would mean that they would have to do risk assessments and mitigation mechanisms together—and this has been demonstrated, internationally, to be an effective tool to increase platform safety, and I don't think there's any reason that we can't begin implementing it here. The point is that, if platforms want to actually move forward and be exempt from the ban, these amendments give them the opportunity to step into the idea of safety-by-design, which I think is really where we do need to get to.
I know that the minister has flagged a duty of care down the track. I tabled a private member's bill—in effect, providing a framework for that—on Monday, but I think a step towards that which would be very productive right now would be to insert these amendments into this bill in order to start moving us towards a systemic approach to online safety, rather than this, I think, unfortunately simplistic measure of age bans. So I commend the amendments to the House.