House debates

Thursday, 21 November 2024

Bills

Online Safety Amendment (Social Media Minimum Age) Bill 2024; Second Reading

9:25 am

Michelle Rowland (Greenway, Australian Labor Party, Minister for Communications)

I move:

That this bill be now read a second time.

Introduction

Keeping Australians safe online is a top priority for the Albanese government. We are focused on positive solutions to issues of national concern, and the issue of harms to children and young people from social media is right up the top of that list.

Since coming to government we have taken steps to turn this focus into meaningful outcomes, with a range of measures to help make the online environment better for young people.

Central to this has been our quadrupling of the ongoing base funding for the eSafety Commissioner, to ensure this world-leading regulator has certainty of resourcing and is equipped to respond to existing and emerging online harms.

I brought forward the review of the Online Safety Act by 12 months. This was in recognition of the need for the act to remain fit for purpose, and I thank the independent reviewer for recently providing her report to me.

In May this year, I amended the Basic Online Safety Expectations to make clear the government's expectation that platforms must place the best interests of the child at the centre of their products and services.

And today, I introduce the Online Safety Amendment (Social Media Minimum Age) Bill 2024.

Social media, as we commonly and collectively understand it, has become a ubiquitous and normal part of life for all of us, and many young Australians take part in this activity without adverse consequence. It can be a source of entertainment, education and connection with the world—and each other. Those things, particularly for young Australians, can be beneficial.

But, for too many young Australians, social media can be harmful. Almost two-thirds of 14- to 17-year-old Australians have viewed extremely harmful content online, including drug abuse, suicide or self-harm, as well as violent material. A quarter have been exposed to content promoting unsafe eating habits.

Research conducted by eSafety found that 95 per cent of Australian caregivers find online safety to be one of their toughest parenting challenges.

The Albanese government has heard the concerns of parents, young people and experts. Social media platforms have a social responsibility, and we know they can and should do better to address harms on their platforms. That's why we're making big changes to hold platforms to account for user safety.

The Online Safety Amendment (Social Media Minimum Age) Bill 2024 will amend the Online Safety Act 2021 (OSA) by introducing a minimum age of 16 to have an account on age-restricted social media platforms, protecting young Australians at a critical stage of their development.

The bill puts the onus on social media platforms, not parents or young people, to take reasonable steps to ensure fundamental protections are in place. This is about protecting young people—not punishing or isolating them—and letting parents know we're in their corner when it comes to supporting their children's health and wellbeing.

We know that establishing a minimum age for holding social media accounts is not the only approach that needs to be taken, and we know that this measure will not be met with universal acceptance.

But this is one step among many that a government should take to protect, not isolate, young people.

There is wide acknowledgment that something must be done, in the immediate term, to help prevent young teens and children from being exposed to streams of content—unfiltered and infinite.

It is the right thing to do for the right reasons at the right time.

Through extensive consultation and with the input of states and territories, we have agreed that, until a child turns 16, the social media environment as it stands is not age-appropriate for them.

I acknowledge everyone across the country who participated in consultation; each contribution, however small, has helped shape this world-leading reform.

The bill I am introducing today does not provide the magic pill to resolve or eliminate every harm children face online, nor does it seek to rule out digital participation and inclusion for young people.

And we also acknowledge that harms don't simply switch off on a child's 16th birthday. That is why the government has taken the decision to bring forward a key recommendation of the review of the Online Safety Act: to legislate a digital duty of care.

Legislating a digital duty of care is a separate body of work, and will place the onus on digital platforms to proactively keep Australians safe and better prevent online harms.

Legislating a duty of care will mean services can't 'set and forget'. Instead, their obligations will mean they need to continually identify and mitigate potential risks, as technology and service offerings change and evolve.

While the social media minimum age legislation introduced today is targeted at the protection of children under 16, the duty of care will ensure all Australians are better protected from online harm.

Critically, this legislation will allow for a 12-month implementation period, to ensure this novel and world-leading reform can take effect with the care and consideration that Australians rightly expect.

The Office of the Australian Information Commissioner will be resourced to provide oversight of the privacy provisions as they relate to this bill.

Regulated activity

The bill we have introduced today establishes an obligation on social media platforms to take reasonable steps to prevent age-restricted users from having an account.

This places the onus on platforms to introduce systems and settings to ensure that under-age users cannot create and hold a social media account. A systemic failure to take action to limit circumvention of these measures could give rise to a breach.

By regulating the act of 'having an account', as opposed to 'accessing' social media more generally, we are seeking to strike a balance between protecting young people from harm, while limiting the regulatory burden on the broader population.

Importantly, this obligation would help to mitigate the risks arising from the harmful features that are largely associated with user accounts, or the 'logged-in' state, such as persistent notifications and alerts, which have been found to have a negative impact on sleep, stress levels and attention.

Regulated services

The obligation will apply to 'age-restricted social media platforms', a new term being introduced into the Online Safety Act. Its definition includes that a 'significant purpose' of the service is to enable online social interactions between two or more users.

While the definition casts a wide net, the bill allows for flexibility to reduce the scope or further target the definition through legislative rules. Achieving this through rules, rather than primary legislation, enables the government to be responsive to changes and evolutions in the dynamic social media ecosystem.

Rules can be made to allow for additional conditions that must be met, in order to fall within the definition of 'age-restricted social media platform'.

To be clear, the government expects that this broader definition will capture services that are commonly accepted to be social media, and the services that are causing many parents the most concern.

This will include, at a minimum, TikTok, Facebook, Snapchat, Reddit, Instagram and X (formerly Twitter). These services will be required to take reasonable steps to prevent persons under 16 years of age from creating or holding an account.

A rule-making power is also available to exclude specific classes of services from the definition. In the first instance, this power will be used to carve out messaging services, online games, and services that significantly function to support the health and education of users.

A key principle of the approach to applying an age limit of 16 to social media was the recognition that our laws should be set to protect young people—not isolate them. There is a legitimate purpose to enabling young people to actively participate in the digital environment where they have grown up.

Supporting their digital participation, connection and inclusion is important at every age and stage of a young person's development, and our legislation seeks to strike that balance.

We are not saying that risks don't exist on messaging apps or in online games.

While users can still be exposed to harmful content by other users, they do not face the same algorithmic curation of content and psychological manipulation to encourage near endless engagement. Further, the inclusion of messaging apps could have wider consequences, such as making communication within families harder.

Online games are currently regulated under the National Classification Scheme. The scheme provides information on the age suitability of online games through a combination of the classification and relevant consumer advice. Imposing additional age-based regulation to online games would create unnecessary regulatory overlap.

This categorical rule-making power is expected to deem out of scope services such as Facebook Messenger Kids and WhatsApp. The rule will also provide for 'out of scope' status to be applied to services like ReachOut's PeerChat, Kids Helpline's 'MyCircle', Google Classroom, YouTube and other apps that function like social media in their interactivity but operate with a significant purpose of enabling young people to get the education and health support they need.

Before making a rule, the minister must seek advice from the eSafety Commissioner and have regard to that advice, and may seek, and have regard to, advice from any other Commonwealth authorities or agencies the minister considers relevant.

This is an important condition to ensure rules are made with the appropriate safeguards in place while reflecting community standards.

Privacy safeguards

Privacy reform in Australia has been long overdue, and the Albanese government has taken significant steps to bring privacy protections up to community standards.

This is novel reform, and implementing a minimum age for social media requires users to take steps to assure their age. Where that user is a child, government must sharpen its focus and apply the expected level of care to ensure strong privacy provisions are in place.

While the digital economy has generated significant benefits, including consumer convenience, improved efficiencies and new employment opportunities, it has also resulted in large amounts of information about people being generated, used, disclosed and stored. Widespread adoption and reliance on digital technologies increases the risks that personal data will be subject to misuse or mishandling, including through data breaches, fraud and identity theft, unauthorised surveillance and other significant online harms.

For these reasons, the bill introduces more robust privacy protections, which strictly prohibit platforms from using information collected for age assurance purposes for any other purpose, unless explicitly agreed to by the individual.

The approach taken in the bill expands on Australia's privacy framework, taking a heightened approach to information protection that is informed by the 2022 review of the Privacy Act.

Compliance with the minimum-age obligation will likely involve some form of age assurance, which may require the collection, use and disclosure of additional personal information. The bill makes it explicit that platforms must not use information and data collected for age assurance purposes for any other purpose, unless the individual has provided their consent.

This consent must be voluntary, informed, current, specific and unambiguous—this is an elevated requirement that precludes platforms from seeking consent through preselected settings or opt-outs. In addition, once the information has been used for age assurance or any other agreed purpose, it must be destroyed by the platform (or any third party contracted by the platform).

Serious and repeated breaches of these privacy provisions could result in penalties of up to $50 million under section 13G of the Privacy Act.

Given the vital importance of robust privacy and security for Australians online, in the case of the minimum age for social media, we will undertake additional consultation to determine what reasonable amendments we can introduce ahead of passage of the legislation.

Penalties

In making these reforms, it is critical we send a clear signal to platforms about the importance of their social responsibilities to children and all Australians.

As such, this bill will impose significant penalties for breaching the minimum-age obligation. This will be as high as $49.5 million for bodies corporate, consistent with serious offences set out in the Privacy Act 1988 and Competition and Consumer Act 2010.

The bill increases penalties for breach of industry codes and industry standards to up to $49.5 million for bodies corporate. This addresses the currently low penalties in the OSA, and reflects the systemic nature of the harms that could arise from breaches of the codes and standards.

Additional regulator powers

The bill equips the eSafety Commissioner with additional tools and powers to effectively administer the new minimum age framework. This includes powers to request information from platforms about how they are complying with their obligations, in particular the privacy provisions.

Commencement

The minimum-age obligation on social media services will commence no earlier than 12 months from passage of the bill. This will allow the necessary time for social media platforms to develop and implement the required systems.

This timeframe will also enable implementation to be informed by the age assurance trial, which will provide guidance on the market readiness of age assurance technologies, and inform advice to government and the eSafety Commissioner on implementation and enforcement of the minimum age.

Review

Finally, the bill incorporates a review of the legislation two years after effective commencement. This provides the government with an opportunity to undertake critical societal measurements of the impacts of the legislation, using qualitative and quantitative research to understand how this policy is working for young Australians.

It will allow time to recognise any technological advancements since commencement, to reconsider the definition of an age-restricted social media platform, and to consider whether other digital platforms such as online games or additional social media platforms that can be viewed without an account should be captured within scope.

We will work with education, health, youth and community organisations throughout implementation and during the review to take in their views.

Conclusion

This measure is a key component of the Albanese government's work across the online safety space and will help enable young people to use the internet in a safer and more positive way. It will signal a set of normative values that support parents, educators and society more broadly.

Australia has consistently paved the way in global online safety, and the introduction of this legislation is no exception.

The bill builds upon the Australian government's work to address online harms for young people, including the age assurance trial, establishing an online dating apps code, and legislating new criminal penalties for the non-consensual sharing of sexual deepfakes.

The government will ensure young Australians retain access to services that primarily provide education and health services, and work constructively with stakeholders to ensure that only services that meet the strict criteria under eSafety's powers are able to be accessed by children under 16 years.

This bill seeks to set a new normative value in society—that accessing social media is not the defining feature of growing up in Australia.

I commend the bill to the House.

Debate adjourned.
