Senate debates
Wednesday, 27 November 2024
Bills
Aged Care (Consequential and Transitional Provisions) Bill 2024, Online Safety Amendment (Social Media Minimum Age) Bill 2024; Second Reading
5:58 pm
Jenny McAllister (NSW, Australian Labor Party, Minister for Emergency Management)
I move:
That these bills be now read a second time.
I seek leave to have the second reading speeches incorporated in Hansard.
Leave granted.
The speeches read as follows—
AGED CARE (CONSEQUENTIAL AND TRANSITIONAL PROVISIONS) BILL 2024
I move:
That this bill be now read a second time.
The Aged Care (Consequential and Transitional Provisions) Bill 2024 makes transitional and consequential changes to support the commencement of the Aged Care Bill 2024.
It is an essential component in enabling us to move forward with once-in-a-generation reform of our aged care system and services.
The Bill operates so that the references to the 1997 Acts and the Commission Act in other Commonwealth legislation are read as references to the Aged Care Bill 2024.
Schedule 1 makes consequential changes to the Crimes Act 1914 and the National Disability Insurance Scheme Act 2013 to facilitate aged care worker screening checks and align aged care worker screening with NDIS worker screening.
It also makes changes to the Freedom of Information Act 1982, to give effect to Royal Commission recommendation 88 by removing aged care-specific FOI exemptions.
Schedule 2 of the Bill consists of 10 parts—6 of which correspond directly to chapters of the Aged Care Bill. This schedule details the transitional provisions which will ensure the smooth transition of individuals, providers, workers and governance arrangements when the new Aged Care Act commences.
During the exposure draft consultations, we heard very strongly that everyone needs time and support to prepare for the new Act.
With the changes to the aged care system which the new Act will bring, we recognise everyone will need support to understand what it means for them, what they need to do, and when they need to do it.
We will support older people, providers and workers to prepare for the changes arising from the new Act by providing clarity on what is new, what is changing and what is staying the same.
The Bill includes provisions that will ensure that individuals can move as seamlessly as possible from the 1997 Act, Commonwealth Home Support Program and National Aboriginal and Torres Strait Islander Flexible Aged Care Program to the new Act.
This Bill also includes provisions which will deem approved providers to be registered providers under the new Act. This will smooth the transition process for providers and ensure they can continue to provide services to older people, uninterrupted.
The Bill also includes technical provisions to clarify how provider obligations, governance arrangements, regulatory powers, information management and decision reviews will transition to the new Act.
The Bill also provides a crucial rule making power—to allow transition to be managed in a careful and considered way. The scale of this reform means we need to be able to act quickly during transition to address challenges and unforeseen impacts.
Finally, this Bill repeals the Aged Care Act 1997, Aged Care Quality and Safety Commission Act 2018 and the Aged Care (Transitional Provisions) Act 1997. From commencement, it will make the Aged Care Act 2024 the Commonwealth's primary aged care legislation and pave the way for the future of aged care.
ONLINE SAFETY AMENDMENT (SOCIAL MEDIA MINIMUM AGE) BILL 2024
Introduction
Keeping Australians safe online is a top priority for the Albanese Government. This Government is focused on positive solutions to issues of national concern, and the issue of harms to children and young people from social media is right up the top of that list.
Since coming to office, the Albanese Labor Government has taken steps to turn this focus into meaningful outcomes, with a range of measures to help make the online environment better for young people.
Central to this has been the quadrupling of the ongoing base funding for the eSafety Commissioner, to ensure this world-leading regulator has the certainty of resourcing and is equipped to respond to existing and emerging online harms.
The Government has brought forward the review of the Online Safety Act by twelve months. This was in recognition of the need for the Act to remain fit for purpose, and the Government thanks the independent reviewer for recently providing her report.
In May this year, the Government amended the Basic Online Safety Expectations to make clear the Government's expectation that platforms must place the best interest of the child at the centre of their products and services.
Social media, as it is commonly and collectively understood, has become a ubiquitous and normal part of life across Australian society, and many young Australians take part in this activity without adverse consequence. It can be a source of entertainment, education and connection with the world, and with each other. Those things, particularly for young Australians, can be beneficial.
But, for too many young Australians, social media can be harmful. Almost two-thirds of 14- to 17-year-old Australians have viewed extremely harmful content online, including drug abuse, suicide or self-harm, as well as violent material. A quarter have been exposed to content promoting unsafe eating habits.
Research conducted by eSafety found that 95% of Australian caregivers consider online safety to be one of their toughest parenting challenges.
The Albanese Government has heard the concerns of parents, young people and experts. Social media platforms have a social responsibility to do better in addressing harms on their services. That's why the Government is making big changes to hold platforms to account for user safety.
The Online Safety Amendment (Social Media Minimum Age) Bill 2024 will amend the Online Safety Act 2021 (OSA) by introducing a minimum age of 16 to have an account on age-restricted social media platforms, protecting young Australians at a critical stage of their development.
The Bill puts the onus on social media platforms, not parents or young people, to take reasonable steps to ensure fundamental protections are in place. This is about protecting young people—not punishing or isolating them—and letting parents know the Government is in their corner when it comes to supporting their children's health and wellbeing.
The Government knows that establishing a minimum age for young people having social media accounts is not the only approach that needs to be taken, and also knows that this measure will not be met with universal acceptance.
But this is one step among many that a Government should take in the protection and not the isolation of young people. There is wide acknowledgment that something must be done, in the immediate term, to help prevent young teens and children from being exposed to streams of content—unfiltered and infinite.
It is the right thing to do for the right reasons at the right time.
Through extensive consultation, and with the input of states and territories, the Government has concluded that until a child turns 16, the social media environment as it stands is not age-appropriate for them.
The Government acknowledges everyone who participated in consultation across the country for their contribution, however small, to this world-leading reform.
This Bill does not provide a magic pill to resolve or eliminate every harm children face online, nor does it seek to rule out digital participation and inclusion for young people.
And it is also acknowledged that harms don't simply switch off on a child's 16th birthday. That is why the Government has taken the decision to bring forward a key recommendation of the Online Safety Act Review, to legislate a Digital Duty of Care.
Legislating a Digital Duty of Care is a separate body of work, and will place the onus on digital platforms to proactively keep Australians safe and better prevent online harms.
Legislating a duty of care will mean services can't 'set and forget'. Instead, their obligations will mean they need to continually identify and mitigate potential risks, as technology and service offerings change and evolve.
While the social media minimum age legislation introduced today is targeted at the protection of children under 16, the duty of care will ensure all Australians are better protected from online harm.
Critically, this legislation will allow for a twelve-month implementation period, to ensure this novel and world-leading reform can take effect with the care and consideration Australians rightly expect.
The Office of the Australian Information Commissioner will be resourced to provide oversight of the privacy provisions as they relate to this Bill.
Regulated activity
This Bill establishes an obligation on social media platforms to take reasonable steps to prevent age-restricted users from having an account.
This places the onus on platforms to introduce systems and settings that ensure under-age users cannot create and hold a social media account. A systemic failure to limit circumvention of these measures could give rise to a breach.
By regulating the act of 'having an account', as opposed to 'accessing' social media more generally, the Government is seeking to strike a balance between protecting young people from harm, while limiting the regulatory burden on the broader population.
Importantly, this obligation would help to mitigate the risks arising from the harmful features largely associated with user accounts, or the 'logged-in' state, such as persistent notifications and alerts, which have been found to have a negative impact on sleep, stress levels and attention.
Regulated services
The obligation will apply to 'age-restricted social media platforms', a new term being introduced into the OSA. Its definition includes that a 'significant purpose' of the service is to enable online social interactions between 2 or more users.
While the definition casts a wide net, the Bill allows for flexibility to reduce the scope or further target the definition through legislative rules. Achieving this through rules, rather than primary legislation, enables the Government to be responsive to changes and evolutions in the dynamic social media ecosystem.
Rules can be made to allow for additional conditions that must be met, in order to fall within the definition of 'age-restricted social media platform'.
To be clear, the Government expects that this broader definition will capture services that are commonly accepted to be social media, and the services that are causing many parents the most concern.
This will, at a minimum, include TikTok, Facebook, Snapchat, Reddit, Instagram and X (formerly Twitter). These services will be required to take reasonable steps to prevent persons under 16 years of age from creating or holding an account.
A rule-making power is also available to exclude specific classes of services from the definition. In the first instance, this power will be used to carve out messaging services, online games, and services that significantly function to support the health and education of users.
A key principle of the approach to applying an age limit of 16 to social media was the recognition that our laws should be set to protect young people—not isolate them. There is a legitimate purpose to enabling young people to actively participate in the digital environment where they have grown up. Supporting their digital participation, connection and inclusion is important at every age and stage of a young person's development and our legislation seeks to strike that balance.
The Government is not saying that risks don't exist on messenger apps or online gaming.
While users can still be exposed to harmful content by other users, they do not face the same algorithmic curation of content and psychological manipulation to encourage near endless engagement. Further, the inclusion of messaging apps could have wider consequences, such as making communication within families harder.
Online games are currently regulated under the National Classification Scheme. The Scheme provides information on the age suitability of online games through a combination of the classification and relevant consumer advice. Imposing additional age-based regulation to online games would create unnecessary regulatory overlap.
This categorical rule making power is expected to deem out of scope services such as Facebook Messenger Kids, and WhatsApp. The rule will provide for an "out of scope" status to also be applied to services like ReachOut's PeerChat, Kids Helpline 'MyCircle', Google Classroom, YouTube, and other apps that can be shown to function like social media in their interactivity but operate with a significant purpose to enable young people to get the education and health support they need.
Before making a Rule, the Minister must seek advice from the eSafety Commissioner, and must have regard to that advice; and may seek advice from any other authorities or agencies of the Commonwealth that the Minister considers relevant, and may have regard to any such advice.
This is an important condition to ensure Rules are made with the appropriate safeguards in place while reflecting community standards.
Privacy safeguards
Privacy reform in Australia has been long overdue, and the Albanese Government has taken significant steps to bring privacy standards up to community standards.
This is novel reform: implementing a minimum age for social media requires users to take steps to assure their age. Where the user is a child, government must apply a heightened level of care to ensure strong privacy provisions are in place.
While the digital economy has generated significant benefits, including consumer convenience, improved efficiencies and new employment opportunities, it has also resulted in large amounts of information about people being generated, used, disclosed and stored. Widespread adoption and reliance on digital technologies increases the risks that personal data will be subject to misuse or mishandling, including through data breaches, fraud and identity theft, unauthorised surveillance and other significant online harms.
For these reasons, the Bill introduces more robust privacy protections, which strictly prohibit platforms from using information collected for age assurance purposes for any other purpose, unless explicitly agreed to by the individual.
The approach taken in the Bill expands on Australia's privacy framework, taking a heightened approach to information protection that is informed by the 2022 review of the Privacy Act.
Compliance with the minimum age obligation will likely involve some form of age assurance, which may require the collection, use and disclosure of additional personal information. The Bill makes it explicit that platforms must not use information and data collected for age assurance purposes for any other purpose, unless the individual has provided their consent.
This consent must be voluntary, informed, current, specific and unambiguous—this is an elevated requirement that precludes platforms from seeking consent through preselected settings or opt-outs. In addition, once the information has been used for age assurance or any other agreed purpose, it must be destroyed by the platform (or any third party contracted by the platform).
Serious and repeated breaches of these privacy provisions could result in penalties of up to $50 million under section 13G of the Privacy Act.
Given the vital importance of robust privacy and security for Australians online, in the case of the minimum age for social media the Government will undertake additional consultation to determine what reasonable amendments it can introduce ahead of passage of the legislation.
Penalties
In making these reforms, it is critical the Government sends a clear signal to platforms about the importance of their social responsibilities to children and all Australians.
As such, this Bill will impose significant penalties for breaching the minimum age obligation. This will be as high as $49.5 million for bodies corporate, consistent with serious offences set out in the Privacy Act 1988 and Competition and Consumer Act 2010.
The Bill increases penalties for breach of industry codes and industry standards to up to $49.5 million for bodies corporate. This addresses the currently low penalties in the OSA, and reflects the systemic nature of the harms that could arise from breaches of the codes and standards.
Additional regulator powers
The Bill equips the eSafety Commissioner with additional tools and powers to effectively administer the new minimum age framework. This includes powers to request information from platforms about how they are complying with their obligation, particularly the compliance with privacy provisions.
Commencement
The minimum age obligation on social media services will commence no earlier than 12 months from passage of the Bill. This will allow the necessary time for social media platforms to develop and implement required systems.
This timeframe will also enable implementation to be informed by the Age Assurance Trial, which will provide guidance on the market readiness of age assurance technologies, and inform advice to Government and the eSafety Commissioner on implementation and enforcement of the minimum age.
Review
Finally, the Bill incorporates a review of the legislation two years after effective commencement. This provides the Government with an opportunity to undertake critical societal measurement of the legislation's impacts, using qualitative and quantitative research to understand how this policy is working for young Australians.
It will allow time to recognise any technological advancements since commencement, to reconsider the definition of an age-restricted social media platform, and to consider whether other digital platforms such as online games or additional social media platforms that can be viewed without an account, should be captured within scope.
The Government will work with education, health, youth and community organisations throughout implementation and during the review to take in their views.
Conclusion
This measure is a key component of the Albanese Government's work across the online safety space and will help enable young people to use the internet in a safer and more positive way. It will signal a set of normative values that support parents, educators and society more broadly.
Australia has consistently paved the way in global online safety, and the introduction of this legislation is no exception.
The Bill builds upon the Australian Government's work to address online harms for young people, including the age assurance trial, establishing an online dating apps code, and legislating new criminal penalties for non-consensual sexual deepfakes.
The Government will ensure young Australians retain access to services that primarily provide education and health services, and work constructively with stakeholders to ensure that only services that meet the strict criteria under eSafety's powers are able to be accessed by children under 16 years.
This Bill seeks to set a new normative value in society—that accessing social media is not the defining feature of growing up in Australia.
Debate adjourned.
Ordered that the resumption of the debate be made an order of the day for a later hour.
Ordered that the bills be listed on the Notice Paper as separate orders of the day.