House debates
Tuesday, 26 November 2024
Bills
Online Safety Amendment (Social Media Minimum Age) Bill 2024; Second Reading
6:04 pm
Monique Ryan (Kooyong, Independent)
The question of an increase in the age threshold for the creation of social media accounts from 13 to 16 has been brought into political consideration over recent months because of concerns regarding the impact of social media on the mental health and wellbeing of Australian children.
The advent of social media has fundamentally transformed how humans spend their time and communicate. The repercussions of these changes are profound. Accumulating research has linked the amount of time children spend online and excessive and addictive use of digital media with adverse physical, psychological, social and neurological consequences. Researchers acknowledge a link between mental health disorders and social media usage, but it is important to note that correlation does not confirm causation. The thesis of parents' groups and a recent pretty strong campaign by the Murdoch press is that increasing the age of those allowed on social media from 13 to 16 years would protect vulnerable young people from the dangers of social media.
It is true that if age verification could be comprehensively and effectively enforced, children and young people would be better protected from risks like cyberbullying and online predators. Limiting social media access might increase time for children to make other more healthy lifestyle choices, like exercising or socialising in person with others, and that could improve their mental health, sleep and academic performance. Restricting access to the digital world might empower parents and guardians to better guide their children's online activities. A complete ban, were that possible, might also help with online privacy concerns. Social media platforms collect vast amounts of personal data from users. Children and adolescents simply don't understand how their data is collected, used and monetised. Banning access for younger users might mitigate privacy violations and limit the exploitation of personal data by large technology companies. Each of those concerns does carry some truth.
We do need to address the harms caused by social media platforms. However, the overwhelming weight of expert evidence in this area is against this government's simplistic proposal of a blanket ban for young people aged less than 16. This country has a confusing array of laws around young people. You can be criminally responsible at 10, you can join the Labor Party at 14, you can join the Liberal Party at 16, you can work full-time from 16 and you can vote at 18. The fact is that digital media are vital for young people's civic, social and cultural participation. This is a point which is often stressed in qualitative research involving young people. How can we tell growing adults that they need to hold off becoming fully-fledged citizens and contributing members of society until they are 16?
Young people have a right to freedom of expression and access to information. They also have an implied right to political communication, which has been flagged by some tech platforms as a possible basis for legal challenges to this legislation. Social media is a vital information source and a means of expression for many members of the LGBTQIA+ community, for Indigenous Australians, for young people from culturally and linguistically diverse backgrounds, for children who live in isolated areas, and for children and young people with a disability. This bill would block young people from the channels and the groups that are their public spaces, which are the forum for their everyday communication with their people.
Young people access their news via social media. They know legacy media doesn't reflect their interests. Legacy media makes little effort to engage with them and it doesn't afford them the opportunity to communicate with each other in the way that they want to—IRL. The only exception of which I am aware to that rule in this country is that of 6 News, the streaming channel founded in 2019 by Kooyong resident Leo Puglisi. Leo has employed a number of young Australian journalists, some as young as 13, in growing the only news and current affairs channel which presents the perspectives of people their own age. Young people conduct their businesses via social media. They use it to engage with other platforms' digital services, products, apps and sites—for example, YouTube compilation videos of TikTok content, sharing a story on Instagram which then goes to Facebook and to X.
It is not easy to disentangle social media from other platforms. This is most obvious with the Chinese platform Weibo, which is integrated as a matter of course with multiple functions like posting, group chats and a digital wallet. Australians, young people, know what they don't want to experience online. They don't want to experience unwanted content or contact, or unwanted surveillance and use of their data. But they tell us they would like to see better guidelines and boundaries and to understand the acceptable use of online spaces. They want better education and training on online safety, so that they can identify, recognise and thoroughly understand potential harms. Young people have told us that, instead of reforming age-verification laws, protection should be improved, with stronger technology such as digital passwords and secure apps. They want improved monitoring and swift action, and they want accountability regarding online safety practices, rather than reversing the onus and placing it purely on the user—a user who is physically and neurologically immature.
The advice of academics and other experts has been largely ignored or misinterpreted in the framing of this legislation. The bill even misrepresents the literature it cites about developmental ages and risks. It has dismissed the literally hundreds of Australian and international experts who have raised serious concerns regarding lack of benefit and potential harm resulting from the bill. It ignores the more than one hundred submissions to the ridiculously, inappropriately brief, tick-a-box Senate inquiry into the bill which was held earlier this week.
The legislation is going to rely on effective age assurance or verification processes being adopted. That means that all Australians will be required to prove their identity to access social media. Exactly how that is going to happen is completely unclear. The minister has claimed that we won't be compelled to hand over our personal identification, but I note that that statement is at odds with the explanatory memorandum for the bill.
The government has committed $6.5 million to a study of new age-verification technology. That study is due to report in 2025. But now the government proposes to give responsibility for the process over to the digital platforms. Even Google and Meta say that they think it would be better to wait for the result of the review. Age verification is rife with privacy and digital-security risks, as well as critical effectiveness and implementation issues. It is not a cure-all safeguard for children and for their data protection.
Children's location data is extremely sensitive. There is significant potential for more, not less, data harvesting from children, if we hand over to the platforms the power to verify and assure children's age. Meta has already told us that it will undertake age assurance using facial recognition, digital ID and other forms of software.
The international benchmark for digital governance, the EU's General Data Protection Regulation, clearly defines individuals as the owners of their data. It states that consent for data use has to be freely given, specific, informed and unambiguous.
In Australia, the lack of transparency of international digital platforms regarding how they collect, share and use data leaves individuals exposed to algorithms which are based on their online habits. Protections for children are dependent on different rules on different platforms.
There is a perception that regulating the collection and use of data collected by digital platforms is complex, but this is not the case. Digital platforms are predicated on rules and processes, and they can be amended accordingly. Mind you, those rules can be evaded by those children who lie about their age online and are then targeted with content for older age groups. Experience overseas, and previous experience in this country, suggests that a blanket ban will last about five minutes. Ask Stephen Conroy. Technological workarounds, such as VPNs, and false age declarations will inevitably undermine the effectiveness of this ban. The government has admitted that already.
As the Tech Council of Australia has said, this bill will only add to the existing perceptions internationally that the Australian technology sector operates in an uncertain regulatory environment which can be subjected to rapid legislative change without due consideration. This legislation does not inspire confidence in this government.
Regardless of how it's done, a ban is not going to address the root causes of online risks, and it's not going to make platforms safer for everyone. Age-gating will not make unsafe products safe. Bans could well create more risk for those children who still use those platforms, because they will remove the incentives to ensure robust child-safety features for young users who evade the age-assurance measures. Bans will not improve those products that children will still be allowed to use. What is the problem here? Is it the fact we are letting billion-dollar companies market unsafe digital products in Australia, or is it the fact that some of the Australians using those products are children and teenagers?
There are much less restrictive alternatives available which could achieve the aim of protecting children and young people from online harms. An appropriate starting point to better protect child rights online would be implementation of the recommendations of the Privacy Act review and the Online Safety Act review. Both of these looked to tackle concerns linked to the digital environment pragmatically rather than prohibitively.
To that end, yesterday I supported the member for Goldstein's private member's bill for a digital duty of care. Her proposal to amend the Online Safety Act to impose an overarching standard of care for large providers would mandate risk assessments and risk mitigation plans, mandatory transparency reporting, and stringent enforcement mechanisms. It would require the platforms to take reasonable steps to make their products safe for children and young people. It would improve online safety for everybody. Other measures to improve online safety could include regulating to limit notifications, autoplay, low default privacy settings, weak age gating for adult products, and the infinite scroll feature which is designed to promote user engagement.
We need to also help children and young people better navigate online spaces by ensuring that the national curriculum includes a specific focus on digital literacy and online safety. Young people should be taught to think critically about what they see online and how they engage with social media. Parents and teachers also need better tools and resources to help them provide appropriate guidance and support. This work will take time and effort—the sort of time and effort that the government has not put into this legislation. But that time and effort would be worthwhile.
We all know and acknowledge that social media carries significant risks for children and young people. As a parent, as a paediatrician and as a politician, I know that those risks need to be addressed by this government. We do have a duty of care: to make changes in this important area carefully and judiciously but with a mind always to the rights and the needs of young people.
In this last sitting week of 2024, with no apparent cause and no rational justification for their unseemly haste, both major political parties in Australia are rushing to ban social media for young people. Their proposals are broad in scope, but they are short on detail. They abdicate all responsibility for the program to digital providers. They are not evidence based. They risk being ineffectual, and they risk potentially significant unintended consequences. They will drive cynicism in young Australians, who will see that this government hasn't got the bottle to take on online gambling advertising but is happy to rush through this legislation without appropriate consultation or thought.
Systemic regulation would drive up safety and privacy standards on all platforms, for all children. This approach is supported by the experts—experts in mental health, digital literacy and child psychology. This approach will protect our children. Age assurance is rife with privacy and digital security risks. It's rife with critical effectiveness and implementation issues. It is not a cure-all safeguard for children and their data protection. It could well increase risk for young Australians in the digital space.
Something worth doing is worth doing properly, carefully, judiciously and in an evidence based way. We have a duty to protect the best interests of all Australians, but most particularly our young and our vulnerable. This bill will not do that, and for that reason I cannot commend this legislation to the House.