House debates
Tuesday, 26 November 2024
Bills
Online Safety Amendment (Social Media Minimum Age) Bill 2024; Second Reading
6:36 pm
Stephen Bates (Brisbane, Australian Greens)
Prohibition doesn't work. Prohibition is not how you make social media platforms safer or empower young Australians to navigate the online world in a safe way. The Online Safety Amendment (Social Media Minimum Age) Bill is not going to resolve the root causes of online harm. It's simply not going to work. All of us want to see more accountability for social media platforms and greater transparency for users, but this bill provides neither.
Let's not forget how rushed the process has been. The government allowed a consultation period on the bill of just 24 hours. The Senate hearing on the bill ran for only three hours, and young people were essentially locked out from having their voices heard because of this process. We absolutely need to protect young people online, especially against the targeting, harvesting and selling of their data. A blanket ban, though, is not the way to do it.
The recent parliamentary inquiry into social media heard over and over that an age ban will not make social media safer for anyone. It's complicated to implement, and it will have unintended consequences for young people. Yet here we are again, with the government and the coalition teaming up, ignoring the evidence and ramming this bill through in the final sitting week of the year.
Why won't it work? This bill inserts a new definition of 'age-restricted social media platforms', which effectively puts a blanket ban on all social media platforms. The government has indicated that some platforms will be exempt, but we don't have the detail before us that we need to properly scrutinise this legislation. Leaving exemptions to regulations puts a lot of trust in the follow-through, and I'm not confident that it will be done effectively. The government themselves acknowledge this is not a perfect system. The Prime Minister has said:
We know some kids will find workarounds, but we're sending a message to social media companies to clean up their act.
The reality is that many kids, not just some, will find a way to access these platforms anyway, and social media companies don't actually have to change anything about how they apply their algorithms as a result of this bill.
The bill also doesn't prescribe how platforms will monitor and enforce an age ban. We expect this will require some sort of age assurance, not just for young people but for everyone. The government already announced funding for an age-assurance trial in the budget but has only just awarded the tender. Essentially, the trial on a fundamental aspect of this bill has not even started yet. Sharing important data with these platforms, whose profit model is built on selling data profiles to advertisers, is a very popular idea not just among young people but right across my entire community!
Let's not pretend that young people aren't going to find a way to access these platforms. They are smart; they will figure it out. I think back to when I was 13: I would come home from school, spend too much time playing on my computer and not finish my homework. My dad installed a timing program on my computer so it would switch off after an hour and I couldn't play The Sims for too long. It took me 10 minutes to figure out how to get around that. The answer there was to work with me to do my homework, not just ban the game I was playing.
A blanket ban on social media also runs the risk of driving young people into even less safe online spaces. When that happens, it becomes impossible to know what is going on and how to help those who need help when something goes wrong.
Many young people can have positive online experiences as well. Young people get a lot of good information and valuable social networks and support from social media, especially isolated young people or young people from marginalised backgrounds who may rely on it for their social contact. Why punish young people for the inaction of publishers and tech companies? This ban will create an Australia that says young people can go to prison at the age of 10 but can't go on Instagram until they are 16.
What should we be doing? Experts are actively calling for alternative solutions that address the root causes of online harm by regulating the platforms themselves. The government has committed to legislating a duty of care, but that is not part of this legislation. If the government can rush these laws through, why can't they implement the duty of care they promised or take measures that actually make platforms safer for everyone, like banning platforms from collecting, selling and exploiting young people's data?
The EU has the Digital Services Act, which includes a number of protections for young people, such as banning the use of profiling to target ads at children and imposing obligations on providers for children's safety. There are practical solutions here that we should be looking at: a ban on the targeting, harvesting and selling of young people's data; a digital duty of care for platforms; EU-style guardrails to limit the toxicity of algorithms and extreme content; the ability for users to turn down and opt out of unwanted content; the full release of the Online Safety Act review; and investment in education for young people and their families to help develop digital literacy and online safety skills and equip them with the tools and resources they need for positive and responsible online use. These actions would help tackle the root issues with social media. This is what we should be doing, not legislating a ban on access with the apparent expectation that everything will magically be okay online the second you turn 16.
An age ban alone will not make the platforms safer or age appropriate, nor will it change the culture that drives the unsafe behaviours people are targeted with on these platforms. Instead of banning young people altogether, we need to tackle the predatory business models of the tech giants. That includes the algorithms that fuel extremism and mental health issues. The government's own online safety expert, the eSafety Commissioner, has recommended a multipronged approach that encourages platforms to be safe by design. Australia's Human Rights Commissioner has voiced concerns about the restrictions in this legislation as well as its rushed time frame. If the government wants to protect the safety of young people, they should be looking to stop platforms harvesting young people's data and targeting them with algorithms and advertising to make massive profits. All users should have the ability to switch off, reset or turn down the algorithms that push unwanted content into their feeds.
Privacy reforms are long overdue. The protection of users' data is vital to keeping people safer on- and offline. There is growing concern about the unabated use of users' data by tech companies to train their AIs without consent, knowledge or compensation. In the EU, the likes of Meta have been forced to provide an opt-out option for users, at a minimum. Australia must force companies to do the same here, because prohibition doesn't work; it hasn't before, and it won't now or into the future. If we are serious about addressing the issues that have arisen because of social media, then we must tackle them at the source. Guardrails, digital duties of care, stopping the targeting and harvesting of young people's data—all of these are far better approaches than a blanket ban. Young people will find ways around a ban, and then what? We may create a situation where young people are driven into even worse online spaces, and that is something that nobody wants.
As the youngest person in this chamber and one of very, very few people in this place who grew up with this technology and with social media, I can say that change is needed, but this bill is not it. It shows a fundamental misunderstanding of how the internet works and how young people engage with it. So my message to the government is this: bin this bill, talk to young people and come back. Young people are painfully aware of how algorithms work and how they affect them and their social circles. Listen to young people. Listen to the experts. You would come back with a much better bill.