House debates
Monday, 18 March 2024
Private Members' Business
Online Safety
11:40 am
Michelle Ananda-Rajah (Higgins, Australian Labor Party)
We are all in furious agreement in this House that we want our kids to be kept safe, whether that be online or in the physical world. Unfortunately, social media—bearing in mind that Meta turns 20 this year—has, I think, been a net negative overall. It has resulted in the fracturing of our attention—not only our children's but ours. It has driven polarisation, it has spread misinformation and disinformation around the world and it has in fact threatened the foundations of our democracy. To add to that litany of harms, we now have the problems associated with highly porous online porn sites.
Online porn is harmful because it shapes attitudes, which then go on to influence behaviours. It is largely created for a male, heterosexual audience, which results in harmful depictions of sexuality, feeding sexism and misogyny, and, indeed, it has been linked to gender-based violence. The statistics are alarming, and they come from the roadmap that was released in March 2023. Three-quarters of 16- to 18-year-olds have seen online porn; of these, a third were exposed before the age of 13 and nearly half saw it between the ages of 13 and 15.
However, the issue is: where are they actually consuming this material? As it turns out, 70 per cent of the material is consumed on porn sites, as you would imagine; however, that means 30 per cent isn't. That 30 per cent encompasses social media sites, group chats and private group pages—on platforms like Discord, for example. So when the coalition talk about initiating or piloting age-assurance mechanisms, I think they are talking specifically about porn sites, but that would create a whack-a-mole problem, where you may inadvertently drive audiences, including children, to these other sites: social media, Discord, group chats and private pages. You would also then drive the content to those sites. What is actually needed here, rather than a reflexive action, is a holistic approach to keeping children safe online, and that is certainly something that we, as a government, are looking at.
'Age assurance' is an umbrella term that encompasses age verification, which has very high accuracy, as well as age estimation, which relies on things like facial biometrics or analysis of the voice—which is fine if you have a voice that's not affected by, for example, an unusual accent or fluency problems. There are other ways of assessing age that may rely on tests of capacity or cognition—similar to the tests that I used to use as a medical practitioner—and then there are harder identifiers requiring documents, which is fine if you actually have those documents at hand.
There has been some interesting work looking at age tokens, whereby a person's age is verified once but the content provider is blinded to the underlying details. The token is held on a device—for example, in a secure wallet—and then used for a period of time, so that personal details are not shared every time a person wants to access content. The important thing here is that there are a lot of technologies out there that have been trialled, but they aren't mature; they're still evolving. We're not going to rush in. We're doing some scoping work as to which of these is best to deploy, and it may well be, based on the advice that has come from the eSafety Commissioner, that we will have to use a variety of options, because that is in fact what consumers want.
Children are going onto these sites for lots of reasons. Obviously we want to protect them, but we also want to make sure we have tools that are effective and that don't inadvertently drive traffic to more porous sites, shall we say. In terms of a holistic approach, it's not enough to target age gating; we also have to educate people on respectful relationships, which is what we're doing, in order to create that counternarrative to the harmful norms that are being perpetrated in online porn.