House debates
Monday, 15 February 2021
Private Members' Business
Social Media Platforms
10:38 am
Anne Webster (Mallee, National Party)
I move:
That this House:
(1) is alarmed by the undue pain and distress experienced by Australians who are defamed, bullied or otherwise harassed on social media platforms;
(2) notes that:
(a) global technology companies which provide social media platforms inadequately monitor platforms for defamation, bullying or harassment of individuals; and
(b) global technology companies are slow to respond when complaints are made, increasing the damage to individuals;
(3) recognises that social media is a global sphere of communication in which vulnerable citizens can be unfairly targeted by individuals, with little consequence to the platform that hosts damaging content;
(4) expresses concern that current regulations do not adequately address global technology companies which control social media platforms; and
(5) calls on the Government to develop:
(a) a public regulatory framework within which decisions about removing content are made by social media platforms, to ensure community expectations around safety and free speech on social media platforms can be met; and
(b) legislation which holds social media platforms to account as publishers of the content hosted on their sites, impressing the legal responsibilities that designation entails on those platforms.
Big tech companies such as Twitter, Facebook and Google have amassed extraordinary power in the global corporate and political landscape. There is no doubt that 6 January 2021 will be remembered as a dark day in American history. The events of the day and the resulting fallout will be referenced, studied and analysed for years to come. One moment in particular—the permanent banning of Donald Trump's Twitter account—will be a watershed moment in the debate surrounding free speech and censorship on social media platforms and the question of regulation of big tech companies. Thanks to the social media platforms, we have arrived at a new reality, of 'glocalisation', where the local is merged with the global through online portals. We have become increasingly reliant on big tech companies and their services, and governments around the world have not kept pace with these transformations and their consequences. We are now working to catch up.
Freedom of speech is an inherent right that we must protect at all costs, but it is not a right to lie or incite violence. Free speech is vital to our democracy but must be limited to prevent harm. Limitations on free speech, however necessary they may be, will always be contestable. It remains a significant challenge to get these limitations right. However, the problem we face now is that big tech companies are themselves responsible for determining their own limitations. They are acting as the moral arbiters of our society, which, I argue, is the role of representative government, not a technology company. Big tech firms write their own rules and are accountable to themselves alone. This is causing serious issues for you and me as consumers.
I have personal experience of how these issues can affect people's lives. For several months in 2020, my husband and I, as well as the charity we founded to help single mothers access education, were the targets of baseless and defamatory accusations made by a conspiracy theorist on Facebook. It was unrelenting for months. Despite originating in New Zealand, the accusations were widely distributed and even reached local networks in my electorate of Mallee. My first thought was for the reputation of an essential charity that young, disadvantaged mothers in my community rely on. I was concerned that the mothers would be driven away from the service, because of these lies, and left even more vulnerable. On top of that, my husband and I were concerned for our safety, and we even installed security cameras at our home, for our peace of mind. It was an incredibly distressing time for my family.
Despite successive requests to have the damaging content removed, the posts remained public—in some cases for several months—until court proceedings got the attention of Facebook. These legal proceedings have cost us over $150,000 so far, despite a successful court case. The prospect of recouping these losses through the damages that have been awarded is also highly unlikely. Expensive civil proceedings are one of the only means of recourse currently available to people who have been defamed, bullied or harassed online. What concerns me is that many thousands of people who endure bullying and defamation online will lack the means to clear their name or protect their family. Social media enables the tarnishing of reputations and the destruction of lives, with very few avenues for justice. This is untenable, and it must change.
There are multiple ways these issues need to be addressed, and our government is working to keep Australians safe online. I welcome the strengthening of the role of the eSafety Commissioner and the provision of $39.4 million over the next three years. We're also introducing a new online safety bill. The bill includes a new adult cyber abuse scheme, which would capture abusive behaviour that has reached a criminal threshold. It would provide power for the eSafety Commissioner to direct platforms to remove abusive material within 24 hours. While these are positive measures, I believe further steps need to be taken. Social media platforms need to be held to account as publishers of the content that is hosted on their sites. If a newspaper, radio station or TV channel defamed an individual or incited violence through their publications, they could be sued or prosecuted under the full extent of the Australian law. At this point, the same is not true for Facebook, Google, Twitter, Instagram and other social media platforms.
The business models of social media giants are very similar to those of traditional news media, yet the rules governing print media, radio and television are vastly more proscriptive than those that apply to digital platforms. Traditional news media are held to a much higher standard under the law, which puts them at a commercial disadvantage to digital platforms. This isn't fair, and it doesn't provide for a competitive media industry. Social media giants hide behind the excuse that they are nothing more than a virtual town square and therefore they can't be held responsible for anything that is shouted out. But the fact is that the technology and algorithms that underpin these platforms are incredibly sophisticated. The platforms show you what they think you want to see. They are designed to keep you engaged for as long as possible. These facts alone demonstrate that big tech companies are making editorial decisions regarding the content that you see on their platform.
If big tech companies want to preserve their power to moderate and promote content on their sites, they need to be treated under the same legislative framework as traditional news media and held to account for the consequences of hosting damaging content. In addition, the government should pursue the creation of a public regulatory framework to guide the moderation of content on social media platforms to ensure community expectations around safety and free speech are met. I understand that work is under way to develop a voluntary code along these lines. At the government's direction, DIGI, an association representing the digital platforms industry in Australia, has developed a draft code. The Australian Communications and Media Authority is overseeing the development of the code, and I hope it's found to be sufficient to address the challenges we face regarding disinformation and defamation. Both measures—treating big tech companies as publishers and the introduction of a code of conduct—are essential. Holding big tech companies to account as publishers would provide an incentive for these companies to follow the code of conduct, thereby ensuring the decisions they take to moderate and promote content are in line with community expectations. I know the minister for communications will consider further measures should the voluntary code prove inadequate to address the problems we face.
Recently the member for Newcastle, Sharon Claydon, and I formed the Parliamentary Friends of Making Social Media Safe group to continue the important discussion around online safety. We are thrilled by the reception of the group so far. Seventy-five members and senators from both sides have joined the group already, which I think displays the interest and concern that so many have regarding social media. One step I've taken in my electorate is to inform young families in particular about dangers online. The internet is not just a harmless space for kids to watch YouTube. Consequently, I've drawn up a handout for the residents of Mallee about the importance of online safety. It's hot off the press and will be sent out this week to all residents in Mallee.
The progress made in global communication and interconnectedness, thanks to social media platforms, has been remarkable. With this progress comes a responsibility to ensure that people are safe when using these platforms. I am focused on fighting for change to ensure our kids and grandkids are safe online and that our society has a healthy relationship with social media, going forward.
Trent Zimmerman (North Sydney, Liberal Party)
Is the motion seconded?
Julian Simmonds (Ryan, Liberal National Party)
I second the motion and reserve my right to speak.
10:47 am
Ed Husic (Chifley, Australian Labor Party, Shadow Minister for Industry and Innovation)
This is a serious issue, and I completely understand the member for Mallee bringing this forward. The member for Mallee has been subjected to unacceptable behaviour online. I know what it's like, as a member of parliament myself. When social media started to get its foothold in this country—when MPs were using it and, in particular, when the public was using it—I felt the full force of this unacceptable behaviour, which was particularly focused on my faith. Certain people online used my faith, at various points, to press their case. It was very Islamophobic behaviour, of a kind that many of us have felt. I know the member for Cowan has gone through the very same thing. In my case, it happened many years ago. The member for Cowan can testify for herself as to what she has been through. It doesn't feel good. You particularly feel it as a member of parliament because you've got a public persona. The people who are close to you feel it, but the people who are in a similar situation feel it really badly as well. It is not a nice thing to go through.
This has not been an issue with today's platforms alone. It's been going on ever since the first virtual online communities were created, back in the mid-eighties. For example, the Whole Earth 'Lectronic Link—the WELL, as it was known—was the first community of this type. Created in the US, it was designed to connect people and, in its creators' view, enable much better ways for people to relate to each other. They thought it would be a positive move. Sure enough, they soon found that it became very problematic. People, for some reason, felt they had a licence to behave in a way that they never would if they were physically in a person's presence. In person, they would never say those things to other people or conduct their affairs in that manner. So there is something to be said about going online and treating people in a way in which we would not if we were in their presence. It is an issue, and it's something that we have not really been able to deal with.
The member for Mallee referenced some of the legislative things we could do here. In the US, a lot of these firms are given licence not to be held responsible for what appears on their platforms through section 230 of the Communications Decency Act. There is an active proposition being pursued by some people in the US Congress, like Senator Mark Warner, to deal with this. But it is a big issue. The thing is, Facebook, Google and Twitter didn't create Islamophobia, homophobia, racism or sexism, but they've certainly helped lend legitimacy to those views, and the amount of material they've had to take down is staggering. For example, in the final quarter of 2018, YouTube removed nearly 19,000 videos and 2,000 channels for violating its hateful or abusive content policy, and in the third quarter of 2018 Facebook took action on nearly three million pieces of content that violated its hate speech policies and 15.4 million pieces of violent and graphic content.
Good on them, because, when I raised some of the material that appeared previously, it took a hell of a long time for anything to happen. I do commend them for taking it down, but it has taken a while. A lot of the online platforms have said it's too hard to know what's on their platforms and to deal with it. I think a lot of people would find that hard to stomach or accept. They do have the capacity to do more, and they should—absolutely—do more. There is still content on pages of these platforms that is absolutely disgusting and should be tackled. Last year, for example, I received a letter from a community organisation that referenced a Victoria University report on online Islamophobia and fascism. The report found that, although Facebook's algorithms are designed to delete posts containing profanities, they don't necessarily pick up on the insidious nature of some extremist material. That report analysed just over 40,000 posts from 12 far-Right Facebook groups whose posts strongly targeted people on the basis of religion.
I haven't heard the government deal with that—not at all. I haven't heard them come in here and say it. I respect that the member for Mallee has been targeted, but we should not have to wait for action because one government MP was targeted. Lots of people in the community have been targeted by this, and we have not seen any action from the government. Worse still, this parliament held an inquiry into this issue in 2018—into the adequacy of existing offences in the Commonwealth Criminal Code and of state and territory criminal laws to capture cyberbullying. It made a number of recommendations, including: placing regulatory pressure on the platforms, to both prevent and quickly respond to cyberbullying; that the Australian government legislate to create a duty of care on social media platforms to ensure safety; that the government increase basic online safety requirements for social media services; and that the Office of the eSafety Commissioner be adequately resourced to fulfil all its functions.
We've contacted the committee secretariat and the tabling office about this inquiry. On 28 March it will be three years since the inquiry's report was tabled, and there has been no government response. So we've got this resolution now, and we've had all these cases where concerns have been raised, yet nothing has been done by the government. And it's good—
Dr Aly interjecting—
Exactly. I'll take that interjection from the member for Cowan, because it wasn't them. For ordinary people in the community who are targeted with a whole host of things, in terms of racist and homophobic language, those issues aren't taken seriously. The government has a report on this. They've been asked to deal with it, and they don't. This is the real issue: it shouldn't take one government MP to be affected before this is acted on. I totally respect and understand how much money the member for Mallee, as she detailed in her contribution, has had to spend. She should not have to—no-one should have to pay $100,000-plus in legal fees to deal with this. It should be taken seriously.
The other thing is that it's not just the platforms where this has become an issue; the platforms simply provide the space for it to arise. What I want to know from the government is: why don't they take this seriously at its source, not just at the platform? How come, whenever we have raised the issue of far Right extremism, which has driven a lot of bad behaviour online, it has taken ages for it to be responded to?
We only got an inquiry into far Right extremism when the government were satisfied that it didn't target far Right extremism specifically and didn't even mention it; it could only be referred to as 'extremism'. Yet we've been saying for ages that this is an issue. The agencies have said, 'This is a problem.' They are concerned about words transforming into deeds and impacting on people's safety, and we've had no serious commitment from those opposite as a government. A responsible government would take this seriously. They would absolutely treat this seriously. They would go to the source, not just to the platform that creates the environment for this hateful material. They don't take it seriously.
If the government were serious not only would they respond to reports saying, 'Treat cyberbullying, hate speech and this terrible online behaviour seriously,' and deal with that but they would also treat seriously this issue of far Right extremism. I don't care if it is an Islamist or a far Right extremist; anyone who threatens the Australian public should be dealt with forcefully. But it seems to me that it takes a hell of a long time to deal with far Right extremism. With Islamists we've seen proscription of groups. We've seen a hell of a lot of action on banning groups that don't even operate here if they're Islamist. But, if they're far Right, it takes a hell of a long time for anything to happen.
The minister here is getting uptight about it. But do you know what makes me uptight? I don't like it when I see people using Nazi salutes in protests in Melbourne. I don't like it when I see swastikas being held up in people's homes. I don't like it when we hear of those groups collecting ammunition and weaponry. And I don't like it when we have a government minister in the form of the home affairs minister who cannot mention far Right extremism without having to also reference Antifa, as if you can only acknowledge it if it is balanced out that way. It's wrong. People deserve to have their safety taken seriously. (Time expired)
10:57 am
Julian Simmonds (Ryan, Liberal National Party)
I'm very pleased to rise to support this motion from my good friend the member for Mallee. I also find myself agreeing with a fair bit of what the member for Chifley has said about the importance of this issue. It's the second time I've found myself on a unity ticket with the member for Chifley, after his comments on aviation. It's becoming a bad habit for me to agree with you, Member for Chifley!
But I wanted particularly to rise and express my support, and my great concern about this issue as it affects me and my community. As one of my generation, I've lived much of my life with social media as a presence in our everyday lives. I want to preface my remarks by saying that I am, as most people in this chamber are, a strong believer in freedom of speech and in the freedom we have to express our opinions. However, a line needs to be drawn when it comes to issues of safety, and there has to be a reasonable discussion about the level of accountability that social media companies must take as publishers.
Social media is so powerful and is such an opportunity for us to stay connected, particularly during this time of COVID, when we were physically separated. It has been so odd. But with great power, as the saying goes, comes great responsibility. Right now, these social media companies are taking no responsibility for the platforms they have created and the purposes for which they are being used. The attitudes of Australians are changing. When social media started, Australians were willing to accept a lot of downsides for the amazing connectivity it created. But Australians are now realising—and in my electorate I talk to people who see it all the time—that they don't have to accept this anonymous rubbish that's being hosted by social media. They do not have to accept, as a trade-off for connectivity, that their kids can be groomed online and that it's so hard to police. They don't have to accept that people will say things on social media that they would never say to someone's face, or act in a way that they would never act in real life.
Commercial television broadcasters operate according to the Commercial Television Industry Code of Practice, developed by the industry. But social media has no such code. Frankly, we need to move faster to level the playing field with social media companies. As the member for Mallee said, social media companies need to be regulated as publishers. If they host material on their platforms, then people need to be able to pursue damages against them, just as in the non-digital world. It's not that much to ask; it's just that the same rules that apply in the real world should also apply in the digital world. It's not difficult. These social media companies have enormous resources. If somebody sees anonymous hosted content that they feel is defamatory, harmful or otherwise damaging, they should be able to tell Facebook or one of the other social media giants, and there should be an expectation that, if the platform doesn't remove it within 24 or 48 hours, it is agreeing to publish it and can be held responsible as a publisher for the negative effect that it has. Traditional publishers bear exactly that responsibility.
The most worrying use of all social media is obviously in the context of the incitement of violence, child exploitation and child sexual abuse material. As a young dad, I'm obviously very passionate about preventing this. I recently visited the Australian Centre to Counter Child Exploitation, the ACCCE, in Brisbane, established by this government. One of the things raised with me by the incredibly dedicated individuals there trying to track down this material is the barrier that social media platforms create when they are trying to find, stop and prosecute these vile individuals. Did you know that Facebook and these other social media giants are objecting to one in every five lawful requests that our law enforcement agencies make for access to individuals who are grooming children or posting child exploitation material online? Just think about that: 20 per cent of lawful requests from our law enforcement agencies to these social media publishers for the details of people committing the most heinous crimes are rejected. What possible excuse could there be for that? It's simply not good enough.
The Morrison government is acutely aware of this. It is doing an enormous amount through Minister Fletcher's online safety bill and through resourcing for the eSafety Commissioner that was never there before. But we have more to do, and we need to hold these social media companies to account as the publishers they are.
11:02 am
Sharon Claydon (Newcastle, Australian Labor Party)
I'm very pleased to make a contribution to the debate today on the wide range of threats that are being enabled by social media platforms: threats to individuals, threats to communities and, indeed, threats to the very fabric of democratic systems like ours. I thank the member for Mallee for raising this important issue today and I thank her for reaching out to me to establish the bipartisan group of Parliamentary Friends of Making Social Media Safe, a group designed to put these issues very firmly on the national agenda. We will be launching the Parliamentary Friends of Making Social Media Safe next Tuesday with a breakfast in the Mural Hall, and I encourage all members to come along and join in the discussion on this important issue.
Upfront, I would like to recognise that social media has given us some very positive things and enriched our lives in ways that we couldn't previously have imagined. Through it we have found lost friends, fostered new connections and shared knowledge, free of many of the constraints that exist in the physical world. Indeed, in these COVID times, it has helped bridge the tyranny of distance for many.
But this remarkable reach and ubiquity has a darker side, because it has created fertile ground for some serious threats to flourish: for individuals to be defamed and their reputations sullied; for vulnerable people to be bullied, harassed and exploited; for coercive control to be utilised to instil fear in women; for entire communities to be maligned, victimised or persecuted; for the amplification of hate speech from far-Right extremist groups; for the glorification and incitement of violence; for dangerous misinformation to spread like wildfire; and for democratic processes to be undermined and subverted. These things matter and these threats are real, but at the moment there are precious few avenues for redress when people have been wronged. For their part, the social media companies often behave like outlaws in the new digital Wild West: answerable to no-one and not responsible for so many of the harms they are enabling.
These platforms can and do moderate content on their sites, so we've moved well beyond the idea that they are merely passive, neutral conduits of information. But they have set themselves up as global entities, effectively, outside all jurisdictions, dodging scrutiny and accountability all too often, and we've seen the ills that have resulted. Enough is enough. Social media companies need to take responsibility for what their platforms have unleashed. They are well resourced and they have an obligation to the communities they currently exploit for profit.
However, government has a responsibility too. The current hotchpotch of laws and self-regulation clearly is not enough, and indeed dominant platforms have even gone so far as to call for governments to regulate them properly. So what is taking this government so long? Frankly, the regulatory environment is a mess. The Morrison government talks a lot about the online safety act, but the fact is that that act still does not exist, despite all the talk. Similarly, the disinformation code still isn't in place, and I fear it won't do nearly enough, given that it's only voluntary and doesn't address misinformation as the regulator said it must.
I, along with all of my colleagues on this side, welcome the eSafety Commissioner and the additional resourcing that has gone to her office, but let's not underestimate the enormity of her task. Recent global studies have shown that social media is the new frontier for gendered violence, and we have witnessed its rapid escalation during the COVID-19 pandemic, which is chilling to say the very least and should ring alarm bells for everybody in this House. Some 65 per cent of girls and young women surveyed in Australia have been harassed on social media. If that does not give pause for thought, I'm not sure what does. Given the incredible reach of social media, there is a lot of work to do. I thank the member for Mallee for bringing this forward. It's just the start.
11:07 am
George Christensen (Dawson, National Party)
'I don't think most people want to live in a world where you can only post things that tech companies judge to be 100 per cent true. I believe we should err on the side of greater expression.' They're not my words; they're the words of Facebook founder Mark Zuckerberg from 2019. Moreover, he went on to say this little gem in a speech in front of Georgetown University students in 2019: 'I don't think it's right for a private company to censor politicians.' We had Jack Dorsey of Twitter tweeting that 'Twitter stands for freedom of expression'—that was in 2015. Oh, how times have changed. We now have these big tech social media conglomerates censoring political speech all over their platforms. I for one do not think that is right. It is an attack on democracy itself. It is an attack on free speech.
If you can censor and deplatform the leader of the Free World, you can do it to anyone. There's an old saying: 'Whoever takes down the king becomes the king.' These social media giants, these big tech corporations, are now way, way too powerful. They control the new town square, the new public forum, where political discussion—all sorts of discussion—goes on. Of course we should clamp down on speech that is harmful, defamatory and illegal, and where the content is actual hate speech—not just speech that we hate, but actual hate speech; there is a difference. We should clamp down on all of those things that would not be allowed in a newspaper. But the problem is that these big tech companies have gone far too far. They are now clamping down on political discussion, political commentary and political views, and they're also putting in these so-called fact checkers, who give the impression that a fact you have posted is wrong when it is in fact correct. They do that by saying it's missing context. Tell me which news story isn't missing some form of context! So this is a very, very big problem for democracy.
Unlike other speakers in this place, I do not wish to further censor these big tech companies. I do not wish to censor what people are saying on social media platforms. But I will concede this: they're actually now no longer platforms; they're publications. They're censoring speech that is lawful on what were once platforms and are now basically online publications. That is why, if these big tech companies don't bring themselves back to the point of being social media free speech platforms, as they were originally intended, then they probably should, as the member for Mallee has suggested, be liable for defamation. They've basically become publications now, publications which have editorial guidelines—they call them community standards, but now they're basically editorial guidelines. They're major publications which have a whole heap of volunteer contributors. If that's the model that they want, then that is what they're going to have to live with, and defamation and their liability for defamation are going to have to be part of their business as usual.
I wish though that they would just go back to what they were originally intended to be: platforms for free expression. I don't think we're going to see that, but I've got to say that, whether we make these social media giants go back to being platforms or whether we say, 'You're now acting like publications and so you should be liable for defamation,' they should be the focus of government. We shouldn't simply have legislation which makes these big tech companies pay big news companies. I don't want to see big tech corporations paying big news media corporations. What I want to see is legislation here protecting the little guy, protecting the average Australian, and that is what this motion is calling for.
11:12 am
Susan Templeman (Macquarie, Australian Labor Party)
I welcome the opportunity to speak on this issue and support the comments made by my colleagues the member for Chifley and the member for Newcastle. I really want to speak about the challenges faced by small-business women as they conduct their business online, especially on social media. Prior to COVID, many actively avoided social media for business purposes, but no-one could help but get online during COVID. What this has led to is a rise in anxiety about what comments will be made about them, their business, their competitors or their customers as the social media tragics target them. All of us as MPs would be well aware of the tactics: people hiding behind sometimes numerous fake Facebook profiles, sometimes—but not often—using their own names; comments appearing in the late hours of the night or the early hours of the morning, never alone but always with a couple of others happy to pile in, at a time when they know neither we nor our staff can monitor them. Their skill is in inciting others to engage and respond to their nonsense. Some people just can't help themselves. As MPs, we recognise we'll be targets, and I know we do our best to deal with those comments, many of which I now take a hard line on and choose to delete or hide, because I want people to feel they can have a rigorous but respectful discussion around policy on my page. But it is hard to get to them all. And that's us, with our resources. What worries me is that small and micro businesses are really feeling the pain.
I've discussed this with members of women's business groups, and the level of anxiety about how to deal with it is through the roof. Some have reported being afraid to post in groups for fear of the response it will trigger. Others are concerned that conversations on their page can quickly escalate way beyond the original topic and they're just not sure how to manage it. The eSafety Commissioner, Julie Inman Grant, released a report nearly a year ago showing a 40 per cent increase in reports of online abuse and cyberbullying in just the first few weeks of COVID compared with the previous 12-month weekly average. The commissioner has said she believes the increase in online harm is unlikely to go away.
For me it feels like a full-time job keeping on top of it, but for small business this is not something they should have to put up with. I'm not waiting for legislation to support these women in business in coping with the onslaught. It is a wait, by the way. The government promised to bring in legislation last year, but instead, two days before Christmas, they released a draft—two days before Christmas. It was almost as if they didn't really want feedback on it. The submissions closed on Valentine's Day. If anyone in my electorate missed out on putting in a submission, please send it to me, and I will make sure it gets to the minister.
The draft legislation provides the eSafety Commissioner with some additional powers to unmask internet trolls, but, as I said, I am not waiting for this government to legislate, because we've waited far too long. Working with the Office of the eSafety Commissioner, I'm running a workshop next month to empower women in business in my electorate to deal with cyberabuse and cyberanxiety. As Ms Inman Grant has said to me, 'The gendered nature of online abuse is an issue we've been grappling with for some time.' She's expanded the Women Influencing Tech Spaces program to offer in-depth social media self-defence as a way of tackling this issue head-on.
I'm very pleased to be joining with the Office of the eSafety Commissioner to provide advice and practical support for women in business in my electorate, for whom social media is a vital professional tool. I'm partnering with Women with Altitude and the Hawkesbury Women in Business group, with invitations to go out through the chambers of commerce. It's open to all women in business across the electorate of Macquarie through the Blue Mountains and the Hawkesbury. This is something I hope will help women be better prepared to recognise online harassment and cyberabuse; to know where and to whom and when to report it; to help them make decisions about whether to respond and the best ways to do that; and to be aware about how online harassment can affect wellbeing. This will be a really practical forum with practical advice and an opportunity to discuss the things that I know women in micro and small business are dealing with every single day. The online workshop will be from 6 pm on Tuesday 9 March, and all of the details and RSVP links will, appropriately, be on Facebook. So look out for it. I urge women to join us. We should be using social media as a tool that lifts up the businesses that we work in and get one step in front of the people who are trying to pull us down, and that's what this workshop is aimed at.
Trent Zimmerman (North Sydney, Liberal Party)
The time allotted for this debate has expired. The debate is adjourned, and the resumption of the debate will be made an order of the day for the next sitting.