House debates

Wednesday, 27 November 2024

Committees

Employment, Education and Training Committee; Report

11:38 am

Cassandra Fernando (Holt, Australian Labor Party)

Generative AI has experienced extraordinary growth in recent years. From its origins in the 2010s as a tool for generating simple chat responses, it has now evolved into a sophisticated technology capable of producing videos, images and long-form essays.

Today, gen AI is being used extensively in workplaces, educational institutions and even by some within this very building. This technology is not a trend; it is reshaping the way we work, learn and interact. It is rapidly becoming embedded in workplaces, education systems and society at large. As a nation, it is crucial that we stay ahead of this technological curve.

Our government must establish clear regulations, guidelines and expectations to govern its use responsibly. This will safeguard Australians and ensure that we fully harness the potential of gen AI to enrich our society. Equally important is preparing students to engage with this technology safely and effectively. Educators must be equipped with the tools, knowledge and training to teach AI literacy meaningfully. Integrating AI into classrooms should enhance learning while preserving the critical thinking and creativity that underpin a quality education.

The House of Representatives Standing Committee on Employment, Education and Training recently concluded its inquiry into the use of generative artificial intelligence in the Australian education system. This inquiry began in May 2023 and involved 15 public hearings across Canberra, Melbourne, Sydney and online. It received over 100 submissions from stakeholders including state governments, universities, professional associations, trade unions, experts and private individuals. The committee's work resulted in the report Study buddy or influencer, which outlines 25 recommendations. These recommendations address the risks, opportunities and best practices associated with genAI in education, offering a clear roadmap for its responsible integration into our schools.

One of the report's key recommendations is for the Australian government, in collaboration with state and territory governments, to develop and fund a comprehensive implementation plan. This plan would deliver training for teachers, support staff, students, parents and policymakers to use genAI effectively. Educators need professional development through virtual and in-person training modules to ensure they can integrate AI into classrooms responsibly. The report also highlights significant risks associated with genAI.

Data protection is one of the most pressing concerns. Many AI tools collect and store the information entered into them and harvest data from across the internet. This raises serious privacy issues, especially for students who may unknowingly share personal information that becomes public and is owned by developers. Another critical risk is the growing digital divide. Students from low socioeconomic backgrounds, who may lack access to computers or school-provided laptops, are at a significant disadvantage compared to their wealthier inner-city counterparts. If we are committed to equity in education, bridging this divide is essential. All students must have access to the necessary technology as well as the training and support to use it effectively.

Australia is not alone in addressing these challenges. Jurisdictions such as the EU, Canada and the US are grappling with similar issues. Engaging with our international partners will allow us to develop consistent guidelines and share best practices. By working together, we can create a global framework that supports the ethical and effective use of genAI in education. GenAI holds immense potential to enhance education, but it must be approached with foresight and care. The Study buddy or influencer report provides Australia with an opportunity to lead the way in integrating this technology into schools to benefit all students, regardless of their backgrounds. I would like to extend my thanks to the chair, the member for Bendigo, Lisa Chesters MP, for her leadership throughout this inquiry and to the committee secretariat for their work in producing this report.

11:44 am

Zoe McKenzie (Flinders, Liberal Party)

I rise to speak about the House of Representatives Standing Committee on Employment, Education and Training's inquiry into the use of generative artificial intelligence in the Australian education system and its final report, Study buddy or influencer, released last August. I echo the member for Holt's comments on the exemplary leadership of our chair and the enthusiastic and intelligent contributions of our colleagues and, indeed, the secretariat. I do start, though, by thanking the Minister for Education, the member for Blaxland, for referring this essential question to the committee after some overly enthusiastic advocacy for it from a nameless coalition member of the committee, who will not be named in this place.

When it comes to the future of our education system and the impact generative AI will have on it, we need to be bold, imaginative and optimistic. That is not where I started in this inquiry, but it is certainly where I ended. Two years ago I stood in the main chamber and delivered my first speech, in which I feared a decline in education standards as devices dislodged books in our education system and increased the incidence of concentration difficulties, impulsivity, sleep disturbance and poor language development. When ChatGPT was released onto the global market in November 2022, I feared that my most pessimistic predictions were all too proximate.

But in April of last year, 2023, I went to Europe and met with the leading public policy analysts in this space, Tia Loukkola and Andreas Schleicher from the OECD, and later that year in Belgium I met with the authors of the EU's Digital Services Act and those who were holding the pen on the EU's generative AI laws. I met with organisations whose job it was to equip and upskill the European teaching workforce to deal with technology and generative AI and, specifically, large language models.

So my fear about falling education standards was replaced by excitement about what it might mean in the Australian classroom. Our children are not performing well. Look at our PISA results. Between 2003 and 2022, Australia's mean performance decreased by 37 points, which is almost 10 per cent, and we now have the most disrupted classrooms in the world. Equally, our teaching profession is in crisis. We need thousands more to come into the profession, and we need those who are already in it to stay. Quite a number who go into teaching degrees these days have barely passed their own year 12 finishing certificate.

For reasons we all struggle to understand, teaching remains a poorly regarded profession in Australia, despite the enormous impact it has on every single one of us and the sensible investment in teacher attraction innovations like Teach for Australia, whose graduates are making a great impact across the country, especially in Western Port Secondary College in my electorate. There would not be a person in this place who is unable to identify a teacher who made it possible for them to stand here today. In my case that teacher was Mr Andrew Barnett. He taught me Australian history and economics, and he is the reason I stand here today. And without a doubt, everyone here across the parliament could name, in a millisecond, that teacher who changed their life. Yet it is estimated that by next year Australia will be short by at least 4,000 secondary teachers.

As this inquiry showed, generative AI gives us an ability to reimagine the education model from the bottom up. In an opinion piece that I wrote for the Financial Review a year ago, I imagined a world in which a hologram would appear in the classroom. It might be a hologram of Oscar Wilde, or it might be Shakespeare, or George Orwell. It would be imbued with the wisdom, writings, research and imagery of the time and be able to converse with schoolchildren in a language they understood as though the Bard were really there. At the time, even I felt as though I was describing a wonderland—except that a few months later Loughborough University in the UK did exactly that: it brought into a lecture hall a global expert from somewhere in the United States through Proto holoportation technology, to interact with students as though their lecturer were not thousands of kilometres away.

Recently I was listening, as I do almost every night—we are sad people in this place!—to a podcast about technology and how it is changing our lives. On Saturday night I listened to an episode of Your Undivided Attention, probably the best podcast for analysing tech for good, tech for bad and tech for terrible, as I like to think of it. In this episode psychologist Esther Perel described how someone had built a chatbot of her. Frustrated that he had been unable to secure a consultation appointment with her directly, he sat down and built a large language model filled with all of her words, her teachings, her podcasts, her interviews and any other material he could find. 'AI Esther' didn't have trouble setting up an appointment. In Perel's own words, AI Esther 'fundamentally helped' the man, and the advice and consultation were 'illuminating' and gave him 'tremendous peace'. He did not need the real thing anymore. The bot had just enough wisdom to help get him through a difficult time.

I'm absolutely not recommending this model for psychology—not one bit. I am a big fan of all the messiness of human relations and the need for other human relations to help us make sense of them. But I'm not so pure when it comes to teaching, and we must recognise that this model has the potential for teaching our children—at least in part, at least the basics. You see, a large language model teacher designed for each particular child could move at the pace of that child, understand the proclivities of that child, and encourage or cajole the child in a way that would actually be effective and get them back to the learning table—because these devices already know our children better than we do. In this way, generative AI and edtech can be a huge and beneficial resource for our schooling system, but I hasten to add most forcefully that education cannot and must not just be about devices. Education must remain both a social and a human activity, and it must always be thus if we are to equip young people to make a contribution to this world. The edtech devices should be allowed to do what they can do better or what they can do in an environment where we just don't have the adequate skilled teacher workforce that we need, but we must preserve the human side with as much, if not more, expertise.

Throughout the hearings of this inquiry, I asked a number of our witnesses, 'Could you just tell me what human attributes you think we're still going to need in society in five, 10 or 15 years' time?' It's a question I asked of my European interlocutors as well. Will critical thinking still be important? Will grammar and mathematics still be important? Will young people need to know geography? Will people need to know how to speak another language? Generative AI could wipe out these skills in less than a generation in the same way that ubiquitous smartphones have affected younger people's ability to read an analogue clock or to read a map. Throughout this inquiry, I learned that this notion has a discrete term in the context of generative AI: it's called 'cognitive offloading', meaning just parking some of those skills, tasks or knowledge that we simply don't need anymore—until we do. I fear that, without some degree of vigilance, cognitive offloading will take with it some of our creativity, our innovation, our resilience, our lateral thinking and, most concerningly, our wisdom.

In this country at this point, most of the public policy settings and alterations we are implementing regarding generative AI are being done as the AI plane flies through the sky. We are approaching it as we would any other education tool, like it's the calculator, the computer or the compass, but it is like none of those. Generative AI will change every aspect of the workforce. It will change every aspect of our learning journey as well as the skill sets we need at the end of it. Half of our current jobs will lose their tasks to large language models. The question is whether we recognise that in time and whether we guide generative AI, especially in education, for good.

One of the recommendations of this report is to maintain Australia's competitive edge in this domain. Attending the TEQSA conference last year, I learnt that Australia had actually benefited enormously from the fact that ChatGPT was released in November 2022, giving our university sector time to prepare for the academic year. That is why in recommendation 25 of the Study buddy or influencer report the committee recommends that the Australian government establish a centre for digital education excellence, modelled on the existing CRC centres, which would act as a thought leader in relation to both the use and the development of generative AI in school and university settings. I note that that recommendation was equally picked up and reiterated in recommendation 3 of the coalition members' additional comments to the Joint Select Committee on Social Media and Australian Society's final report, issued last week.

I thank for their remarkable work Jason Lodge at the University of Queensland and Danny Liu at the University of Sydney, who are nothing short of visionaries in this space. We need to support and sustain not just their imagination but their practical exploration of what is possible within the Australian education system using this technology, this intervention and this revolution for our national benefit. There are many models to ensure that we can capture such thought leadership, and this report suggests we need to look at them with haste to make sure Australian minds shape the development and adoption of best practice in generative AI in the Australian education system.

This report, like the social media report released last week, provides a pathway to ensure Australia has a clear plan for promoting tech for good, managing tech for bad and eliminating tech for terrible.

Debate adjourned.