Australian schools are increasingly using artificial intelligence tools for assessment and teacher assistance, to provide individualised feedback to students and to create new ways of learning. However, the take-up has been patchy, says Therese Hopfenbeck, director of the Assessment & Evaluation Research Centre (AERC) at the University of Melbourne, adding: “From a research perspective, we are very concerned about the increasing gap in AI literacy for teachers and students”.
Knowledge varies across the education landscape regarding which AI tools are the most efficient and helpful and “ethically good to use”, she says. She has been amazed by the “excellent work” with AI in some schools, but “then you can go to other schools, and you will find teachers who do not interact with these models at all”.
AI educational tools that are ethical and available for classroom use can provide very efficient support for both students and teachers, Professor Hopfenbeck adds, but a broad understanding of how to assess the quality of these tools is lacking.
Independent Schools Australia (ISA) chief executive Graham Catt says choosing suitable AI educational tools to use in schools requires time and effort. “Any tools and programs used in schools need to be safe in terms of privacy, have educational integrity and rigour and be aligned to the Australian context,” he says.
Schools have told ISA that they often feel “bombarded” by Australian tech companies spruiking their AI educational products.
“They are understandably cautious in assessing them,” Catt says. “Teachers and students have differing levels of GenAI literacy. For some, developing this literacy involves professional learning and takes time. At the same time, for the many students and teachers who are already using GenAI, schools need to develop the appropriate guidelines and guardrails.”
Since the US-based tech company OpenAI launched ChatGPT in November 2022, GenAI has surged across industry, business, education, health and the philanthropic sector. Doctors use GenAI to write their medical notes; companies use AI to monitor inventory; AI-dubbed films are screening in cinemas, and an independent school in London last year reportedly announced an AI platform would teach a group of its GCSE students (with three teachers, or “learning coaches”, overseeing the students).
In Australia, many students and teachers are already individually engaging with GenAI via a range of tools and platforms, Catt says, and schools are working hard to provide support and guidance.
“There is a lot of work happening behind the scenes to support all schools and sectors,” he says, “but it takes time to develop and then to go through the necessary government processes and quality assurance.”
While coming to terms with the logistics of finding, introducing and monitoring suitable AI educational tools, schools must be aware that “human capacities such as social connection, critical thinking and creativity should be protected and not diminished by a machine-driven world”, Catt adds.
Hopfenbeck says some educators have expressed concern that GenAI will primarily be used to cheat and will eventually sap all intellectual rigour from any given field of study. She says that is an “old-fashioned view”, and she thinks differently.
“There are teachers who don’t come with that attitude, but instead, they are changing the way they are assessing,” she says.
“So a classic example is, for instance, instead of having an assessment where you ask students to write an essay, you provide an essay to students, and you put students together in a group, and you ask them to discuss how they can improve the essay using an AI chatbot.
“If you have teachers who use this at the right time in the right context, it actually pushes critical thinking. It pushes problem-solving skills.”
Hopfenbeck says Australian schools are currently using GenAI for both teaching and assessment, and she has met teachers who have undertaken their own independent research into GenAI models.
“You will see curious teachers across the board who are experimenting, who are reading, who are in networks globally, where they are sharing how they’re using AI,” she says, adding these networks are popping up around the country.
“This is so complex that we can’t work alone. Those who are doing the best are those who are working together in networks, sharing knowledge and supporting each other.” Educators should ensure that research, rather than tech companies looking to boost their markets, drives choices of GenAI educational tools, Hopfenbeck says. “Because when we don’t have a structure around these issues, many of us are really concerned that it will be in the hands of those with economic interests and not students’ interests.”