Important lessons on game-changing chatbot

When ChatGPT burst onto computer screens around the world in late November 2022, Michelle Dennis knew the artificial intelligence chatbot was a game-changer. “It very quickly became apparent it would have an impact on classes,” says Dennis, head of digital at Haileybury independent school. “We knew we needed to have a policy before the school year started.”

Now, 18 months down the track, she and her colleagues at the Melbourne-based school have shaped a curriculum that includes lessons on how best to write prompts for generative artificial intelligence and the importance of verifying AI-generated text, and they are sharing AI policies and principles with schools across Australia.

“It’s a time where we need to be open and support each other because it’s changing all the time,” Dennis says. “I would like to see more school leaders engaged in that conversation and looking at how we can create those safe playgrounds for students to learn those tools they’re going to need for the future.”

Developed by US artificial intelligence company OpenAI, ChatGPT can seemingly work magic with generative AI and produce articulate answers to any request tapped into a question bar. The bot can write lyrics in the style of Bob Dylan, or a speech in the manner of Martin Luther King. Later iterations can follow text commands and produce an image of, for instance, a bloodied fighter or spring blossoms in the snow.

Free, easy to access and easy to use, ChatGPT and similar AI chatbots have immense potential for teaching and learning, but they can also help students cheat by generating text or images for them, or by coming up with ideas for projects.

With four campuses in Melbourne, an online campus, a school in Darwin, and a partner school in China, Haileybury decided to embrace the chatbot and teach students from Year 8 upwards how best to use the power of this nimble, multi-faceted new tool.

The school uses Microsoft’s Copilot chatbot to provide a level of privacy, and teachers can use AI detectors to check assignments for AI-generated text. These detectors are not 100 per cent reliable, Dennis says, and a teacher’s understanding of a student’s voice often provides a better reading of a work’s integrity.

Many Australian schools are wary of generative AI and have taken a protective posture towards it, Dennis says, noting that looming questions of privacy, security and assessment continue to cast a shadow over AI’s teaching and learning potential.
“We warn staff and students not to enter personal or identifying information on artificial intelligence platforms,” she says.

Haileybury has generative AI guard rails, she adds. Students are informed that academic integrity is important and there must be honesty and transparency about where ideas come from. Critical thinking and ethics must come into play, and students must look for bias, question facts, cite sources, and use research skills to validate data.

The Australian