Australia is “playing catch-up” on the use of a popular chatbot in the education system, federal officials have conceded at a parliamentary inquiry.

Education officials on Wednesday faced a parliamentary hearing on the use of generative artificial intelligence as the rapid rise of the most popular tool, ChatGPT, continues to challenge standard ways of learning, teaching and grading.

Students have seized on the chatbot since it launched in November to get answers and create text and other materials, including homework.

Meanwhile teachers are using it to compile quizzes and draft lesson plans, but they have been urged to understand its limitations.

Large language models – the technology behind the tool – can struggle with Australia-specific content because they draw from a vast swathe of information on the internet and can present biased or false answers, the inquiry was told.

“We’re really in the space of playing catch-up,” Julie Birmingham, first assistant secretary of teaching and learning, told politicians.

“I don’t think there’s much we can do about that, apart from try and catch up,” she said.

Asked if Australia should build its own version of ChatGPT for schools, she said it was an idea that had been floated.

For example, South Australia is working with Microsoft to pilot a fenced-off version for use in the state’s classrooms, but individual schools will determine when it is suitable and safe.

The Tertiary Education Quality and Standards Agency (TEQSA) is working with Deakin University on what AI means for academic integrity to develop guidance on how assessment will change.

Acting education department assistant secretary Chris Davern said a “nuanced” approach to assessment would be needed, rather than going back to handwritten essays and verbal exams.

The Australian Catholic University, which trains many of the nation’s teachers, said generative AI will not replace a student’s deep understanding of a field of study, or the ability to tailor a lesson to a classroom.

But it would assist their comprehension of academic content and provide access to a broader range of ideas, ACU said.

“The challenge is finding the right balance between the use and abuse of AI in student work,” its submission said.

New short courses or micro-credentials could explain what educators at all levels need to know about the risks and benefits, but funding remains unresolved.

“There’s already a hunger for information – principals are saying make this easy for us,” Ms Birmingham said.

But she warned the “digital divide” between schools that have high-speed internet and the latest equipment, and those missing out, will always be a challenge in remote areas.

Australia is also involved in global discussions and has been showcased by the Organisation for Economic Co-operation and Development (OECD) as a leader in policy development in this area.

The organisation said in a submission it is still unclear what roles generative AI will play – inside and outside of the classroom.

OECD project lead Stephan Vincent-Lancrin said adaptation must also extend to other smart technologies, such as social robots for personalised learning, tools that improve student engagement, and technologies suited to students with special needs.

But these technologies still require a “human-in-the-loop” rather than replacing teachers with a digital version, he said.

Marion Rae
(Australian Associated Press)