It’s a kind of magic: AI and the future of education

  • By Jeff Sparrow
  • 5 Apr 2023

Artificial intelligence is raising key questions about the very nature and purpose of education.

Last November, OpenAI released the language model ChatGPT. If you haven’t played with it, you really should, if only to grasp what’s coming round the corner. ChatGPT can write you a haiku about your pet. It can generate a seven-day high-protein meal plan and then convert the recipes into a convenient shopping list.

When a user commands, “Write a biblical verse in the style of the King James bible explaining how to remove a peanut butter sandwich from a VCR”, ChatGPT spits out scriptural paragraphs beginning: “And it came to pass that a man was troubled by a peanut butter sandwich, for it had been placed within his VCR, and he knew not how to remove it. And he cried out to the Lord, saying, ‘Oh, Lord, how can I remove this sandwich from the VCR, for it is stuck fast and will not budge?’”

All good fun – except, perhaps, for teachers, who must now confront the reality that AI can complete, more or less convincingly, many of the tasks by which they currently assess students. In the Atlantic, the American school teacher Daniel Herman describes how he prompted ChatGPT with: “Explain the Madhyamaka Buddhist idea that all phenomena lack inherent existence, using a funny, specific example” – and duly received a convincing philosophical essay centred on a hamster.

He also fed the AI a student draft and instructed it to “fix this essay up and make it better”. “It kept the student’s words intact,” he writes, “but employed them more gracefully; it removed the clutter, so the ideas were able to shine through. It was like magic.”

Of course, AI isn’t magic: the output from ChatGPT often contains falsehoods and glaring non-sequiturs. But it still might be a tempting option for a student seeking merely to pass, particularly as the technology improves. Google is currently trialling its own AI, and a new version of ChatGPT comes online next year. 

On that basis, many are warning about an epidemic of cheating. “At the moment,” says Lilian Edwards of the UK’s Newcastle University, “it’s looking a lot like the end of essays as an assignment for education.” Yet that’s not exactly true. Almost certainly, in the not-too-distant future, most people will use AI to help with day-to-day writing tasks, rather as they already depend on AI-driven grammar and spell-checking software.


In that sense, the challenges for teachers might be compared to those raised by the first hand-held calculators. Just as the new ability of students to perform sums electronically did not render maths moot, AI doesn’t supersede essays so much as put new demands on educators to decide when the technology might be appropriate and when it’s not.

After all, ChatGPT and similar systems lack any understanding of the output they produce. They draw on huge data sets to recognise patterns and then make credible extrapolations from the input fed to them. Precisely because AI doesn’t possess general intelligence, it should only ever serve as a supplement to, rather than a substitute for, human decisions.

Essays and similar tasks thus retain a value, so long as teachers have the time and resources to supervise the circumstances under which they’re written. And there’s the rub.

Experiments have shown that AI programs can also mark student essays with reasonable accuracy. These astonishing advances should spur a profound discussion about the philosophical underpinnings of education. If an algorithm can both complete and grade common assessment tasks, despite lacking any understanding of what it is doing, its ‘success’ does not demonstrate the redundancy of teachers so much as suggest that some traditional tasks never adequately tested human comprehension (which AI, by definition, lacks).

In other words, the growing capabilities of AI should legitimise the kind of education that only skilled teachers can provide – spurring, say, less reliance on tests and take-home exams, and more on the kind of human-to-human interaction possible in a small, well-resourced class.

A productive exploration of AI’s remarkable potential requires a staff fully trained on the technology and with sufficient resources to work closely with individual students as they experiment. A new White Paper from the Education Futures Studio and the Education Innovations research program describes how Automated Essay Scoring (AES) programs will likely be implemented in Australia. One of its authors, Sydney University’s Kalervo Gulson, believes that, in theory, the technology could ease the pressure teachers currently face but warns its implementation might also just mean more work.

It’s an eminently reasonable concern.

AEU policy calls for such software to be implemented under the auspices of the teaching profession itself. Teachers require comprehensive training and genuine knowledge about the workings of algorithms often shrouded in commercial confidence by the small number of corporations that market them. AI systems generally harvest data from their users, raising important questions about privacy and transparency.

In theory, the technology could provide a useful supplement in a properly resourced school – which provides yet another reason why schools must be funded at a level that supports teachers to evaluate and incorporate new tools for managing their own workload and educating their students.

However, the history of industrial innovations suggests that, in the Australian context, the ‘not-good-but-good-enough’ principle will prevail, with AI promoted as a pedagogical advance – and then deployed to cut costs. Under those circumstances, the wizardry of AI could result both in a significant decline in educational standards (as students interact more with software than a human being) and a massive increase in workloads as teachers scramble to paper over the cracks.

In that sense, the issues surrounding this futuristic technology are very old indeed. AI makes teachers more crucial than ever, but it also provides employers with a pretext to devolve certain tasks they don’t properly recognise as central to the teaching role.

The use of ChatGPT has been banned from classrooms in Victoria and other states, but AI raises questions that extend well beyond whatever immediate measures might be taken to govern student use of such systems. We should expect significant challenges to come.
