
Learning to love AI

Some say Socrates was a smart guy, but when writing began to spread he feared it would destroy his students’ ability to think. That fear is being repeated in the response to the release of ChatGPT, currently banned in schools – ostensibly because its terms of use require users to be 18 or older – while the Department of Education undertakes further investigation. No doubt many will argue for an all-out ban, concerned that these new forms of AI are the final nail in the coffin for critical and creative thinking. But before we carry our tasers to the Town Hall and take down Skynet, we need to consider what paperclips can teach us.
First, the ‘paperclip test’. In the 1960s, NASA scientists were asked to come up with as many uses for paperclips as they could imagine. It’s a test we still use today, and while most of us average about 20, the geniuses amongst us can apparently come up with as many as 200 uses. But if one of those uses was to replace all the fuses in their house with paperclips, then the quantity of someone’s ideas starts to matter less than their quality.
As a ‘brainstorming’ exercise, I make my senior visual art students use tools like the semiotic square, which forces them to dig down into each of their concepts to uncover the complex web of symbols and personal significance they contain. This is what makes their artwork meaningful, and it’s the first and most important step in the creative process. It’s also one that no AI could ever do for them.
But when it comes to visualising their ideas, AI can be more powerful than a pencil. Instead of being limited to the lines they can draw to give life to their concepts, my art students already use programs like DALL-E and Midjourney to visualise possibilities, which they then critique, share, and reflect on to develop their work.
While I understand concerns that these programs draw on copyrighted material to generate finished artworks, as long as my students document their use as one aspect of their development and feedback, using them is no more cheating than using photos as a reference for paintings.
Of course, AI can already do much more than help my Art students visualise, or my Media students storyboard their short films. Programs like ChatGPT can even write their screenplays or draft their analytical essays. So, are we losing the very thing that makes us human?
This brings me to the second lesson, care of paperclips: a thought experiment devised by Nick Bostrom in 2003, innocently referred to as the ‘Paperclip Maximiser’. Because AIs only do what they are programmed to do, without proper guidance even a benign command can lead to disaster. If, for instance, we set up an AI to run a paperclip factory, what begins as an efficient production manager could inadvertently become a Kafkaesque death-machine, converting all of the world’s resources – including us – into paperclips.
The AEU already has a ‘Technology and Teaching Policy’ that addresses this point. Not the paperclip death-machine exactly, but the need to monitor the use of AI to make sure the tasks we set it align with the needs of real human beings through consultation with real teachers.
The problem is, we can’t possibly influence the implementation of AI if it is banned from the very places we would use it. We can’t stop what’s coming any more than Pandora could get whatever she let out back into that box of hers.
Which brings us to our last allegorical paperclip: Clippy. Anyone under 30 has likely never met the animated, mansplaining Microsoft Office assistant, but he is about to make a comeback – and this time he’s bringing friends. By the end of the year, AI will be integrated into every program, search engine and device we use. Unless we ban electricity in schools (think of the savings!), there is no way to keep AI out of our classrooms. So, instead of worrying, I suggest we learn to love the way it can work for us.
Teachers are already finding AI useful for clarifying and simplifying complex topics, or for translating course work into unit outlines and lesson plans. Others have used ChatGPT not only to generate model answers to questions, but to critically analyse those answers for factual accuracy and bias.
There will be disruption. But who knows – all this technology might even bring us full circle, back to an education system centred on oral discourse and collaborative problem-solving. Socrates would appreciate the irony.