Care factor: teaching in the age of AI
I recently attended a conference about AI in education, and what I saw scared me. Not the AI tools themselves, but the eagerness of some teachers to let them take over all stages of our work.
I don’t even blame them. In an education system already forcing us to think and work like machines, the lure of machines that can do our thinking for us is too good to refuse. But if we’re not careful, it might spell the end of the most important thing we do.
Long before AI was mentioned outside the pages of sci-fi novels, machine thinking was taking over society. The industrial revolution didn’t just transform fields into factories; it reduced many workers to bureaucratic functions in corporate algorithms designed to return dividends to shareholders.
Now, even the labour of altruistic professions like public education has been encoded into innocent-sounding acronyms aimed at maximising efficiency. Far from saving us time, commodifying teaching has left us so overworked – implementing High Impact Teaching Strategies and documenting curriculum outlines – that we have no time left to wonder why it hasn’t worked.
And now that we’re stuck on this Best Practice hamster wheel, the time-saving promise of automation returns to save the day. No time to write that next unit of work? ChatGPT has you covered. Just type in the topic and it’ll give you a ten-week lesson plan. Need some worksheets and assessment tasks to go with it? ChatGPT will even throw in a marking guide. And when our students use ChatGPT to complete the work we give them, ChatGPT solves that problem too, helping us detect what we now label ‘cheating’.
But it’s when we complete this edu-tech Ouroboros that we go too far. You see, ChatGPT can even assess the student work too, generating personalised-sounding comments for their reports. What a time saver. No need to even read their work at all. Soon a teaching degree might be nothing more than a sub-class of ‘Prompt Engineer’, and we’ll all get 60-plus students in our classes to ‘manage’. Hattie will love it! (Google John Hattie and class sizes.)
During the pandemic, I started running workshops for pre-service teachers, and they always ask me what my best bit of advice is. My answer is simple: you just have to care.
This might seem naive. But if you’ve ever had a teenager say “I don’t care” to your face, you know how powerful it can be. Students protect themselves from uncertainty with the armour of apathy – and, in an increasingly alienating world, making them care has become the biggest challenge we face. As I tell every pre-service teacher I work with, the only way to overcome this is to care yourself.
If you care about the subject you teach, and the students learning it, then almost everything else will take care of itself. You can take or leave Learning Intentions and Success Criteria; follow a student-led inquiry model or a direct instructional one. You can be funny and performative, or quiet and snarky… If they know you care, it will work. The good news is, there is no end to the ways you can show them.
It can be the time you put into your presentations, or how fast you learn their names. It can be how angry you get when bad behaviour interrupts their learning, or the feedback you give on a reworked draft. Caring is what defines our true experiential value, the paradoxical part of our labour that can’t be reduced to a time fraction, encoded into an acronym, or automated by ChatGPT.
As we speak, programmers around the world are trying to work out how to keep the next generation of AI from turning us into paperclips. They euphemistically call this the Alignment Problem, since it boils down to making these superhuman thinking machines align their goals with our own. But if we let corporate doublespeak obfuscate what those goals are, then there won’t be anything left worth aligning to.
So, it’s time we stop thinking like machines, and work out what we care about. Who knows, maybe then the programmers will realise that the answer to their problem has been staring them in the face all along. Who better to teach a new mind how to care than us?