I’ve just switched to an electric car – well, about three weeks ago now. It’s nothing exciting or flash; it’s not a brand-new Tesla. It’s a four-year-old Vauxhall Corsa-e with 50,000 miles on the clock that actually cost less than its internal combustion equivalent.
What’s fascinating about this car is how it just slides into your life. Apart from the green stripe on the number plate and a small ‘e’ badge, you wouldn’t know it’s electric. It looks identical to its petrol counterpart – same interior, same functionality, all the modern conveniences like autonomous emergency braking, parking sensors, heated seats, and climate control.
This car handles all our micro-journeys perfectly – the short trips of a few miles that account for most people’s driving: a 12-mile round trip to the gym, a quick dash to Sainsbury’s when the weather’s poor, all those little local journeys that sit comfortably within even a modest electric car’s range. We charge it two or three times a week, and while the WLTP rating claims 209 miles, realistically we see 170-180 miles in good weather, dropping to around 140 in winter.
Here’s the thing though – while this car looks identical to its petrol sibling, it behaves completely differently. The pickup from 0-30mph is extraordinary; off the line it out-accelerates almost any comparable internal combustion car. But this means you have to think differently about driving. If I put my foot down the way I might in a petrol car, I’ll hit the speed limit – and exceed it – before I realise what’s happening. The battery weight sits low in the chassis, so it corners differently. It amplifies both your skills and your errors.
You can’t behave in an electric car the way you would in a petrol vehicle. It just doesn’t work.
Students Are Already Using AI – Without Our Guidance
This is what’s happening with AI in education. AI has slid into students’ lives just as seamlessly. They’re already using it – not in Google Classroom, where most schools are still debating how, or whether, to engage with AI at all, but at home, unsupervised, for homework, research, and assignments. While we debate whether to allow AI in schools, our students have moved far beyond that conversation. They’re already in the driver’s seat of a powerful technology we haven’t taught them to use safely.
And here’s the crucial reality: we never really taught them how to navigate digital tools properly in the first place – we just left them to figure it out for themselves.
It’s like putting new drivers straight behind the wheel of the latest generation of electric cars without any real driving instruction. Electric cars are easier to use – just put it in drive, press the pedal and away you go – but you’re still driving in the real world with all its attendant hazards, only now in something blisteringly fast and far more complicated.
This is what AI is to traditional tech – the force multiplier, the super turbo boost. But without guidance, it has the potential to end in a complete car crash.
The Academic Integrity Challenge
AI forces us to completely rethink academic integrity. Traditional assessments are increasingly difficult to safeguard against AI assistance, and we face an impossible choice: ban AI and risk driving usage underground, or ignore it and leave everything vulnerable to misuse.
We need clear AI codes of practice that acknowledge reality rather than pretending we can control what happens beyond our classroom walls. This means designing tasks that work with AI rather than against it, and developing new ways to assess not just what students produce, but how they use AI to get there. And let’s face it, not a moment too soon – we should have been assessing process long ago.
The Assessment Design Dilemma
Like my electric car that looks identical to its petrol counterpart but behaves completely differently, AI-integrated learning appears familiar while fundamentally changing how everything works. We need to blend digital AI-assisted tasks with analogue assessments, evaluating students’ thinking, creativity and teamwork alongside their AI collaboration skills.
This isn’t just about catching cheating – it’s about teaching students when AI enhances learning versus when it becomes a dangerous shortcut.
The Learning Process at Risk
Students who view AI purely as a shortcut rather than a learning tool undermine the valuable friction that builds understanding. Some struggle is essential for deep learning – we cannot smooth out all cognitive load without losing the process that creates genuine knowledge.
Worse, unsupervised AI use breeds dangerous habits. Students experimenting with deepfakes, consuming AI-generated misinformation, or losing the ability to distinguish between their own thinking and AI output face risks we’re only beginning to understand.
What This Means for Tomorrow’s Classroom
Just as I had to learn different driving behaviours for my electric car despite its familiar appearance, we need to teach students different learning behaviours for an AI-integrated world. This means:
- Explicit AI literacy curricula that address both capabilities and risks
- Assessment designs that assume AI access rather than trying to prevent it
- Clear policies that guide rather than prohibit AI use
- Verification skills for an AI-saturated information landscape
The Choice Is Ours
I love my electric car and I’m never going back, but I recognise I needed to think about it differently from any other car I’ve owned. Similarly, AI technologies in education require us to approach teaching and learning very differently from any classroom tool we’ve used before.
The technology is already here. Our students are already using it. The question isn’t whether to allow AI in education – it’s whether we’ll teach them to use it wisely before they develop habits that undermine both their learning and their digital safety.
We can continue teaching as if AI doesn’t exist while our students navigate this landscape alone, or we can step up and provide the guidance they desperately need. But we need to act now – every day we delay is another day our students develop AI habits without our input.
