Opinion
AI belongs in classrooms

AI in education has transformative potential for students, teachers and schools, but only if we harness it in the right way – by keeping people at the heart of the technology, says Jill Duffy.
When you think about AI and education, the first thing that comes to mind is probably students using ChatGPT to write their essays and coursework. But, important as this issue is, the debate about AI in education should go way beyond it.
As head of an exam board (OCR), I am well aware of how serious this issue is. Determining whether a piece of work was AI-generated was not part of the job description for educators a decade ago, and I'm sure not many appreciate this new addition to their workload.
ChatGPT writing essays may be the most noticeable phenomenon right now, but it is far from the only way that this technology will transform how we teach and assess young people. Crucially, AI offers opportunities as well as threats. But only if we harness it in the right way – by keeping people at the heart of education.
What does that mean in practice? Let’s look again at the concerns over AI and coursework. As I’ve previously argued, we cannot put generative AI back in its box. Demanding that students never use it in any capacity is obviously not enforceable, and I would also argue is not desirable: the proper use of this technology will be a vital skill in their working lives.
In future, instead of asking students "did you use AI?", teachers will be asking them "how did you use AI?" It's about accepting where this technology can help students – finessing arguments, helping with research – while protecting the human skills they will still need – fact-checking, rewriting, thinking analytically.
The same human-centric approach is needed when it comes to teaching and AI. We can’t afford to ignore the obvious benefits of this technology, but we cannot embrace it blindly at the cost of real, human teaching. At OCR we are looking into various tools that could help teachers who are struggling with ever-increasing workloads. This could be about helping them with lesson planning, or searching through subject specifications or guidance materials.
So, we don’t expect AI to replace the very human skills of intelligently questioning a student to guide their learning, or safeguarding their wellbeing, or passing on a passion for their subject. Instead, AI can take care of some of the time-consuming admin, giving teachers more time to actually teach.
This human-centred approach guides everything we are doing at Cambridge and OCR. We have been developing digital exams for the past few years, for Cambridge's international exams and for OCR's popular Computer Science GCSE. What we are not doing here is simply transferring the paper exam onto a screen. We have been testing and monitoring how students perform in these on-screen exams, using mocks and trials, to make sure there is no advantage or disadvantage to a particular method.
Achieving its potential
But keeping humans at the heart of education while getting the most out of new technology will take more than the efforts of one exam board.
As OCR recently warned in its report Striking the Balance, there is a risk that the move towards digital exacerbates existing inequalities in the system. If digital learning can be more effective, what happens to schools that can’t afford the required technology?
A national strategy is required – involving the government, regulators, and other stakeholders – to ensure every school can benefit from the transformative potential of this technology.
Jill Duffy leads OCR and is managing director for UK Education at Cambridge University Press and Assessment.
Published: 4 April 2025
The text in this work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License
