How to spot when your learners use AI tools and what to do about it in your classroom

Since the advent of AI tools like ChatGPT, students at school and university have adopted them quickly. A survey in May 2025 found that 35% of UK students use AI for school learning (bit.ly/42HR2xq). Among almost 6000 students aged 10–16 across the UK and five other European countries, around a fifth used AI to pass exams and 16% used it to write essays.
AI use among university students has risen even faster. A Higher Education Policy Institute (HEPI) report in February 2025 showed that 92% of students use AI in some form, up from 66% in 2024 (bit.ly/4pGROF6). Almost 90% of 1041 undergraduates surveyed used AI for assessments, up from 53%. Students said AI helped them explain concepts, summarise articles and generate ideas, and 18% included AI-generated text directly in their work. Nearly half had already used AI during school.
Ken Hyland from the University of East Anglia’s School of Education and Lifelong Learning acknowledges the anxiety AI tools cause teachers. ‘The fear is that ChatGPT and other AI writing tools potentially facilitate cheating and may weaken core literacy and critical thinking skills.’
Prose without a personal touch
In response to these concerns, Ken’s team set out to investigate how well AI can mimic student essays (bit.ly/4nACyrX). They compared 145 essays written by university students with 145 generated by ChatGPT.
‘We were particularly interested in looking at engagement markers like questions and personal commentary,’ Ken explains. ‘We found that the essays written by real students consistently featured a rich array of engagement strategies, making them more interactive and persuasive. They were full of rhetorical questions, personal asides and direct appeals to the reader – all techniques that enhance clarity, connection and produce a strong argument.’
AI essays, by contrast, remained coherent and grammatically correct but impersonal. ‘The AI essays mimicked academic writing conventions, but they [the tools] were unable to inject text with a personal touch or to demonstrate a clear stance. They tended to avoid questions and limited personal commentary. Overall, they were less engaging, less persuasive and there was no strong perspective on a topic. This reflects the nature of the AI’s training data and statistical learning methods, which prioritise coherence over conversational nuance.’
‘We’re not just teaching them how to write, we’re teaching them how to think – and that’s something no algorithm can replicate’
Despite these shortcomings, the researchers argue against banning AI. They recommend treating it as a teaching aid rather than a shortcut.
‘When students come to school, college or university, we’re not just teaching them how to write, we’re teaching them how to think – and that’s something no algorithm can replicate,’ Ken adds.
To spot AI use, watch out for:
- unusually sophisticated language and stock phrases such as ‘it is important to note’
- a lack of personal voice, variation or colloquial phrasing
- factual errors, misrepresented data or outdated information
- repeated words or ideas
- consistently perfect grammar and punctuation.
Niall Begley, head of junior science and chemistry teacher at St Patrick’s College, Knock, found the survey results ‘eye-opening’, especially the number of students using AI as a shortcut rather than a support. ‘While I understand the anxieties many teachers have about its use, my own experience engaging with AI directly – testing out tools and exploring the possibilities – helps reduce that anxiety.’
He says the findings have important implications for secondary teachers. ‘They remind us that AI is here to stay, so rather than resisting it we should guide students in responsible use. This includes setting tasks that go beyond what AI can easily generate, such as practical investigations, critical analysis and personal reflection. AI can be a valuable classroom tool, but it should never replace the role of teachers in developing students’ independent thinking and creativity.’
Calculating AI’s worth
Researchers at the University of Illinois Urbana-Champaign (UIUC) took a different approach. They tested how well the free version of ChatGPT compared with undergraduates taking an aerospace engineering module (bit.ly/48rqswf). ChatGPT scored highly on structured maths questions, earning an A. On open-ended questions, it achieved 62%, bringing its overall mark down to 82% – a low B. The class average for the students was 84.85%.
‘Like calculators in math classes, ChatGPT is a tool that’s here to stay and that students will use,’ says UIUC’s Melkior Ornik. Educators, he argues, must adapt. ‘I plan to consider how I design my courses so that I include more higher-level questions, perhaps including project-based assignments. By adding more open-ended questions, students will also reach a higher level of critical thinking and truly learn the material.’
Teachers need to understand why students are drawn to AI – lack of self-confidence or skills, for example – and show them how to use it as a tool
ChatGPT also made mistakes. It sometimes introduced jargon not covered in the course, such as ‘quasi periodic oscillations’ and produced incorrect statements, even though the researchers provided it with the course material. Melkior notes, however, that premium versions of ChatGPT may handle more complex analytical problems.
Sue Ali, a chemistry teacher at Ashville College in Harrogate, sees value in AI but questions whether teachers have enough training to help students use it safely and with integrity. She believes teachers need to understand why students are drawn to AI – lack of self-confidence or skills, for example – and show them how to use it as a tool. ‘For example, using AI to create different examples of exam questions [and answers] for students to mark using a scheme. This shows them AI may not always be correct. For assessments and homework, teachers need to consider how to reduce cheating. This would mean more work to update schemes of work and teaching resources. I wonder how this would impact online assessments in schools, and IT teams will need to ensure there are detection programs to stop AI use during assessments. Schools need to share these findings with students, to help them see AI as the tool and not the easier option.’
Test the student, not the AI:
- Set tasks requiring critical thinking, analysis and evidence-based arguments.
- Use practical activities such as plotting and interpreting data.
- Ask students to apply concepts to new situations or reflect on their own experiment results.
- Incorporate oral exams, presentations and group projects.
With AI becoming more powerful and students more skilled in its use, what can teachers do? The HEPI report concludes that punishing students makes little sense when AI use is inevitable and often helpful. Instead, schools and universities should develop clear policies, provide teacher training and guide students on responsible use.
Nora Richardson, chemistry lecturer at Cambria College in Wales, agrees the teacher’s role must adapt. ‘Training is key so we can give students clear guidance on responsible AI use at a pivotal stage in their education. I feel the challenge of authentication and assessment is greater in qualifications with high coursework content. Specific AI guidance from awarding organisations can help shape our practice and policies.’
Maria Burke is a freelance science and business journalist







