22-year-old David* is doing very well at university. His assignment marks are consistently “in the high 60s or low 70s”, meaning he’s currently on track to receive a merit overall. In his most recent assignment, he received an impressively high mark of 81 – “even though I blatantly used AI,” he says.
David, a postgraduate student at the University of Leeds, regularly uses AI bots such as Grok and Gemini to help him with his coursework. Sometimes he uses them to help him with planning and structuring his essays. “Once, I uploaded the assessment brief and relevant lectures, then had [Gemini] make me an essay structure which linked the question back to the brief and lectures,” he says.
He has also used AI bots to reword swathes of his work. “I would write out a paragraph and then ask the bot to make me sound more coherent and add any other relevant points in.” David would then tweak the bot’s response to make the result sound passably human. He has no qualms about using AI in this way. “It’s just way more effective time-wise,” he says. “I can do more research and make better quality work in my view.”
Interest in AI’s capabilities has exploded since OpenAI launched ChatGPT in 2022. Fast forward to today, and people routinely use the bot for everything from drafting break-up texts and whipping up dinner recipes to helping with university work: according to a 2024 survey by the Higher Education Policy Institute, more than half of students now use generative AI to help with their assessments. In light of this, questions surrounding AI’s place in higher education are becoming more urgent: should students get on board with using chatbots to get ahead and work more efficiently? Or is using generative AI eroding critical thinking skills in the long run?
While David sees no issue with using AI to help him complete his university work, 21-year-old Yazzy, a law student at Bangor University, is more sceptical. “I think that the increasing use of generative AI to complete uni work is quite scary,” she says, explaining that she feels as though it creates “inequity” among students. “It’s my personal choice not to use AI, because I’d like to get the most out of my course,” she adds. “But I know for a fact there are a few people on my course who use it.”
Still, Yazzy stresses she has “never been tempted” to use generative AI to help with her coursework. “If you’re going to do an expensive and time-consuming course, you may as well gain as much knowledge and experience about your chosen discipline [as you can], because otherwise, what’s the point?” she says. She adds that she believes the job market will soon become crowded with graduates sorely lacking in skills. “Being able to use AI isn’t a difficult skill. Anyone can do it. But managing a workload, producing work to a high standard, being able to research – these are difficult skills that will not be developed if you depend on AI all the time.”
The science appears to be on Yazzy’s side, with new research suggesting that using ChatGPT can damage critical thinking abilities. As part of the study, researchers at MIT divided 54 subjects (aged between 18 and 39) into three groups and asked them to write several SAT essays using ChatGPT, Google’s search engine, and nothing at all, respectively. They found that the group which used ChatGPT demonstrated the lowest brain engagement and “consistently underperformed at neural, linguistic and behavioural levels.”
Dr Nataliya Kosmyna, the paper’s lead author, tells Dazed that longitudinal studies are needed before drawing any sweeping conclusions about the long-term impact of ChatGPT on our brains. But she says it’s already clear that bots like ChatGPT can hamper the development of critical thinking, memory and language skills. “Writing is often associated with critical thinking, among other skills. So if that is being outsourced, you won’t retain much of the skills [learnt from writing],” she explains. And while using ChatGPT is perhaps more ‘efficient’ – in that it enables students to work less – she stresses that students who over-rely on generative AI will not learn or retain any meaningful skills. “It’s a waste of time. Why go to college?”
To some extent, the horse has already bolted. Many universities, instead of banning students from using generative AI outright, have implemented policies designed to help students use AI “appropriately”; notably, in 2023 all 24 Russell Group universities drew up and agreed to guidelines designed “to capitalise on the opportunities of AI while simultaneously protecting academic rigour and integrity in higher education.” At the University of Leeds, where David is a student, generative AI can be used to “help you learn”, but cannot be used to “generate or falsify work”.
David has never been particularly worried about being caught – “I feel the way I did it was quite safe” – but not everyone has been so lucky. A Guardian investigation published in June found that over 7,000 students were caught cheating using ChatGPT and other AI tools in the 2023-2024 academic year, and figures up to May 2025 suggest the number is set to rise again in 2024-2025. Disturbingly, some students have even reported being falsely accused of using AI, for reasons as arbitrary as using em dashes or phrases like “in contrast”, or simply because Turnitin erroneously flagged their essay as containing AI-generated content.
But it’s ultimately little wonder that students are turning to AI to be more ‘efficient’ when higher education has become increasingly commodified in recent years. “[F]or many, a degree feels like a route to a career rather than an opportunity to learn [...] before we lament a situation in which thousands of students waste their time and opportunities by plagiarising rather than actually learning, we might want to ask how we got into this position in the first place,” Poppy Noor wrote in the Guardian in 2017, pointing to the government’s introduction (and subsequent raising) of tuition fees as a catalyst for this new culture where education has become more of a product than a public good.
Though Noor was writing nearly a decade ago, higher education still faces similar problems, now exacerbated by the rise of AI. With this in mind, perhaps the proliferation of ChatGPT-generated essays is less a symptom of students becoming lazier and more a signifier of just how cheapened university degrees have become.
*Name has been changed