Researchers have identified tell-tale signs that students have used AI to help write their essays.
Excessive use of words derived from Latin, unnecessary wording and repeated use of the Oxford comma are among the hallmarks of using a generative chatbot to complete coursework, the researchers found.
But while students taking part in the trial said they found using AI had some advantages, they acknowledged that relying on it completely would likely result in work of a low standard.
The impact of generative AI on education has been exercising educators since OpenAI launched ChatGPT — a chatbot that generates text by predicting which words are likely to follow a particular prompt — in November 2022.
While some regard AI as a potentially transformative technology, creating a more inclusive and personalized education, for others it makes it impossible to trust coursework grades. Even academics have not been immune to using AI to enhance their work.
Now researchers at Cambridge University have tried to see if they could identify characteristics of AI’s writing style that could make it easy to spot.
And although their trial was small scale, they say it has the potential to help teachers work out which students used AI in their essays, and which did not.
Three undergraduates were enlisted to write two essays each with the help of ChatGPT, which were then compared with essays on the same topic written by 164 high school students. The undergraduates were then interviewed about their experience of using AI.
(Undergraduates were included in the study because ChatGPT requires users to be 18 or over.)
The ChatGPT essays performed better on average, being marked particularly highly for ‘information’ and ‘reflection’. They did poorly, however, for ‘analysis’ and ‘comparison’ — differences the researchers suggest reflect the chatbot’s strengths and weaknesses.
But when it came to style, a number of features made the ChatGPT-assisted versions easily recognizable.
The default AI style “echoes the bland, clipped, and objective style that characterizes much generic journalistic writing found on the internet,” according to the researchers, who identified a number of key features of ChatGPT content:

- a high density of words derived from Latin
- unnecessary or redundant wording
- repeated use of the Oxford comma
Although the students taking part in the trial used ChatGPT to different extents — from copying and pasting whole passages to treating its output as a prompt for further research — there was broad agreement that it was useful for gathering information quickly, and that it could be integrated into essay writing through specific prompts on topics and essay structure, for example.
But the students also agreed that using AI to write the essay would produce work of a low academic standard.
“Despite the small sample size, we are excited about these findings as they have the capacity to inform the work of teachers as well as students,” said Jude Brady of Cambridge University Press and Assessment, lead researcher on the study.
Future work should include larger and more representative sample sizes of students, she said. Learning to use and detect generative AI was an increasingly important part of digital literacy, she added.
“We hope our research might help people to identify when a piece of text has been written by ChatGPT,” she said.