Will universities survive ChatGPT?
It was September 2023 when lawyer Rhys Palmer received the first phone call from an alarmed university student who had been accused of using an artificial-intelligence chatbot to cheat on his assignments. It was the first of many such calls.
OpenAI's ChatGPT had launched less than a year earlier, and students were already using it to summarise scientific articles and books. But also to write essays.
"From that first call, I immediately thought this would be a big issue," Palmer says from his office at Robertsons Solicitors in Cardiff, where he specialises in education law. "For that particular student, the university had itself used artificial intelligence to detect the plagiarism. I saw the possible flaws in that process and predicted that a wave of students would face similar issues."
As the use of chatbots becomes ever more widespread, what began as a trickle of problems has turned into a profound challenge for universities. If large numbers of students are using chatbots to write, research, program and think for them, what is the purpose of traditional education?
Since that first call in 2023, Palmer has developed a specialism in helping students accused of using AI to cheat in coursework or online exams. He says most students have been cleared after presenting evidence such as essay drafts, preparation notes and previous work.
"'Cheating with AI' is ... an issue in itself," Palmer says. "Often it is parents who call on behalf of their children. They frequently feel that their children have not received clear guidance or training on how AI may or may not be used."
In other cases, Palmer has helped students who admit to using AI to avoid punishment, arguing that the university's AI policies were unclear, or that they had mental health problems such as depression or anxiety.
"They come saying, 'I made a mistake,'" he says. "In such cases, we obtain a letter from the family doctor or a report from an expert confirming that their judgment was impaired."
'It was really well formulated'
Some students report that ChatGPT is now the most common program open on laptop screens in university libraries. For many of them, it is already an essential part of everyday life.
Gaspard Rouffin, 19, a third-year history and German student at Oxford University, uses it every day for everything from finding book recommendations to summarising long articles to decide whether they are worth reading in full. For his language modules, however, its use is far more contentious.
"I had a lecturer in my second year, in a German translation class, who noticed that many of the translations had been generated by AI, so she refused to mark that week's translations and told us never to do it again," he says.
Other lecturers have been less vigilant. Another third-year student at Oxford remembers a tutorial in which a coursemate read out an essay that she felt had clearly been produced by AI. "I realised immediately," she says. "There was something in the syntax, the way it was constructed and the way she was reading it."
The lecturer's reaction? "She said, 'Wow, that was a truly excellent piece, it was very well formulated and I liked its accuracy.' I just sat there thinking, 'How do you not realise this is the product of ChatGPT?' I think it shows a lack of awareness of the issue."
A survey by student accommodation company Yugo found that 43 per cent of UK students use AI to check their academic work, 33 per cent use it to help structure essays and 31 per cent to simplify information. Only 2 per cent of the 2,255 students surveyed said they use it to cheat on assignments.
Not everyone has a positive attitude towards the software, however. Claudia, 20, who studies health, environment and societies, sometimes feels disadvantaged. "Sometimes I feel frustrated," she says, "for example in modern language modules, when I know for certain that I wrote everything from scratch and worked hard on it, and then hear about others who have simply used ChatGPT and ended up with much higher grades."
Students are also wary of the consequences if the chatbot lets them down. "I'm afraid of making a mistake and plagiarising," says Eva, 20, who studies health and environment at University College London. Instead, she feeds her revision notes into ChatGPT and asks it to quiz her to test her knowledge.
"Of course it's a little annoying when you hear, 'Ah, I got this grade,'" she says. "And you think, 'But you used ChatGPT to get it.' [But] if others want to use it and end up knowing nothing about the subject, that's their problem."
How should universities respond?
Universities, somewhat belatedly, are trying to draw up new codes of conduct and to explain how AI may be used depending on the course, module or type of assessment.
Approaches vary considerably. Many universities allow AI for research purposes or for spelling and grammar assistance, while others ban it outright. Punishments for breaking the rules range from written warnings to expulsion.
"Universities are in the position of someone trying to close the barn door after the horse has bolted," says an associate lecturer at a Russell Group university. "Our university is only reacting to AI by setting policies, such as which assessments do not allow its use."
There are some telltale signs of machine-generated writing that the lecturer looks for: "If I see an overly polished piece of work or a lot of adjectives, I start to have doubts."
Some students are certainly getting caught. Data obtained by Times Higher Education show that cases of AI-related misconduct at Russell Group universities are rising as its use becomes more common. At the University of Sheffield, for example, there were 92 suspected cases of AI misconduct in the 2023-24 academic year, with 79 students penalised, compared with just six suspected cases and six penalties in 2022-23.
But Palmer says many universities have become heavily dependent on AI-detection software such as Turnitin, which compares students' work with billions of web pages to detect possible plagiarism. Turnitin produces a "similarity score", the percentage of the text that matches other sources. The company itself says the results should not be used in isolation.
"The burden of proof falls on the university to decide whether, on the balance of probabilities, the student has cheated, but that burden is quite low compared with a criminal case," Palmer says.
Medical and biomedical students seem to be the most vulnerable to accusations, according to Palmer, because they often reproduce memorised technical definitions and medical terminology word for word.
Andrew Stanford's case illustrates how complex the problem can be. In 2023, Stanford, 57, was accused by the University of Bath of using AI to cheat in his first-year exams. He had paid £15,000 to enrol in an MSc in applied economics (banking and financial markets) from his home in Thailand. Nine months into the degree, however, he was accused of using AI to formulate his exam answers. It is not clear how the university reached this conclusion.
Stanford insists that the paragraphs in question were his own work. He ran searches on well-known artificial intelligence applications to see whether any produced wording similar to his, but found nothing, he says. Nevertheless, two months later, in November 2023, he was told he had been found guilty and would have 10 per cent deducted from his grades.
"I was not bothered by the 10 per cent deduction, but I was concerned that I would be left with a record of academic misconduct," says Stanford, who teaches mathematics and economics in Thailand. "It was extremely depressing when I had done nothing wrong. It felt like an unjust verdict."
Stanford, who will complete his two-and-a-half-year master's degree at Bath later this year, took his complaint to the Office of the Independent Adjudicator for Higher Education. This month he was cleared.
'Students will need these skills'
The stakes, for students and universities alike, are high. "Universities are facing an existential crisis," says Sir Anthony Seldon, former vice-chancellor of the University of Buckingham. But if the challenges can be overcome, this could be an opportunity.
"AI could be the best thing that has ever happened to education, but only if we get ahead of the problems it brings. And the biggest problem at the moment is cheating," he says. "Generative AI is outpacing the software used to detect it. It can adapt to the user's style, which means that even a very good teacher will find it hard to spot."
Some academics have proposed a return to invigilated exams and handwritten submissions as a solution. But Seldon does not think the answer is to switch back to exams, because they do not teach students the broader skills they need for life. Instead, he says, there should be more focus on seminars in which students are encouraged to "think critically and collaboratively".
Professor Paul Bradshaw, who teaches data journalism at Birmingham City University, says AI is a "massive problem" for lecturers who have traditionally assessed students on their ability to absorb a text and draw their own conclusions.
However, he believes it is essential that universities teach students how to use AI critically, understanding its benefits and shortcomings, rather than banning it.
"You have one group of students who don't want to know anything about AI at all," he says. "The problem for them is that they will enter a job market where they will need these skills. Then you have another group who use it but don't tell anyone, and who don't really know what they are doing."
"I think we are in a very awkward position where we are having to adapt as we go, and you will see a lot of mistakes from students, lecturers and technology companies alike. AI has the potential either to move education forward or to destroy it."