University Post
University of Copenhagen
Independent of management

Student life

AI has exploded among students: University of Copenhagen set to change the rules

Artificial intelligence — Three out of four students use generative AI in their studies. But they need more instruction in the new technologies, according to a new survey from the lawyers' and economists' trade union Djøf. UCPH is trying hard to keep up with the trends and will change the rules starting next year.

The use of generative AI has exploded among students:

In a new survey from the trade union Djøf, more than three out of four students (77 per cent) respond that they use AI in connection with their studies. This is a significant increase over last year’s survey, where only 54 per cent of the students answered the question in the affirmative.

75 per cent of students state at the same time that they do not think they are receiving adequate training in the use of artificial intelligence during their studies.

»As a student, I am not surprised that there has been such a large increase in the use of AI. And from our student members in Djøf we hear that they are afraid of falling behind,« says Anders Vile Michaelsen, who is chairman of the student section of the Danish Association of Lawyers and Economists (Djøf). He is himself a student of political science in Odense. He continues:

»No matter how good you are at your academic skills, AI technologies are constantly evolving, and many students are therefore nervous about what will happen if they do not learn to master the technologies.«

Many students are concerned that they are currently part of a study programme that will be made obsolete by AI in 10-15 years’ time.

Anders Vile Michaelsen, chairman of the Djøf student section

The Djøf study is based on responses from 2,227 students, 641 of whom are University of Copenhagen (UCPH) students. The UCPH students are more or less in line with the others in their responses:

70 per cent of UCPH students respond that they use AI in their study programmes in general, while 67 per cent say they use AI for their exams. A whopping 81 per cent of UCPH students respond that they do not receive sufficient support in learning how to use new AI technologies.

»This is relevant for the debate about whether it is universities’ responsibility to teach students AI. In Djøf we believe it is the universities’ task to prepare the students for the labour market they are going to enter,« says Anders Vile Michaelsen and continues:

»Employers are increasingly calling for graduates with skills in the use of AI. Many students are concerned about whether they are currently part of a study programme that will be made obsolete by AI in 10-15 years’ time. It is therefore important that universities follow the trend.«

Impossible to spot cheating

Rie Snekkerup is Deputy Director for Education at UCPH and member of the University of Copenhagen’s council on education strategy (KUUR).

If you ask her, the University is in line with current trends. UCPH has decided to reverse its current policy on the use of AI at exams from the autumn semester of 2025:

The rule has so far been that the use of AI is prohibited at exams unless explicitly stated. But from next year, the rule will be that the use of AI is always permitted — unless explicitly stated.

READ ALSO: The future is now: UCPH softens up on AI rules

»This spring we could see that we simply had to reverse our policy. AI is everywhere, and Microsoft has even incorporated it into their standard package. So now we actually have the technology in-house and can make it available to students,« says Rie Snekkerup, adding:

»So we’ve decided that we need to start working on it. And now we’ve given ourselves a year of implementation time to figure out how we can integrate this in practice in teaching and exams. Both in terms of law and of GDPR, but also so that we meet the educational requirements in the different academic settings.«

READ ALSO: New study: Students need more teaching in artificial intelligence

Some may say that we are hopelessly behind. But we need to implement it properly

Rie Snekkerup, Deputy Director for Education & Students

There have been a handful of cases of cheating using AI at exams over the past year, according to Rie Snekkerup, and she acknowledges that it is extremely difficult to check for.

»The technologies are getting better and better. And it is getting harder and harder for us to tell whether an assignment is written by a student or a machine. In the cases where it has been discovered, it has typically been because sources have been used that do not exist in reality,« says the deputy director.

After the summer holidays, UCPH appointed a working group to create a framework for the use of AI at UCPH next year. And this work is moving forward, according to Rie Snekkerup.

»Some may say that we are hopelessly behind. But we need to implement it properly and think of both teaching and exams at the same time. We have to rethink a lot of things, and this takes time,« she says.

Working on clear guidelines

Ruth Horak is senior consultant at the Education and Students unit at UCPH and is part of the working group for the implementation of AI at UCPH. Within the coming months, the group will come up with a number of inputs to qualify and set up a framework for the use of AI in both teaching and exams.

READ ALSO: Is AI a good study buddy? We asked students

»We’re currently working hard, because the different academic communities are really pushing us for guidance. They are planning the curriculum for 2025 right now, and the use of AI needs to be incorporated into all course descriptions and exam regulations,« says Ruth Horak. She adds that the working group is inspired by other universities, both nationally and internationally.

So far, the group has made a number of recommendations for good academic practice and AI as well as a navigation model for teachers who need to decide whether and how to integrate AI in their teaching. And from autumn 2025, course descriptions will contain a box indicating whether the use of AI is allowed on the course or not.

I recently met a teacher from the study programme in Danish, who said that the entrance of AI was professionally distressing.

Ruth Horak, senior consultant, Education & Students

»We set up the overall framework, and from there it’s up to the different academic environments to assess how it makes sense for them to incorporate it. Some may also choose to continue prohibiting it if they believe that it undermines the very core of the learning objectives,« says Ruth Horak. She explains that some teachers are already busy integrating the new technologies, while others are more hesitant.

»I recently met a teacher from the study programme in Danish, who said that the entrance of AI was professionally distressing in that it is beginning to take over some of the things that have previously been at the core of this professionalism,« says Ruth Horak.

In the working group, the members regularly discuss how to safeguard academic standards on the degree programmes when students can suddenly use AI for a number of tasks that they previously had to do themselves.

»Right now, it’s almost impossible to determine whether AI has been used for an assignment. So in the coming year we will focus on how we can ensure academic integrity and good academic practice,« says Ruth Horak and adds:

»Our starting point is that the vast majority of students do not want to cheat at exams. But right now there can be doubts about what the rules are. And that is why it is so important that we make rules that are completely clear.«
