
New budget model will distort research

A new measure for Danish university funding will twist research in the wrong direction, University of Copenhagen staff fear

An algorithm for distributing government subsidies will undermine good science. So say critics, who claim that a new scheme will reward mediocre, repetitive research.

The new model for the distribution of funding to Danish universities is to be rolled out in the period 2010-2012. Controversially, it will include a so-called bibliometric indicator: in this case, a weighted model of publications in graded scientific journals.

Revenues allocated for teaching will count for 45 per cent, 20 per cent will be earmarked for external funding, and ten per cent will be based on the number of graduated PhDs. The bibliometric indicator, measuring the number of scientific publications, will from 2012 count for the remaining 25 per cent.

This means that the number and grade of scientific publications by university researchers will decide how much of the taxpayers’ money will be given to their faculties and departments.
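As a rough illustration of how such a weighted distribution could work, here is a minimal sketch assuming only the four component weights reported above; the pool size, the per-university component shares and the function name `allocate` are hypothetical, not the ministry's actual formula.

```python
# Illustrative sketch only: the article gives the component weights, but the
# actual ministry formula, component scores and pool size are assumptions here.

WEIGHTS = {
    "teaching_revenue": 0.45,   # education activity
    "external_funding": 0.20,   # attracted external grants
    "phd_graduates":    0.10,   # completed PhD degrees
    "bibliometric":     0.25,   # weighted publication points (from 2012)
}

def allocate(pool_dkk: float, shares: dict[str, float]) -> dict[str, float]:
    """Split a basic-funding pool across the four components.

    `shares` holds one university's fraction of the national total for each
    component (hypothetical numbers; the real calculation is more involved).
    """
    return {
        component: pool_dkk * weight * shares[component]
        for component, weight in WEIGHTS.items()
    }

if __name__ == "__main__":
    # A university holding 10 per cent of every national component total
    example = allocate(1_000_000_000, {k: 0.10 for k in WEIGHTS})
    for component, amount in example.items():
        print(f"{component:18s} {amount:>12,.0f} DKK")
```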

One result, several publications

The universities’ interest group Universities Denmark has lobbied for the system, congratulating politicians last year on agreeing to a »simple method to distribute government funding (‘basismidler’, ed.) on objective and already gathered figures.«

Department heads, deans and professors don’t like it.

The new model will change the way researchers publish, fears Professor Ole John Nielsen, who is head of research at the Department of Chemistry.

»Some might feel tempted to split their research results and publish them in several journals, because that will give an immediate economic payoff«, he says in the Department of Chemistry newsletter, adding that in the longer run this will be costly to their credibility.

Some method is necessary

Faculty Dean for Research, Professor John Mundy, shares these fears.

»The raw number of publications can have very little relation to actual quality,« he says in the newsletter, but then adds that it is necessary to have some kind of mechanism to distribute funding.

Quoting the Dean for Social Sciences, Professor Troels Østergaard Sørensen, John Mundy says that »the alternative to a model (such as this) is not ‘no model’, but rather a model which would be very much more sensitive to other and rather more short-sighted political criteria than research quality.«

Alphabet just as good or bad

Bibliometric systems such as the Danish one, which attempt to measure and reward scientific impact by counting publications and grading the journals they appear in, as well as systems that count citations, are given short shrift by statistical experts.

The systems compare apples with pears, Andrew Jackson, a professor of theoretical physics at the Niels Bohr Institute, explains to the University Post.

He has co-authored a statistical paper published in the journal Nature which showed that most measures of scientific quality based on production or citation records are just as random as ranking researchers by the first letter of their surnames.

Bottom 50 per cent barely cited

While quality indices of publication, such as number of citations, fail statistical tests of reliability, quantity measures are just as absurd, he says.

The system puts a large weight on productivity and mediocrity, but cannot discern the ability of researchers. A high number of papers will not lead to quality, even if quality is measured by citations: 50 per cent of all papers have two or fewer citations.

»Overwhelmingly, most papers are of no use at all, and I am not impressed by the lowest 50 per cent of the papers. They are of no real value,« he says.

Funding follows old research

»Institutions have a misguided sense of the fairness of the decisions reached by algorithm. Unable to measure what they want to maximize, namely quality, they will maximize what they can measure«, he quotes from his own paper.

Apart from skewing research towards quantity, bibliometric models that reward past production and blindly follow thick citation trails miss out on genuinely new and important work.

Institutions reward science in already prominent fields for which earmarked public support is available, »but it is difficult in this way to channel funding into emerging fields,« Andrew Jackson explains.

Not all low category journals are bad

Study groups under the Ministry of Science have been hard at work in recent years sorting journals into one of two categories: category one, and category two, whose publications are rated three times more valuable.

See the ministry’s grading of the journals here.

In the Department of Chemistry’s field, the newsletter writes, there are 30 journals in the top category and 456 in the lower one.
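The article states only the three-to-one ratio between the two categories, so the point values below (1 and 3) are assumptions; the sketch merely illustrates the splitting incentive Ole John Nielsen warns about, where several small category one papers can outscore one substantial category two paper.

```python
# Illustrative sketch: the article says only that category two journals are
# rated three times more valuable than category one; the point values 1 and 3
# are assumptions used to show the incentive described in the article.

POINTS = {1: 1.0, 2: 3.0}  # journal category -> points per article (assumed)

def publication_points(articles: list[int]) -> float:
    """Sum points for a list of articles, each given by its journal category."""
    return sum(POINTS[category] for category in articles)

# One substantial paper in a category two journal ...
single_paper = publication_points([2])            # 3.0 points

# ... versus the same results split into four category one papers.
salami_slices = publication_points([1, 1, 1, 1])  # 4.0 points

print(single_paper, salami_slices)
```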

But not all category one journals are bad journals, according to Chemistry department head Mikael Bols. And this will make for a new kind of deliberation when preparing an article for publication.

»There is a large grey zone where category one and two journals are equally fine from a professional standpoint. So it would be rational to have an eye for the economic consequences of one’s choice«, says Mikael Bols.

»But apart from that, I don’t feel we should let our behaviour be changed by the new budget model«.

miy@adm.ku.dk
