Feature: A transition in publishing
Editorial: All that glitters is not gold
It’s quality that should count when assessing researchers, not their number of publications or impact factor. But this ideal is still far removed from reality, as Joint Editor-in-Chief Judith Hochstrasser explains.
The science publication system has its questionable aspects. The more articles a researcher can publish in recognised journals, the greater their reputation. But do they actually have the time and space to engage with a project to the point at which they can finally say: “Now I can publish”? Or ought they to publish earlier, because if they’re not publishing, they’re not visible?
This system can also lead to absurdities. The more often an article in a specialist journal is cited by colleagues, the more its author’s reputation rises. But if just one colleague refers to that article once, they have already laid a trail for other researchers, who will see this reference and copy it. This is far more efficient for them than reading all the publications related to their own research topic themselves and citing the relevant passages. This ‘Matthew effect’ also shapes citation counts: more new citations grow out of earlier citations than out of researchers actually reading and citing the original article.
These problems are well known. They are what lie behind the so-called DORA principles, which should be observed when making appointments at universities. These state that priority should be given not to those who have published in major journals such as ‘Nature’ or ‘Science’, but to those who have carried out convincing research. It is an ideal to which everyone should aspire, though it still seems far removed from what actually happens. All the same, there are currently attempts to implement DORA, such as at the SNSF, which is testing a new CV format in biology and medicine. But as Ambrogio Fasoli from EPFL admits in our Focus, many professors remain keen on the impact factor, and it is all but impossible to control how they recruit their own research teams. And Rachel Grange of ETH Zurich confesses: “I always say: quality counts, but unfortunately so does quantity”.
So there is indeed an absurd publication bubble, and a gap between what is ideal and what is actually practised. It’s time for Horizons to take a closer look! And it’s high time that decision-makers in academia stop talking and start acting!