Over 20% of researchers have been pressured by journal editors to modify their articles in ways that manipulate the reputation of the journal.
Journals are ranked by the citation rates of the articles they publish. Editors can manipulate their journal's ranking by asking authors to add citations to other articles published in that very journal.
An editor of Leukemia wrote to an author whose work was about to be accepted. “You cite Leukemia once in 42 references. Consequently, we kindly ask you to add [more] references of articles published in Leukemia to your present article.”
These data recently appeared in Science, where Allen W. Wilhite and Eric A. Fong dubbed the phenomenon "coercive citation" in their article Coercive Citation in Academic Publishing.
While 80% of researchers say that coercive citation reduces the prestige of a journal in their eyes, 60% nonetheless admit that they would add citations to articles in such a journal before submitting their own article to it.
This practice can be stopped by changing how we calculate a journal's impact factor. An impact factor reflects the average citation rate of a journal's recent articles; a high impact factor signals that the journal is important in its field.
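For concreteness, the standard two-year impact factor is, roughly, a ratio:

\[
\mathrm{IF}_J(y) = \frac{C_J(y)}{N_J(y-1) + N_J(y-2)}
\]

where \(C_J(y)\) is the number of citations received in year \(y\) by the articles journal \(J\) published in the two preceding years, and \(N_J(t)\) is the number of citable items \(J\) published in year \(t\).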
When we determine impact factors, we should simply exclude citations appearing in the journal at hand. If the impact factor of Leukemia were computed without counting citations made in Leukemia's own reference lists, nothing could be gained from coercive citation.
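Here is a minimal sketch of what that calculation could look like. The data structures, field names, and numbers are invented for illustration; a real computation would run over a citation database such as Web of Science.

```python
from dataclasses import dataclass

@dataclass
class Citation:
    citing_journal: str   # journal the citing article appeared in
    cited_journal: str    # journal the cited article appeared in
    cited_pub_year: int   # publication year of the cited article
    citing_year: int      # year the citation was made

def impact_factor(journal: str, year: int, citations: list[Citation],
                  citable_items: int, exclude_self: bool = False) -> float:
    """Two-year impact factor: citations in `year` to articles the journal
    published in the previous two years, divided by the number of citable
    items it published in those years. With exclude_self=True, citations
    coming from the journal's own articles are ignored."""
    window = {year - 1, year - 2}
    count = sum(
        1 for c in citations
        if c.cited_journal == journal
        and c.citing_year == year
        and c.cited_pub_year in window
        and not (exclude_self and c.citing_journal == journal)
    )
    return count / citable_items

# Hypothetical example: one self-citation, one external citation.
cites = [
    Citation("Leukemia", "Leukemia", 2010, 2011),  # self-citation
    Citation("Blood", "Leukemia", 2009, 2011),     # external citation
]
print(impact_factor("Leukemia", 2011, cites, citable_items=2))                     # 1.0
print(impact_factor("Leukemia", 2011, cites, citable_items=2, exclude_self=True))  # 0.5
```

With `exclude_self=True`, coerced self-citations simply stop counting, so an editor gains nothing by demanding them.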
Would this give a skewed picture of the relative importance of journals? It's true that curiosity-driven research leads to specialization so narrow that only a few journals would be interested in any particular article. As a result, new findings in some sub-sub-sub-field (which is exactly where researchers work) have very few potential outlets. But this is true for nearly everyone and almost all journals, so it shouldn't lead to unreasonably skewed citation indices.
Another possible fix is advocated by John G. Lynch, also in Science. Lynch organized several editors of leading journals in his field to write a joint letter to 600 deans, identifying the practice of coercive citation and its potential damage to the field. These editors encouraged deans to evaluate the quality of their faculty members’ papers based on the articles themselves rather than the impact factor of the journals in which they appear.
And, indeed, Lynch is right that evaluation and funding cultures provide the context for coercive citation. When promotions are based on publication in journals with high impact factors, journal editors are motivated to get the best impact factor they can, because that lets them attract the best articles from up-and-coming researchers. There's an incentive to game the system.
When governments connect funding for universities to the number of publications in different tiers of journals — as the Norwegian government does — the lure of corruption is introduced.
Universities carry out basic research that takes many years. Elected officials operate on shorter cycles; politicians want to give money to research and then see results during their relatively short period in office. The legitimate priorities of universities and politicians are therefore at times in conflict.
Attempts to resolve the conflict — primarily about how long it takes to get results — give rise to systems based on metrics, on counting. And systems based on counting can be gamed.
The game we learned about from Wilhite and Fong — the game of coercive citation — can be fixed. Doing so will strengthen our confidence in the system.
That way, when we have good results, we can try to publish them in the best possible journal, confident that the quality of the journal reflects the quality of the research others have published there, and not just the vastness of their reference lists.
For more recent writing on gaming scholarly publication, check out the DrugMonkey blog at Scientopia, a recent post at The Scholarly Kitchen, and S. Scott Graham's blog entry on citation coercion.
See also:
Opacity in scientific publication: do journals discriminate?
Breakthrough knowledge: research, education and universities
1 Comment
Very interesting article. No joke, I received the following email a few months ago from the editorial office of Leukemia one day before receiving the rejection letter:
Dear Dr. Freud,
Our publisher has warned us that at the opposite of other Editors we rank very low in self-citations. Of course, it is better than ranking too high in self-citations but a good average would satisfy us well.

Having my attention drawn to this problem, I indeed noticed that, for example, you have 0 Leukemia citations in your paper. It can be understandable for a journal which is not well-known, not well-documented that no citations can be found but it is difficult to accept that probably due to absentmindedness you never looked for Leukemia in your references. For example, we are well known in the myeloproliferative diseases and we have even done a long Spotlight on the topic. The same for MDS and yet none or very few citations can be found in your paper.

Rather, as I may say, you do like the cuckoo which drops its egg in the nest of Leukemia and then fly away to better known heavens. All Editors survive thanks to reasonable communication therefore we would like to be, in this case, proactive and that you give us what belongs to Caesar and no more. Otherwise, in a way, this is blocking the development of Leukemia and particularly it blocks our network which we do not deserve due to the extreme care with which this journal is led and the care we invest in it for the benefit of all.
Kind Regards,
Lauren Weinberg
Leukemia Editorial Office
Leukemia – The Journal of Normal and Malignant Hemopoiesis
The Macmillan Building
4 Crinan Street
London
N1 9XW
Tel: +44 (0)20 7843 4870
Fax: +44 (0)1256 321531
leukemia@leukemianature.com
2011 Impact Factor 9.561