From time to time, scientists submitting their work for publication encounter a request from editors to cite some random earlier papers from the same journal. Why? One reason: it raises the impact factor. In fact, for certain journals it is even an unofficial rule that such journal self-citations are expected, or your paper will be rejected. Some scientists comply in advance, to keep the editors happy. Most others struggle with a practice they find unethical. The German editor of the Springer journal Statistical Papers will explain to you here why this is the scientifically correct and perfectly objective way to run a journal.
A discussion arose on Twitter recently, in the course of which neither the journal nor the editor was named. Mark Hayter, professor of nursing at the University of Hull in the UK and a journal editor himself, tweeted:
“A PhD student of mine had a paper accepted – one condition of acceptance was that she reviews her references and includes any relevant recent papers from the accepting journal”
He then added that the journal was “Not predatory. Well known journal, member of COPE and from a large, international publishing house” and also specified that “They asked her to review her references and include ‘recent, relevant’ papers from the accepting Journal. No specific papers were suggested“. As it turned out, Hayter was not alone with that experience:
More anecdotes arrived, like this one from the area of medicine:
Most of the replies were critical, like this advice from the Hindawi research integrity manager Matt Hodgkinson:
For all we know, the authors might have appreciated the Twitter outrage and then just did what the editor said and cited some random papers from the journal. Why make enemies, instead of making papers? Some scientists even showed understanding for the policy:
Now, Professor Stephen John Senn of the Luxembourg Institute of Health is a statistician, so he will surely agree that the following policy of the journal Statistical Papers is fine as it is. I mean, if your work is written in the form of a paper and it is about statistics, you surely must cite something from this particularly significant journal, what with the name “Statistical Papers“, right?
This was the email a reader forwarded to me, a recently received reply to his rejected manuscript submission:
your paper has some merits. However, given the enormous number of submissions we are receiving recently we have decided to focus on papers which are related to previous work published in our journal. And this does not seem to be the case with your paper since you are not citing articles of Statistical Papers. Moreover, the reference list is not of good quality: sometimes the pages of the journal articles are missing.
Thank you for giving us the opportunity to consider your work.
Christine H. Müller
Editor-in-Chief, Statistical Papers
I contacted the EiC Christine Müller, professor of statistics in engineering at the Technical University of Dortmund (TU Dortmund) in Germany. She replied, confirming the email's authenticity:
“Due to the high amount of submissions, we have to set strict standards, and two of them are the quality of the paper and the relationship to other papers of our journal. If the quality is ok and only Statistical Papers is not cited then we usually ask for a resubmission. However, here the quality, indicated by the reference list, seems to be questionable.”
I was unconvinced that this practice had nothing to do with the Journal Impact Factor (currently at 1.345 for Statistical Papers) and also puzzled how the editors could judge a manuscript solely on the basis of its reference formatting (“page numbers missing”). Müller then clarified:
“we intend to make sure that submissions fit to the journal and a good indicator is usually how well it is connected to previous work in our journal. Note that we generally do not judge that solely by whether another SP-paper is cited or not as you may see from checking our published articles (the self-citation rate of SP is not higher than that of comparable journals and you may be aware that anyway only cites of within 3 years affect the IF).
Of course the quality of a paper is not judged by the reference formatting. However, we have the experience that a sloppy reference list is an indicator of a sloppy written paper. We think that editors of other journals will have the same experience and will make similar conclusions. Hence the remark on the reference section was intended as a service to the author.”
That email was signed by Christine Müller and the two other chief editors: Carsten Jentsch, professor of statistics in economics at the same TU Dortmund, and Werner Müller, professor at the Institute for Applied Statistics at the University of Linz, Austria.
The journal’s authors seem to abide by these unofficial editorial rules. I looked at the three most recently published studies in Statistical Papers (all, incidentally, from China): one references four papers there, another references two, and the third references one paper in the same journal. But is the scope of Statistical Papers really that narrow? This is what the journal website states in this regard:
“Statistical Papers provides a forum for the presentation and critical assessment of statistical methods. In particular, the journal encourages the discussion of methodological foundations as well as potential applications.
This journal stresses statistical methods that have broad applications; however, it does give special attention to statistical methods that are relevant to the economic and social sciences. In addition to original research papers, readers will find survey articles, short notes, reports on statistical software, problem section, and book reviews”
Nowhere is it mentioned that submissions must cite some random past papers in the same journal to fit the scope. The assigned publisher executive from Springer chose not to reply to my emails, and why should they? The editors are doing their best to boost the journal’s citation index.
But for argument’s sake, if Statistical Papers is its own separate research field, surely the Editor-in-Chief will be an expert in the specific science of “Statistical Papers”? Unfortunately, she is not really. A long list of publications is posted by Christine Müller on her TU Dortmund website, from 1984 till now, presumably her entire research output, since not otherwise specified. Yet merely two of Müller’s statistical papers appeared in her journal Statistical Papers, which has been published since 1960 (until 1995, even in German). Her namesake editor colleague Werner Müller also has merely two papers in this journal to show, while Jentsch does not list a single publication in Statistical Papers on his website.
Basically, they are field outsiders in the obscure niche discipline of the Science of Statistical Papers, having barely (or not at all) published there themselves. Or maybe their own journal’s impact factor is too low and needs boosting before Müller, Müller & Jentsch consider it as a venue?
If you have had similar experiences with editors imposing own-journal citation requests, please consider sharing them below in the comment section.
If you are interested in supporting my work, you can leave a small tip of €5 here. Or several small tips, just increase the amount as you like (2x=€10; 5x=€25). Your generous patronage of my journalism, however small it appears to you, will greatly help me with my legal costs.
My article was desk rejected 50 minutes after submission to a Springer journal in 2016. The stated reason was “out of scope”. I did not understand the reason at first, since they have a lot of articles about intrusion detection and machine learning in their journal.
Then I realized that my article had no references to the journal I submitted to. This is actually a very common policy.
This is the submitted article in question.
Well, I was grateful for the quick decision. That article, by the way, has 39 citations on Google Scholar, about 20+ of them from SCI-indexed journals, even though it is a preprint.
Below are emails:
Wed, Apr 13, 2016, 8:40 AM
Dear Mr. Ozgur,
Thank you for approving the changes the Editor made to your submission entitled “A Review of KDD99 Dataset Usage in Intrusion Detection and Machine Learning between 2010 and 2015”.
You will be able to check on the progress of your paper by logging on to Editorial Manager as an author. The URL is http://XXXX.edmgr.com/.
Thank you for submitting your work to this journal.
Wed, Apr 13, 2016, 9:33 AM
Dear Mr. Ozgur:
We have received the reports from our advisors on your manuscript, “A Review of KDD99 Dataset Usage in Intrusion Detection and Machine Learning between 2010 and 2015”.
With regret, I must inform you that, based on the advice received, the Editor-in-Chief has decided that your manuscript cannot be accepted for publication in Journal of XXX.
Attached, please find the reviewer comments for your perusal.
I would like to thank you very much for forwarding your manuscript to us for consideration and wish you every success in finding an alternative place of publication.
Comments for the Author:
The topic of your paper is outside the scope and interest of XXXX.
Honestly, I would be more concerned if they had a long list of publications in their own journal…
So the citations they ask for are completely unspecific, and do not go to their own work.
The editors don’t benefit much from these citations: they will not get more money or much more reputation from a slight difference in IF. They need to find reviewers, and they need to screen and desk-reject a lot of junk, because if they pass too much junk to the reviewers, the reviewers will stop reviewing in the future! It matters much more to the publishers, who then sell the subscriptions, which is why they are so extremely slow at retracting bad contributions. There are many reports of MDPI, for example, pushing editors and reviewers to accept just about anything…
I’ve had reviewers ask for citations to work that was not even published when the manuscript was submitted… obviously, the “expert” reviewer was just a PhD student abusing the process to promote his own publications…
Do you have more information about those MDPI reports? Please contact me by email, using a functioning address.
Only from the media:
“All 10 senior editors of the open-access journal Nutrients resigned last month, alleging that the publisher, the Multidisciplinary Digital Publishing Institute (MDPI), pressured them to accept manuscripts of mediocre quality and importance.”
Perhaps one “solution” would be to cite only retracted papers from the journal – I note that Statistical Papers has had at least one paper retracted…see what that does to their reputation!