On October 16th 2017, the cancer research journal Oncogene issued an editorial on the topic of research integrity:
“The importance of being earnest in post-publication review: scientific fraud and the scourges of anonymity and excuses”.
The editorial contains a list of 8 common excuses dishonest authors use to escape responsibility for manipulated data. It was authored by David Sanders, virologist and professor at the Department of Biological Sciences of Purdue University in West Lafayette, Indiana, USA, together with Justin Stebbing, professor of cancer medicine and oncology at Imperial College London, UK, who is also one of the two Editors-in-Chief (EiC) of the journal Oncogene. Sanders is one of those rare brave academics who are unafraid to call out scientific misconduct while their peers hide in the bushes, or even point fingers at whistleblowers like him. As the newspaper USA Today wrote earlier this year, Sanders made himself a very powerful enemy: the Italian-born US star cancer researcher Carlo Croce:
“But that didn’t stop Sanders from alleging that Dr. Carlo Croce, a prominent cancer researcher at Ohio State University, falsified data or plagiarized text in more than two dozen articles Croce has authored. For the past two-plus years, Sanders has contacted scientific journals in which the articles appeared to alert them of his concerns. Earlier this month, he went more public with his claims in an investigative piece by the New York Times that delved into years of ethics charges against Croce.
“There are, and I anticipate there will be additional, consequences for my career,” Sanders said Tuesday afternoon while sitting in his office inside the Hockmeyer Hall of Structural Biology at Purdue.
This isn’t the first time Sanders has publicly accused a scientist of bad behavior. In 2012, Sanders had an article by a former colleague retracted on the basis that the colleague used their deceased former research partner’s data in the paper without permission”.
A long article, “Years of Ethics Charges, but Star Cancer Researcher Gets a Pass“, had appeared in The New York Times prior to that, detailing the case of Carlo Croce, the role of the whistleblower Sanders, and that of Ohio State University, which mostly covered up the affair. Croce hit back: he is now suing the newspaper and, in a separate lawsuit at a New York court, also Sanders, as reported by Retraction Watch.
Hence, Sanders knows first-hand what research misconduct is and how to act upon it. Indeed, the editorial was his idea, and his co-author Stebbing joined afterwards. As Sanders wrote to me:
“The impulse for the editorial and the list was from me. We discussed the inclusion of particular items and how they were described together”.
Stebbing, however, is not much of a whistleblower; quite the opposite, he can in fact be seen as the target of one. His own publication was heavily criticised on PubPeer for suspected western blot band duplications. And the piquant bit is: Stebbing, together with his first author Georgios Giamas (now Reader in Biochemistry at the University of Sussex, UK), offered on PubPeer explanations which sound very much like those he himself ridiculed in the Sanders & Stebbing editorial in his own journal Oncogene.
[2.11.2017, 18:40: a section referring to a newspaper article dealing with clinical events was removed upon email request from Justin Stebbing. He also announced to me that he would address the PubPeer concerns and provide original gel images in the comment section below]
This was the above-mentioned paper:
Georgios Giamas, Leandro Castellano, Qin Feng, Uwe Knippschild, Jimmy Jacob, Ross S. Thomas, R. Charles Coombes, Carolyn L. Smith, Long R. Jiao, Justin Stebbing
CK1delta modulates the transcriptional activity of ERalpha via AIB1 in an estrogen-dependent manner and regulates ERalpha-AIB1 interactions
Nucleic Acids Research (2009) doi: 10.1093/nar/gkp136
Co-author Uwe Knippschild initially responded to the criticisms:
“I also would like to thank you very much for your critical comments. I will contact the corresponding authors. Thereafter, I will reply as soon as possible”.
There was no follow-up from Knippschild’s side about this paper, but Giamas and Stebbing did engage in a debate with their critics.
Now follows the list of the authors’ lame excuses from Sanders & Stebbing, Oncogene 2017 (copyright Springer Nature), together with the justifications offered by Giamas and Stebbing on PubPeer regarding their own paper in Nucleic Acids Research (NAR). The comparison is illustrated with PubPeer evidence.
i. ‘Nothing to see here. Move along.’ Even though the evidence of image duplication or plagiarism is in many cases overwhelming, some authors refuse to admit that there was any problem with their article.
Giamas and Stebbing maintained on PubPeer that there is indeed no reason to suspect duplications:
“Amongst the thousands gels / blots / immunofluorescence and other experiments that I (GG) have personally performed the last ~17 years (as I am sure it has happened for other scientists) there were occasions that certain results/data were indeed ‘strangely similar’ (i.e. protein bands running in an identical way, 2 different cells looking alike under the microscope, etc)”.
ii. ‘My dog ate the data.’ Certainly having the original data would help resolve the issues and clearly this excuse has greater validity as more time passes. But sometimes the image manipulation/plagiarism is so evident, that the lack of the original data cannot be an exonerating circumstance.
Imperial College mandates: “Primary data is the property of Imperial College and should remain in the laboratory where it was generated for as long as reference needs to be made to it and for no less than ten years“. Giamas and Stebbing suggested, however, on PubPeer that the 8-year-old data might be unavailable:
“Unfortunately, as you can imagine, due to the fact that these data are quite old, it will be very difficult (if not impossible) to trace the original blots and recall who exactly was involved in the execution of the specific experiments”.
iii. ‘If you look hard enough, you can find a trivial difference between two supposedly duplicated images.’ First, the standard should be how likely is it that two images could be so similar and yet have distinct origins. Artifacts that can introduce small differences can occur during image processing. Also, different exposures of the same data can produce apparent image differences; again the standard should be about the probability of similarity.
According to Stebbing, similarly shaped bands can indeed be produced by chance when the same blotting apparatus is used (in fact, their German colleague Roland Lill recently suggested the same, also on PubPeer). As Giamas and Stebbing explained on PubPeer:
“In this case, whether this was due to a technical issue (for example, the quality of the SDS-PRECAST-GEL used, or whether during the semi-dry blot transfer something went wrong, or something else…), as I mentioned before it is difficult to be 100% certain as it is impossible to recall how the exact blot(s) were executed ~8 years ago. Interestingly, we recently had an incident using a semi-dry blot device, where there was a problem with the upper stainless-steel plate (surface/electrodes) of the equipment. As a result, ALL our membranes were coming up with the same ‘background’ signal (noise) that affected the proper visualisation/analysis of the proteins run in different lanes (there was actually a sort of an identical spotted (marked) pattern in all of the protein bands making them look ‘strangely similar’ (whether this was due to something that was previously stuck in the steel surface, or a problem with the specific part affecting the current flow, heat generation etc…”.
iv. ‘It was only a control experiment.’ How many scientists have not had an unexpected result in a ‘control’ experiment that actually led to some insight? If control experiments were unimportant, why were they included in the article in the first place? Connected to this sophistry is: ‘The data duplication does not affect the results.’ The said error may not affect the main conclusions of the research but all data presented should be considered results. Moreover, identified errors, especially if they occur more than once in a single paper or in several papers by the same author(s) undermine the trust of the Editors in any results presented by the author(s). See the Lady Bracknell quotation.
v. ‘It was the fault of a junior researcher.’ This could very well be true. It is sad when the research of a laboratory group is undermined by one unscrupulous person. However it remains to be asked, how did such obvious image duplications escape the attention of the other co-authors? To qualify as an author of a paper one must have approved the final version. If research misconduct was not identified then this does not reflect well on the integrity of, and care and attention paid by the co-authors.
vi. ‘The responsible researcher is from another country and therefore unfamiliar with the standards expected in scientific publications.’ First, of course, this argument is highly insulting to the many researchers from other countries who do not engage in such activities. Second, if a laboratory director is concerned about the understanding of standards by researchers in one’s group from other countries, then one is responsible for inculcating the proper values into those researchers and displaying an extra level of scrutiny of their products. Again, see the Lady Bracknell quotation.
A reply Giamas left on PubPeer regarding his joint paper with Stebbing in Oncogene (Stebbing et al 2015) covered all three points iv–vi above. It assigned the blame for a mislabelled loading control to a junior co-author from China:
“Regarding the ‘actin’ labelling in the KSR1 paper, our previous postdoc student working on this project (Dr Hua Zhang) has confirmed that this is a mislabelled blot. Indeed this actin blot is representative of the T47D cell line (and NOT the MDA231). However, the relative MDA231 blot looks relative similar, meaning that equal and therefore comparable protein amounts have been loaded and therefore the interpretations/conclusions are the same (as most actin blots should look like following proper bradford quantification). Apparently, during the preparation of the supl. figure, he accidentally used the wrong actin blot and we apologise for this”.
vii. ‘The results have been replicated by ourselves or others, so the image manipulation is irrelevant.’ Data are included in an article for a reason. Science is based upon a certain level of trust, but it is not all-encompassing. If the data do not represent the experiments described, then that trust has been violated, and no rationalization about final outcomes should affect judgment about the culpability of the authors.
Indeed, Giamas and Stebbing opened their defence on PubPeer by declaring that all results had been reproduced, which would presumably render any eventual data manipulations utterly irrelevant:
“First of all, let me confirm that we have repeated these experiments (as we do for every single one) at least 3x times; therefore, the results/conclusions presented in this manuscript are valid and reproducible”.
viii. ‘Someone is out to get me.’ Perhaps true but irrelevant. By implying that if not for the fact that one was being targeted, the behavior would be considered acceptable, one traduces the entire scientific community. Such practices are neither common nor worthy of toleration.
Possibly in a similar vein, Giamas commented on PubPeer:
“I feel honoured (in a way) that you spent your time, going through all my publications to date. Thank you for pointing out things that potentially require further clarification/corrections.
I can assure you that we will carefully look into these ASAP and proceed with any corringedum / erratum requested by the respective journal.
More importantly, I want to re-emphasise the accuracy and scientific integrity of our published data/results, using the specific protocols, reagents, etc employed and referenced in each case”.
The Sanders & Stebbing editorial contains this bit; one wonders if Stebbing put it in with his own papers in mind:
“Some accusations are clearly false, but it is the responsibility of the journal to investigate all allegations made. A few of the excuses listed above may occasionally be valid in some context”.
Certainly he will ruthlessly investigate the issues in Stebbing et al 2015, being the Editor-in-Chief of the journal Oncogene where it was published. As for his paper in Nucleic Acids Research: that journal showed little evidence of any interest in doing anything at all about a similarly problematic case from France. “Nothing to see here. Move along.”
Update 3.11.2017. I exchanged several emails with Justin Stebbing, in which he indicated that he was inclined to share the original gel scans. In one email, he forwarded to me this communication he had received from an editor of Nucleic Acids Research:
“I have finally taken the time to review your response. Thank you for taking the time to produce such a detailed report and for repeating the experiments. We are satisfied with your response and the evidence provided and agree that the figures have not been unethically altered. We now consider this case closed”.