Academic Publishing Interview

Interview with JBC research integrity manager Kaoru Sakabe

This is my interview with Kaoru Sakabe, research integrity manager at the Journal of Biological Chemistry. Will the tough stance on science fraud be abandoned, now that the publisher partnered with Elsevier?

The Journal of Biological Chemistry (JBC), published by the American Society for Biochemistry and Molecular Biology (ASBMB), is known for its no-nonsense attitude to data manipulation. Papers reported to the journal's editors for falsified images often get retracted, regardless of how big, important and well-connected the authors are, of how protective their universities and research institutions act, and of how tolerant other publishers have shown themselves.

The editor in charge of data integrity at JBC and ASBMB is Kaoru Sakabe. Below is my interview with her about the journal's editorial policies and how scientist authors react to being caught with manipulated data.

Even lawsuits don’t seem to scare ASBMB. The publisher is presently being sued by the Pittsburgh pulmonologist Raju Reddy, who wants to overturn a JBC retraction he himself signed following an institutional misconduct investigation. ASBMB could easily get out of this and save huge sums of money by retracting the retraction, just like another publisher, the Royal Society, stealthily did in a different case, when Jonathan Pruitt‘s lawyer came knocking.

But ASBMB and JBC are tough and not backing down. The publisher has now partnered with Elsevier and switched to Open Access, but all editorial decisions, including those on retractions, remain with ASBMB and its editorial offices. Whether Elsevier’s legal department will nevertheless try to meddle, to avoid cheaters like Reddy suing them, remains to be seen. Elsevier, for sure, only retracts papers when the editor-in-chief, the academic authors’ own institutions, or the authors themselves request it. And sometimes not even then.

This is why the only retraction that Frederique Vidal, the French minister for Research and Innovation and former rector of the University of Nice, could not prevent happened at JBC. The Italian neuroscientist Elisabetta Ciani, defended by her University of Bologna and by Italian charities, only had to retract papers at JBC and nowhere else. The Australian cancer researcher Levon Khachigian, for whom The Lancet is prepared to bite every critic’s head off, earned first one, then three more JBC retractions. That is because sometimes JBC editors retract papers en masse, having screened the works of the same suspect author in their journal.

This happened to the Weizmann Institute cancer researcher Rony Seger, who lost nine papers in one go; his colleague Yehiel Zick followed with three retractions. Apparently, Weizmann was taken by surprise. The most famous case is probably that of the Spanish cancer and ageing researcher Carlos Lopez-Otin, whose 8 papers were retracted by JBC against opposition from all of Spain. The retractions prompted Spanish science elites and even the local government to threaten violent retaliation against ASBMB, but just before the Spanish Armada was ready to sail off and cannon-ball the publisher’s offices in Rockville, Maryland, chaos ensued when Lopez-Otin first lost a Nature Cell Biology paper, then his Nature mentoring award, and eventually his mind, having started to preach spiritual stuff and to flirt with Opus Dei. St Carlos of Oviedo is now tasked by the Spanish government with curing COVID-19; this is not a joke.

The current research integrity policies were established at JBC by the former chief editor Fred Guengerich, professor at Vanderbilt University in Nashville, and the current Editor-in-Chief Lila Gierasch, professor at the University of Massachusetts Amherst, both in the USA.

But now to the interview with Kaoru Sakabe (KS); my interview questions are marked with LS.


Kaoru Sakabe:

“I hope that these authors realize that we are going through these steps to ensure that the research we publish is reproducible and transparent and not because we want to annoy them”


LS: JBC has a very tough stance on data integrity, not just on submitted manuscripts, but also on published ones, even decades-old ones. Could you tell us how this policy evolved, who the actors were, and whether it was a slow gradual process or an overnight one? Is it an ASBMB policy which applies to other journals, or a specific policy of JBC?

KS: At JBC, manuscript issues are handled by the Deputy Editor, Fred Guengerich, and if available, the Associate Editor who originally handled the manuscript. Together, they make a decision on the appropriate corrective action. Dr. Guengerich is involved in all decisions related to images and because of his involvement, I believe that we’ve handled these manuscript problems fairly and consistently. 

Dr. Guengerich first became involved when he stepped in as interim Editor-in-Chief in 2015, when ASBMB was searching for a new Editor-in-Chief (EIC) for JBC. At the time, we were only reviewing complaints brought to us by readers and reviewers, and he was surprised by the types of issues we were seeing. At some point, he requested that we screen images in accepted manuscripts to have an idea of how widespread an issue this was.

In our initial (very small) screen, we found that 10% of our accepted manuscripts had figures that did not adhere to our editorial policies regarding image manipulation, with 1% serious enough to decline.

At that point, Lila Gierasch had joined as EIC. Both Drs. Gierasch and Guengerich knew that image screening was an important step that the journal needed to do to protect the integrity of the research it published; however, one of the big obstacles we had was technology. The manuscript submission system we had at the time was not designed to implement an image screening step before acceptance. After the decision was made to implement image screening, we were planning our transition to eJournalPress, a system which could accommodate this step, so we decided to roll out image screening then. 

It was a very long road to get where we are, but ultimately, I think it has benefited the journal. When we initially started image screening at JBC, we were declining about 4% of our manuscripts that had been tentatively accepted at JBC (there’s more on that process below). This statistic is much lower now at about 1-2%. Also within this time frame, I believe that JBC has made great strides to improve transparency and reproducibility in publishing. At the beginning of my tenure at ASBMB in October 2014, JBC was not providing details on why an article was withdrawn or retracted. Dr. Guengerich realized that this opaqueness was not beneficial for JBC’s readers and he actively advocated for this policy to change. We’ve also developed educational materials for authors. 

While I do not make any editorial decisions on any manuscript, I do advise the Editors of all three journals, JBC, MCP, and JLR, to ensure that each journal follows our policies consistently.

LS: How do you do your work? What tools and technology do you use? Any specific AI or other software? Is every submitted paper screened for data irregularities, and at which stage? Before acceptance? Is only image data being scrutinised, or also numerical data, e.g. Excel files? Presumably also text for plagiarism?

KS: JBC started image screening in 2017. I have a team of three analysts. They do the heavy lifting, as they are the ones primarily screening our papers before acceptance. One has a background in print production, another in animation, and one is a PhD-trained scientist (http://www.jbc.org/content/294/12/4723).

Basically, once the Associate Editor makes the decision to accept the paper, the authors are notified that their manuscript is going to go through an image analysis check.

When we were setting up the process, we consulted other publishers who had already implemented image screening and decided that screening before acceptance would be the preferred method. The analysts aren’t just looking for manipulation/duplications… they’re also making sure that the figures really are publication quality. You’d be surprised at how many figures are submitted to us at 72 or 96 ppi or suffer from imaging artifacts. We’ve seen all sorts of assembly methods and we try to send them down the right path so that future figure files are publication ready. We do not screen all submitted manuscripts as this would be too time intensive and would lengthen the peer review process.

Our main method of evaluating images is Photoshop. We’ve tried a few other software/plugins that are out there and have found that while they may be good at detecting manipulation, they’re not so great at finding duplications. Right now, we don’t have the bandwidth for screening anything except for images. 
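The duplication problem Sakabe describes is essentially a nearest-neighbour search over perceptual fingerprints of image panels: crops, recompression, or brightness shifts should not hide a reuse. As a hedged illustration of that idea only (JBC's actual workflow, per the above, is manual inspection in Photoshop, not this), here is a minimal average-hash comparison in Python, written over plain 2D pixel lists so it needs no imaging libraries:

```python
def average_hash(pixels, size=8):
    """Perceptual hash of a grayscale image given as a 2D list of ints (0-255).

    The image is downsampled to size x size by block averaging; each cell
    then contributes one bit: 1 if it is brighter than the overall mean.
    """
    h, w = len(pixels), len(pixels[0])
    cells = []
    for i in range(size):
        for j in range(size):
            # Average the block of source pixels mapped to cell (i, j).
            rows = range(i * h // size, max((i + 1) * h // size, i * h // size + 1))
            cols = range(j * w // size, max((j + 1) * w // size, j * w // size + 1))
            block = [pixels[r][c] for r in rows for c in cols]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if c > mean else 0 for c in cells]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def likely_duplicate(img_a, img_b, threshold=10):
    """Flag two panels as likely duplicates if their 64-bit hashes are close."""
    return hamming(average_hash(img_a), average_hash(img_b)) <= threshold
```

Because each bit only records whether a cell is above the image's own mean, uniform brightness or contrast shifts leave the hash largely unchanged, which is why a low Hamming distance is a reasonable duplicate signal; the threshold of 10 is an assumed, illustrative value.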

Certain manuscripts are run through plagiarism checks using iThenticate.
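Text-similarity checkers of this kind generally rest on shingling: a document is reduced to a set of overlapping word n-grams, and two documents are scored by set overlap. A toy sketch of that core idea in Python (iThenticate's real algorithm and reference corpus are proprietary, so this is only illustrative):

```python
def shingles(text, n=3):
    """Return the set of word n-grams ('shingles') in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(text_a, text_b, n=3):
    """Jaccard similarity of the two texts' shingle sets, from 0.0 to 1.0."""
    a, b = shingles(text_a, n), shingles(text_b, n)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)
```

Identical passages score 1.0 and unrelated ones near 0.0; production systems scale the same idea up with hashing and a very large corpus.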

LS: Where do the hints mostly come from? By email from whistleblowers like Clare Francis? Do you follow PubPeer, or even subscribe to their service for journals? Or do you actively screen past publications?

KS: Concerned readers, some anonymous, bring problematic publications to our attention. While we do think it’s important to correct the literature, we are not actively screening past publications. In fact, we are facing forward and thinking of ways we can help authors (both trainees and mentors) so they don’t have to go through the experience of a retracted or declined (for image issues) manuscript. We have developed an Author’s Resource page (http://jbcresources.asbmb.org/) and we’ve written a series on figure preparation (http://www.asbmb.org/asbmbtoday/collections/DueDiligence/). We also have a few videos on our Youtube channel (https://www.youtube.com/user/TheASBMB/videos). We have a few other projects that are on-going and I hope that we can provide more information on that soon. 

LS: Did you notice a change in submission quality, i.e., how did the policy affect the data integrity of manuscripts submitted to JBC? How would you compare the incidence of data manipulation in submitted papers at the beginning of new policy and now?

KS: I don’t think I can accurately assess that question, but I can tell you that the number of papers we’re declining for image related issues is going down.

When we initially started image screening, we were declining about 4% of our accepted papers (i.e., manuscripts that have gone through peer review and are conditionally accepted, contingent upon an image analysis check). That number is now down to 1-2%.

LS: Did the policy affect the overall number of submissions, or their perceived scientific quality? Would it be correct to assume that JBC lost some authors you don’t really miss and gained those whom you welcome? How about academic editors and peer reviewers, did the policy have an effect there? Did anyone jump ship or, the opposite, was keen to join?

KS: I’d have to say that the image screening process was enthusiastically endorsed by all of the editorial leadership at JBC. They all felt it was something that absolutely needed to be done. Since we’ve started image screening, I think the response has been mainly positive from our authors. Many are thankful that we are checking their papers because we’ve noticed a few errors in their figures or text like a typo or mislabeled panels. You’d think that the authors whose papers are declined would be angry with us, but we’ve found for the most part, the exact opposite.

The authors who were not involved with the manipulations are grateful that these issues did not make it to publication. We’ve had some corresponding authors come back to us later to tell us that they have changed policies within their lab to hopefully curtail this type of issue from happening again.

That isn’t to say that we haven’t annoyed authors with our requests. We are asking authors to include markers and scale bars as well as note all gel splices. We’ve had to convince some authors that there were indeed gel splices in their figures and admonish them for not recording information such as MW markers on their blots. I hope that these authors realize that we are going through these steps to ensure that the research we publish is reproducible and transparent and not because we want to annoy them. 

LS: JBC is unafraid to retract fraudulent papers, even en masse by the same authors. How do authors try to “defend” themselves? Do you receive legal threats? Even letters from lawyers? Would you be able to share some of the most outrageous excuses, explanations or threats you received?

KS: I cannot comment on the first part of your question.

Most of the explanations are the standard ones: even though the images were manipulated, they do not affect the results or conclusions of the work. Another standard explanation is that the results are completely reproducible.

LS: What is your overall impression of collaboration with universities and research institutions in those cases where retractions are decided by JBC alone, in the absence of any institutional requests or official misconduct findings? Also, when you find problems in a submitted manuscript, does JBC inform the corresponding author’s university? Does JBC blacklist authors who were caught with manipulated data? If yes, how does the blacklisting work in practice?

KS: The only comment I can make is that we follow our policies as stated here: https://www.asbmb.org/journals-news/editorial-policies#ethics. Ultimately, the Editors are concerned about the integrity of the research that is published within the journals and that is usually the guiding factor in their decisions.

The following outcomes may occur in papers found to contain violations of ASBMB editorial policies:

  • submitted manuscripts may be rejected;
  • authors may be asked to correct or withdraw an article after publication;
  • the publisher may retract the article;
  • sanctions may be imposed on the author(s);
  • and/or the matter may be referred to institutional officials and/or funding bodies.

LS: How do other society publishers feel about the JBC policy? Do some consider implementing similar stance (maybe you can even name examples)? Or do you feel they’d rather go in exactly opposite direction, to have things easier? And what about commercial publishers?

KS: I have met two other people in similar positions to myself at two other societies (APS and ASM). We try to meet up every few months or so to exchange ideas. I think the main goal from our three societies is to ensure that the published literature is correct. I cannot comment on the policies of other societies or commercial publishers. 

I know that the big news with regards to ASBMB these days is that we’ve partnered with Elsevier in order to make our journals open access. I can say that one of the major points that was important for ASBMB to retain was the ability to make all editorial decisions for their journals, and that includes how we handle our problematic papers.

LS: Do you think the “quality” of forgeries is improving? Can you keep up with new data manipulation technologies? In the 2000s people mostly faked gel images; what do they manipulate now? What do you think are presently the weakest points in the life sciences, and which kinds of data are more like a black box that even you cannot look into? Can mandatory sharing of raw data help there, and does JBC have any policies on that?

KS: I think Jana Christopher, Elisabeth Bik, and others have done a good job in exposing some of the “higher end” forgeries that are out there, where the backgrounds of an immunoblot are the same but the bands are different. With regards to the types of data that are manipulated, I can only really comment on images.

I’m sure that other types of data are manipulated, but I in no way have the expertise to catch these. While I do think that mandatory sharing of raw data could help, authors are also known to provide manipulated raw data.

I think a major issue is that trainees are not adequately trained in data retention standards. They’ve only saved an image that has been adjusted, whether appropriately or not, and not the original images or the native imaging files if they’re using an imaging system. When we’ve requested raw data, some authors can only locate these images that have been adjusted.

ASBMB does have some policies regarding the deposition of raw data. These policies principally affect structural data and ‘omics-type data. We have been in active discussions regarding the mandatory sharing of other types of data, and I hope to provide an update soon.

LS: Dear Kaoru, many thanks to you and ASBMB for your interview.



