Rethinking Gatekeeping in Science: Q&A with eLife’s Timothy Behrens

Trialing ‘post-publication peer review’ in the biomedical and life sciences.

by Miriam Frankel
May 22, 2025
The scientific publishing industry has followed the same process for assessing the quality of research papers for over a century: Academics submit their manuscript to a journal, and the journal’s editors select two or three independent subject experts to review it. These peer reviewers write reports approving the paper for publication, asking for modifications, or rejecting the work outright. Only after the paper has passed this often lengthy peer-review process will it be published. The reviewers therefore hold a huge amount of power, serving as gatekeepers of what should be regarded as good science.

The biomedical and life sciences journal eLife also followed this traditional model for handling manuscripts when it was founded in 2012 by Nobel Laureate physiologist Randy Schekman of the University of California, Berkeley. But in 2023, the not-for-profit open-access journal began experimenting with a radically different approach in which the journal’s editors alone decide whether to publish a paper, based on their assessment of its quality. Once a paper is accepted, the editors send it out for review, with the aim of publishing the preprint, along with the review reports and later revisions, on the eLife site, no matter what the peer reviewers eventually say, as long as the authors are happy to proceed. When the authors are satisfied with their revisions, they can end the review process, creating their version of record. eLife’s editor-in-chief, Timothy Behrens, a neuroscientist at the University of Oxford, UK, talks to Miriam Frankel about his experience of trying to shake up the centuries-old journal system with this ‘post-publication peer-review’ process.

Following its launch in 2012, eLife established itself as a successful and respected journal in the biomedical and life sciences by following the traditional peer-review model. For instance, it gained a high “impact factor” (an independent measure of the average number of citations received annually by articles in a journal). Why change course so dramatically in 2023, when things were working so well?

eLife originally started off with the intention of playing the same game as the rest of the publishers—so using peer review to select papers—but doing it in an open, transparent way. And we were successful, getting 600-700 papers a month. But I used to be a senior editor under the old model, and the one thing I used to hate was that the authors had been through peer review and it had taken them months, sometimes years. And then, as editors, we’re sat there with a bunch of peer-review reports which say the work is ‘basically okay, but it’s a bit boring,’ or ‘they haven’t quite proven exactly what they’re claiming.’ And then you end up rejecting the paper and the author has got to start again. I found that so stressful. I love having got rid of that.

We decided to do a much more transformational thing, which was to get rid of peer review as gatekeeping.
- Timothy Behrens
We decided to do a much more transformational thing, which was to get rid of peer review as gatekeeping. We publish everything that we review, along with the reviews, even if the peer reviewers don’t think that the work is solid or strong. It’s a way of having an open conversation, with a much softer set of selection criteria.

What was the immediate reaction from potential authors? Were they more or less keen to submit? Did you get a sudden influx of low-quality papers?

In terms of submissions, we did not see a big hit when we introduced this model. We were back up to about 600 submissions a month pretty quickly. We did get quite a lot of submissions, mostly from China, that were clearly of low quality. I think they probably thought this might be an easy way to get into a high-impact-factor journal.

Have there been different responses from scientists in different sub-disciplines?

Yes, it is perhaps most obvious in computational fields, which often overlap heavily with biology. Computer science and machine learning rely heavily on preprint servers such as arXiv, and where they touch on biology, people are very comfortable indeed with the eLife model.

The journal’s impact factor is calculated by the British-American analytics company Clarivate. The company announced at the end of last year that it will not be issuing eLife with an impact factor in 2025 because, by definition, the journal cannot guarantee that the papers have passed peer review at the point of publication. How did eLife feel about this decision?

We think it would have been perfectly possible for Clarivate to index us in the same way as before under this new model, and we offered proposals for them to do so. But they said that if we are publishing things which we don’t think are solid, then those things are not validated by peer review and so won’t fit their model. We think this is a mistake. It tacitly assumes that all published science is complete. So when we publish something and say it’s incomplete—which all science is, by the way—then they think that that’s a damning thing.

How have researchers responded to the loss of impact factor?

In the short term, the loss of our impact factor caused a hit in submissions and we are now publishing about two-thirds of the volume we published before. We now get very few low-quality papers from authors who previously might have thought this was an easy route to publishing in a high-impact journal. In some sense, this is one of the good things about losing our impact factor. If anything, the quality of the work that we’ve had since losing our impact factor is higher than it ever has been.

Were there differences in response to the loss of impact factor from scientists based in different countries?

Yes, there was a very big difference geographically. I think this relates to how the impact factor is treated in the academies of different countries. In the UK, for example, our submissions have increased since we lost the impact factor because the UK, in general, is pretty progressive. Almost all UK universities and funding agencies have signed DORA, the San Francisco Declaration on Research Assessment. This is where a funder says they won’t look at the impact factor or the journal name when they’re evaluating science; they’ll look at the science itself.

When we publish something and say it’s incomplete—which all science is, by the way—then Clarivate thinks that that’s a damning thing.
- Timothy Behrens
We saw a drop of 30 or 40 percent from the US, but this seems to be recovering slowly. The really big research-heavy institutions in America are quite like the European ones; MIT, for example, has said it doesn’t really care about impact factors. But there are many universities, often state universities, that don’t feel as qualified to evaluate their own people based on the science, and therefore rely more on journal metrics.

Do you think that the model puts added strain on authors, reviewers, or editors? And do you think that they should be paid, as a result?

It does take a little more time from the reviewers and the editors. They’re being more responsible because it’s all done in public. Are there stressful events for the authors, when people publicly say they disagree with the work? I think it’s super rare, but it probably does happen.

I think the editors and reviewers should be paid. At eLife, we are dealing with one major change at a time, but I think there’s a good chance we’ll trial paying peer reviewers in the future. It might mean having a higher article processing charge (currently US$2,500, paid by authors on publication) or a different funding model.

How is eLife currently funded?

Our article processing charges fund some portion of our operation, but we also have central backing, which comes from our three founding charitable partners: the Max Planck Society, the Howard Hughes Medical Institute, and the Wellcome Trust. Some new funders, such as the Knut and Alice Wallenberg Foundation, have also come on board along the way.

There have been some well-publicized controversies surrounding eLife over the past five years. When the new model was announced, there were serious concerns about the timeline for the transition, with some, including Randy Schekman, threatening to resign. How did eLife eventually navigate that turmoil?

I was deputy editor at the time. It was very complicated. I do think that the editor-in-chief at the time, Mike Eisen, had to be stubborn to get a pure version of it through. I think that a compromise version—with some papers going through traditional peer review while others trialed the new model—would not have been a success. But that wasn’t communicated well to the editors, and so some editors left and made a big scene about it, because they thought that they were losing the old version of eLife without having any say. But let’s temper that and be honest about what happened: There were 600 editors, and about 40 or 50 resigned. I also think that most of those people are back on side. Several of them have rejoined.

eLife had such an amazing following that when we did something wild, people went with us.
- Timothy Behrens
There was also some well-publicized disruption in October 2023, when then-editor-in-chief Michael Eisen was removed from eLife for reposting an Onion headline about Gaza on social media. His removal triggered multiple resignations and a petition, signed by almost 2,000 life scientists, in support of both him and freedom of expression in academia in general. Can you talk about how eLife handled that situation?

It was extremely difficult. Because it was about Gaza, which is such a politically difficult situation, it was very, very, very hard to handle as a journal. There were sets of people with extremely complicated views on both sides. But eLife is not a political organization; it’s a scientific organization and people realize that. Mostly people see eLife as a force for good in science. We have basically recovered.

Do you have any advice for other scientists who want to start journals or innovate peer review?

Just try it and see. Get a community of people who really think you’re doing the right thing. eLife had such an amazing following that when we did something wild, people went with us. If you have good ambitions, like transparency and openness, then people will go with you.