When Simon Gosling heard about a competition offering a grand prize of $5000 for ideas to improve peer review in 2012, his experiences as an early-career scientist motivated him to enter. “One of the things I spent a lot of time doing at that stage was writing reviews,” says Gosling, who was an assistant professor in the School of Geography at the University of Nottingham in the United Kingdom at the time. (He has since been promoted to associate professor.) “Having gone through the processes of … trying to improve my CV and make myself attractive to employers, as you do as an early-career scientist, one of the things I found was that I struggled to demonstrate how much time and effort I’d actually put into the peer reviewing process.”
So Gosling leapt at the opportunity to participate in Elsevier’s Peer Review Challenge, proposing a system to track the reviews scientists complete and reward them for their efforts. It would work like this: Reviewers would receive various levels of “Elsevier Badges” that would appear on their online profiles and as certificates and would bring with them “Elsevier Rewards,” consisting of discounts on Elsevier books and other products. “One of the main points was to give a benefit back to the reviewers, in terms of giving them something tangible that they could put on a CV, or mention in a job interview, or put on an application form,” he says. “As an academic you’re expected to review anyway, but I still thought it’s nice to get some recognition for that.” The competition’s judges named Gosling the winner, and Elsevier liked his idea so much that they used his proposal as a jumping-off point to develop the Reviewer Recognition Platform, launched in 2014.
Not all researchers think that tracking peer reviews will lead to more recognition for scientists’ reviewing activities. But the Reviewer Recognition Platform and other products exploring this area at least make peer-review quantity easier to track, and could also begin to get at the question of review quality. Free for academics, these services could turn reviews into meaningful metrics and enable scientists to get credit for the time and effort they spend on this crucial but often underappreciated contribution to the research community.
Making reviews count
Scientists around the world volunteer their time reviewing others’ manuscripts to help editors make publication decisions. This work can be time-consuming and labor-intensive, and reviewers essentially sacrifice time and energy they could otherwise spend on their own research and the standard metrics of success, such as papers and grants. Despite this trade-off, reviews typically aren’t valued in important career-related decisions, such as hiring and promotions. Reviewers may list the journals they have reviewed for on their CV as a signal that their expertise is being recognized, and some journals list their reviewers or offer them discounts, but it’s rare for scientists to get much systematic or public recognition for their reviewing efforts.
That may explain why Gosling’s proposal struck a chord. The resulting Reviewer Recognition Platform allows scientists to track their 5-year review history on a profile page, which they can choose to keep private for their own records or to make public. Regardless of whether the profile is private or public, scientists receive various perks based on the number of reviews they’ve completed, including badges for being a “recognized” or “outstanding reviewer,” review history reports, certificates of recognition, and discounts at Elsevier’s online stores. “You get a tangible benefit for the time you’re putting into it,” Gosling explains. (Gosling was not directly involved in the platform’s development, but was informed about its progress and occasionally asked for his feedback.)
The platform launched in beta in 2014, automatically adding review records for about 40 Elsevier journals, and has been steadily expanding since, reaching more than 800 journals and 400,000 profiles to date. Users can manually add reviews submitted to non-Elsevier journals, with a verification method planned for later. The aim is for the platform to eventually include reviews submitted to all Elsevier journals automatically, and possibly non-Elsevier journals as well.
Publons, on the other hand, has supported reviews from any journal since its launch in 2013. (The startup’s name is a joking reference to the “publon,” or the smallest publishable unit.) Similar to Elsevier’s platform, the site—which currently boasts more than 50,000 users—allows researchers to post online profiles with bios and peer-review histories. Profiles can be kept private or can publicly display various levels of detail about each review, such as the year, publisher name, journal name, article title, or even the full text of the review—subject to journal policy. Users can also write postpublication reviews for any article.
Users receive Publons “merits” for their reviews and are incentivized by a leaderboard, with quarterly awards for the top three reviewers overall and the top reviewers in particular fields and institutions. Rewards have included software licenses; free or discounted access to services such as Mendeley (a hybrid reference manager and social networking site), GitHub (a software code management tool), and Amazon Web Services (a cloud computing service); and Publons t-shirts.
“Publons makes it very easy to keep a record,” user Elisabeth Bik writes in an email to Science Careers. Bik, a research associate at Stanford University in Palo Alto, California, is also a Publons adviser, meaning that she is occasionally asked to give feedback on new features. She previously used a spreadsheet to keep track of the reviews she completed, but she didn’t always have time to keep it updated, she writes. Now, “you just forward the ‘thank you for reviewing this paper’ email from the journal to Publons, and then it gets stored on your account automatically,” she says. Reviews can also be entered manually through the site, or added automatically from several journals that have partnered with Publons.
By tracking her reviews on Publons and comparing notes with co-workers, Bik realized that she wrote more, and much longer, reviews than many others. “I feel more confident knowing that I put my weight in in terms of peer review,” Bik writes. She has considered writing shorter or fewer reviews but hasn’t done so yet. “It's hard to tell if [doing a lot of reviews] might help me with my career” in terms of things like hiring or promotions, she writes, but she cites other benefits. “Doing peer review helps me be a better scientist and writer, so the more I do it, the more I get better at doing research and paper writing myself.”
Not just quantity, but quality
Both Publons and Elsevier’s Reviewer Recognition Platform focus on review quantity. “That’s a good start,” Gosling says, “but I also think actual quality is really important. … Incorporating that will be a bit more challenging.”
One way to begin getting at the question of quality is by considering review length and timeliness, says Lutz Prechelt, a professor of informatics at the Free University of Berlin. “If it was three lines or 300 lines, if it was on time or 5 months late, they’re all worth the same” on the Reviewer Recognition Platform—but not to the journal editors relying on reviewers to help them make their decisions about manuscripts, says Prechelt, who served as an academic adviser to Elsevier’s platform and is working on his own approach to give scientists credit for their review quality.
Enter 4-year-old Peerage of Science, which tries to grade reviewers based on their review quality—although this function is primarily a byproduct of the site’s main goal of streamlining peer review. The site lets scientists submit unpublished manuscripts for anonymous review by Peerage of Science members, or “Peers.” Any scientist with at least one peer-reviewed first-author or corresponding-author publication can register to become a Peer, and any Peer can review submitted manuscripts, as long as the authors are from a different institution and have not co-authored any studies with the reviewers in the past 3 years. Editors registered with the site can track submitted manuscripts and bid to publish them after they have been reviewed. Authors can also export their manuscript’s final evaluations to their journal of choice. The idea is that manuscripts only need to be reviewed once through Peerage of Science rather than individually by multiple journals.
In addition to reviewing manuscripts, Peers have the opportunity to anonymously evaluate the other reviews for any manuscript they have reviewed themselves, giving the reviews a score to indicate their quality. Reviewers can display these scores on a public profile. “This is [a] number you can also add to your CV to show interested people how good you are at reviewing,” says Jan Engler, a Peer who is a Ph.D. student at the Alexander Koenig Zoological Research Museum in Bonn, Germany.
Top reviewers, determined using a combination of reviewer scores and number of reviews completed, receive rewards such as cash prizes. Engler received the 2014 Peerage of Science Annual Reviewer Prize, consisting of €1000 and a medal, for the high quality and quantity of his reviews. As a side benefit, receiving detailed feedback on the quality of their reviews can help researchers improve their reviewing skills. “Being on Peerage of Science definitely increased my review quality,” Engler says.
What’s the impact?
For the moment, it's unclear what impact these new platforms might have on career decisions such as hiring, tenure, and invitations to participate in other career development opportunities. Some users are optimistic that the metrics could help them advance their careers. Engler, for example, lists both his Peerage of Science prize and a link to his Publons profile on his CV. “I think it’s good additional information for potential employers,” he says. And Gerard Ridgway, a research fellow at the University of Oxford in the United Kingdom, suspects that his Publons profile may have played a role in earning him an invitation to join a journal’s advisory board. “By having a Publons profile, editors across all journals in my field could see that I had a lot of relevant experience,” he writes in an email to Science Careers. Ridgway, who is also a Publons adviser, became a Publons top reviewer shortly after he joined the site in May 2014.
Mick Watson, a computational biologist at the University of Edinburgh in the United Kingdom who blogs about scientific publishing, among other topics, agrees that peer review is severely undervalued and that scientists need to be incentivized to review, and to write better reviews. He isn’t sure, however, whether websites that track reviews will do the trick. “Are people ever going to be trawling Publons and say [that] this guy’s submitted 500 high-quality reviews, he’s a great scientist, I’m going to employ him or it’s going to improve his promotion prospects? I just don’t see that happening—which is a shame,” he says.
Nevertheless, Publons user Krzysztof Gorgolewski, a research associate at Stanford University who has blogged about peer review, is enthusiastic about different ways to change the peer-review and academic-publishing processes. He agrees with Watson that, at the moment, “whether you’re an active reviewer or not doesn’t really play into your career,” but he still thinks that Publons and other similar services could have some value. “Publons keeps a record of you being an active member of the community, and you also show that you’re regarded as an expert by your peers,” he says. “I’m not sure how much it would count right now, but it costs very little, so I don’t see why you wouldn’t do it.”