Six of every 10 articles published in a sample of “predatory” journals attracted not a single citation over a 5-year period, according to a new study. Like many open-access journals, predatory journals charge authors to publish, but they offer little or no peer review or other quality controls and often use aggressive marketing tactics. The new study found that the few articles in predatory journals that did receive citations drew them at a much lower rate than papers in conventional, peer-reviewed journals.
The authors say the finding allays concerns that low-quality or misleading studies published in these journals are getting undue attention. “There is little harm done if nobody reads and, in particular, makes use of such results,” write Bo-Christer Björk of the Hanken School of Economics in Finland and colleagues in a preprint posted 21 December 2019 on arXiv.
But Rick Anderson, an associate dean at the University of Utah who oversees collections in the university’s main library, says the finding that 40% of the predatory journal articles drew at least one citation “strikes me as pretty alarming.”
The number of predatory journals has ballooned in recent years, raising alarm among researchers. Previous studies found that their authors are predominantly in Africa and Asia and that some turn to predatory journals for speed and ease of publication, or to satisfy institutional requirements to publish. But some observers fear these journals enable a proliferation of mediocre or flawed research. Others worry authors may use them to spread misinformation—for example, about climate change or vaccine safety—knowing that reputable refereed journals would not accept it.
Predatory journals are also seen as giving a bad name to journals that publish articles on an open-access (immediately free to read) basis—a format that many scholars would like to see become more common.
Björk and his colleagues randomly selected 250 journals from a list of 10,000 predatory titles compiled by the information service Cabell’s International. The team then selected one paper published in 2014 from each journal. Most papers were in the natural and social sciences. For comparison, the researchers randomly selected 250 articles from 2014 from Elsevier’s Scopus database, which includes science journals that meet quality standards.
Scopus and other widely used citation databases don’t index most predatory journals, so the team used Google Scholar to count citations. Fully 60% of the articles in predatory journals attracted no citations, compared with just 9% of those in the peer-reviewed journals. On average, the predatory articles drew about two citations each, compared with 18 for the traditional papers. Of the predatory articles, 13% attracted just one citation, and only 3% got more than 10. (Because of limited staff resources, the team did not count how many of those citations came from an article’s own authors, a practice some scholars use to inflate their citation counts.)
“It’s great to see this,” says Kelly Cobey, a researcher in clinical epidemiology at the Ottawa Hospital Research Institute. But she notes that variations in citation rates by field might affect the comparisons, and that Cabell’s list may not capture all predatory journals. (Cobey and 34 colleagues from multiple institutions last month proposed a definition of these journals that researchers could apply without relying on a list.)
Björk doesn’t think those factors would change the results much. But he and his colleagues say other studies should examine how often predatory journal articles are referenced outside the scholarly literature, for example on social media. Their study found that among the 17 most cited articles in its sample, none was cited in a Wikipedia entry.
Even if no one reads them, predatory journals are a problem, Cobey says. “I’d argue that the average member of the public doesn’t want their tax dollars going to pay for article-processing charges for predatory journals that may not be read,” she says.