
Firm that tallies controversial journal impact scores moves to provide more context

Stung by years of criticism that its journal impact factors have distorted scholarly publishing, the private firm Clarivate Analytics, based in Philadelphia, Pennsylvania, this week rolled out an updated version of its Journal Citation Reports database that it says provides context useful for understanding journals’ characteristics and audiences.

Impact factors, which divide the citations a journal’s articles receive in a given year by the number of articles the journal published over the preceding 2 years, are widely used in academia as a yardstick of a journal’s prestige and reach. But the metric has plenty of critics. Complaints include worries that editors can too easily boost their journal’s ranking through a variety of strategies, and that impact factors are misleading because a few highly cited papers can drive much of a journal’s overall score.
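For concreteness, the arithmetic behind the metric can be written out; the figures below are invented purely for illustration:

```latex
\mathrm{JIF}_{2017}
  = \frac{\text{citations received in 2017 to items published in 2015 and 2016}}
         {\text{citable items published in 2015 and 2016}}
  = \frac{12{,}000}{1{,}500}
  = 8.0
```

Only “citable items,” chiefly research and review articles, count in the denominator; that bookkeeping detail is one of the levers critics say editors can pull to nudge the score upward.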

Although Clarivate, which has offices in the United States, the United Kingdom, Japan, and China, continues to publish journal impact factors in its Journal Citation Reports (JCR) database, the latest version, released 26 June, contains supplementary information that addresses some of this criticism. Most prominently, the page showing a journal’s impact factor now includes a distribution curve displaying the total number of articles and other items published in a journal versus the number of times each item was cited. The median number of citations for all of the journal’s research articles and review articles is also marked on the curve.

Including these graphs alongside impact factors is “clearly a step in the right direction,” says Stephen Curry, a structural biologist at Imperial College London. Because impact factors measure the average citation performance of papers in a journal, they tend to be driven up by a small number of highly cited papers and reveal nothing about the spread of citations across all of a journal’s papers. The distribution graphs, in contrast, give researchers a much better sense of how often individual papers in a journal are actually cited than any single number can. (In a 2016 preprint, Curry and colleagues, including then–Editor-in-Chief of Science Marcia McNutt, called on journals to publish these kinds of distribution graphs in an effort to increase transparency.)
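A small numerical sketch shows why the mean and the median can tell very different stories about the same journal; the citation counts below are invented for illustration:

```python
import statistics

# Hypothetical citation counts for 10 papers in one journal:
# most are cited a handful of times, two are blockbusters.
citations = [0, 1, 1, 2, 2, 3, 3, 4, 60, 120]

mean = statistics.mean(citations)      # the average, which the impact factor reflects
median = statistics.median(citations)  # the midpoint, which the new JCR curve marks

print(f"mean = {mean}, median = {median}")
# mean = 19.6, median = 2.5
```

The two blockbuster papers pull the mean to 19.6 even though a typical paper in this invented journal is cited only two or three times; that gap is exactly what the new distribution curves make visible.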

Clarivate’s update also includes a number of other changes that together are meant to “give you a much more nuanced picture of what that journal contributes” to scholarly communication than the impact factor alone can convey, Marie McVeigh, product director of JCR, told Science Insider.

Users can drill down into the underlying data to see, for example, the titles of the most highly cited items and, in a separate list, the citations and articles that went into the calculation of the journal’s impact factor.

The dashboard also displays summary information characterizing a journal’s citations by type of article. This allows users to see, for example, what proportion came from research articles versus review articles. Another chart shows how the journal’s impact factor has fluctuated over recent years.

There is also summary information about the journal’s authors—tables group them by country and institution. That should interest other authors looking to pitch their manuscripts to journals that serve diverse international constituencies, McVeigh said. In all, the database tracks 11,655 journals.

Overall, McVeigh said, Clarivate is “trying to pull this back from an obsessive use of the JIF … and support and allow this more contextualized use of this number that we’ve been producing now for 44 years. It’s been rather a shame to see so much rich, valuable data be thrown away just to look at that one number. Well, let’s make this data-rich and valuable and visible.”

The new tools provide researchers with better insight into individual journals, says John Tregoning, an immunologist at Imperial College London who recently wrote an opinion piece in Nature on the benefits and drawbacks of impact factors. This information could be useful, for example, when deciding where to submit a paper. But he cautions that the impact factors and related metrics published by Clarivate remain measures of the journals themselves, and should not be used by grant agencies or employers to make judgments about the people who publish in those journals. “This cannot be used as a proxy for quality of … individual people or individual work,” he says. “There is no … single number that says person A is better than person B. It has to be about a judgment on the quality of their published science.”

Curry says he would now like to see journals publish the citation distribution graphs made available by Clarivate. After his 2016 preprint appeared, a handful of journals began to do so—including Nature and the Proceedings of the National Academy of Sciences—but not as many as he’d have liked. “Hopefully this will give a big boost to that,” Curry says. “If it’s provided to you ready-made, why wouldn’t you want to be transparent about citation performance?”