A new Internet database lets users generate on-the-fly citation statistics of published research papers for free. The tool also gauges journals' impact using a new algorithm similar to PageRank, the algorithm Google uses to rank web pages. The open-access database was developed in collaboration with Elsevier, the giant Amsterdam-based science publisher, and its underlying data come from Scopus, a subscription abstracts database created by Elsevier in 2004.

The SCImago Journal & Country Rank database was launched in December by SCImago, a data-mining and visualization group at the universities of Granada, Extremadura, Carlos III and Alcalá de Henares, all in Spain. It ranks journals and countries using such citation metrics as the popular, if controversial, Hirsch Index. It also includes a new metric: the SCImago Journal Rank (SJR).

The familiar impact factor created by industry leader Thomson Scientific, based in Philadelphia, Pennsylvania, is calculated as the average number of citations received by the papers that each journal has published over a two-year window. The SJR, by contrast, analyses the citation links between journals in a series of iterative cycles, in the same way as the Google PageRank algorithm. This means not all citations are considered equal: those coming from journals with higher SJRs are given more weight. The main difference between the SJR and Google's PageRank is that the SJR counts citations within a window of three years.
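To make the iterative idea concrete, here is a minimal Python sketch of a PageRank-style ranking computed from a journal-to-journal citation matrix. It illustrates the general eigenvector-centrality approach that the SJR and PageRank share, not SCImago's published formula; the damping factor, the stopping rule and the handling of journals with no outgoing citations are all simplifying assumptions.

```python
import numpy as np

def pagerank_style_rank(citations, damping=0.85, tol=1e-9, max_iter=100):
    """Rank journals by iterating over a journal-to-journal citation matrix.

    citations[i][j] = citations from journal i to journal j, counted
    within a fixed window (the SJR uses three years). Because ranks feed
    back into the weights, a citation from a high-ranked journal is
    worth more than one from a low-ranked journal.
    """
    c = np.asarray(citations, dtype=float)
    n = c.shape[0]
    # Normalise each row so a journal divides its influence among the
    # journals it cites. A journal that cites nothing passes on nothing,
    # a simplification that real implementations handle more carefully.
    out = c.sum(axis=1, keepdims=True)
    out[out == 0] = 1.0
    transition = c / out

    rank = np.full(n, 1.0 / n)  # start from a uniform prestige vector
    for _ in range(max_iter):
        new_rank = (1 - damping) / n + damping * rank @ transition
        if np.abs(new_rank - rank).sum() < tol:
            return new_rank
        rank = new_rank
    return rank
```

A plain impact factor, by comparison, would simply total the citations each journal receives and divide by its number of papers, with every citation counting the same.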

It will take time to assess the SJR properly, experts say. Comparing SJR journal rankings directly with those based on impact factors is difficult because each metric is built on a different database. Thomson's Web of Science abstracts database covers around 9,000 journals and Scopus more than 15,000; over the years covered by the SCImago database, 1996 to 2007, Scopus contains 20–45% more records, says Félix de Moya Anegón, a researcher at the SCImago group.

The top journals in SJR rankings by discipline are often broadly similar to those generated by impact factors, but there are also large differences in position. Immunity (SJR of 9.34) scores higher than The Lancet (1.65), for example, but The Lancet's 2006 impact factor of 25.8 is higher than the 18.31 of Immunity. Such differences can be understood in terms of popularity versus prestige, says de Moya Anegón. Popular journals cited frequently by journals of low prestige have high impact factors and low SJRs, whereas prestigious journals may be cited less often, but by more prestigious journals, giving them high SJRs and lower impact factors.
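Running a toy citation matrix through the sketch above shows how the two views can part ways; every number here is invented for illustration.

```python
# Toy example (all numbers invented). Journal A is "popular": eleven
# citations, ten of them from low-ranked C. Journal B is "prestigious":
# three citations, all from high-ranked D.
citations = [
    # to:  A  B  C  D       from:
    [0,  0, 0, 5],        # A
    [0,  0, 0, 5],        # B
    [10, 0, 0, 0],        # C (nobody cites C, so its rank stays low)
    [1,  3, 0, 0],        # D (cited by A and B, so its rank is high)
]
rank = pagerank_style_rank(citations)
raw = [sum(row[j] for row in citations) for j in range(4)]
print("raw citation counts (A, B, C, D):", raw)            # A: 11, B: 3
print("weighted ranks      (A, B, C, D):", rank.round(3))  # B outranks A
```

On raw counts A wins easily, but because its citations come from a journal that is itself barely cited, the weighted rank puts B ahead: the popularity-versus-prestige split in miniature.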

Thomson under fire

The new rankings are welcomed by Carl Bergstrom of the University of Washington in Seattle, who works on a similar citation index, the Eigenfactor, using Thomson data. "It's yet one more confirmation of the importance and timeliness of a new generation of journal ranking systems to take us beyond the impact factor," says Bergstrom, "and another vote in favour of the basic idea of ranking journals using the sorts of Eigenvector centrality methods that Google's PageRank uses."

Thomson has enjoyed a monopoly on citation numbers for years; its subscription products include the Web of Science, the Journal Citation Reports and Essential Science Indicators. "Given the dominance of Thomson in this field it is very welcome to have journal indicators based on an alternative source, Scopus," says Anne-Wil Harzing of the University of Melbourne in Australia, who is developing citation metrics based on Google Scholar.

Jim Pringle, vice-president for development at Thomson, says algorithms similar to PageRank, such as those underlying the SJR and Eigenfactor, have proven their utility on the web, but their use for evaluating science is less well understood. "Both employ complex algorithms to create relative measures and may seem opaque to the user and difficult to interpret," he says.

Thomson is also under fire from researchers who want greater transparency over how citation metrics are calculated and over the data sets used. In a hard-hitting editorial published in the Journal of Cell Biology in December, Mike Rossner, head of Rockefeller University Press, and colleagues say their analyses of databases supplied by Thomson yielded different values for metrics from those published by the company (M. Rossner et al. J. Cell Biol. 179, 1091–1092; 2007).

Moreover, Thomson, they claim, was unable to supply data to support its published impact factors. "Just as scientists would not accept the findings in a scientific paper without seeing the primary data," states the editorial, "so should they not rely on Thomson Scientific's impact factor, which is based on hidden data."

Citation metrics produced by both academics and companies are often challenged, says Pringle. The editorial, he claims, "misunderstands much, and misstates several matters", including the authors' exchanges with Thomson on the affair. On 1 January, the company launched a web forum to formally respond to the editorial (see http://scientific.thomson.com/citationimpactforum).