Free journal-ranking tool enters citation market

A new Internet database lets users generate on-the-fly citation statistics of published research papers for free. The tool also ranks journals using a new algorithm similar to PageRank, the algorithm Google uses to rank web pages. The open-access database is collaborating with Elsevier, the giant Amsterdam-based science publisher, and its underlying data come from Scopus, a subscription abstracts database created by Elsevier in 2004.

The SCImago Journal & Country Rank database was launched in December by SCImago, a data-mining and visualization group at the universities of Granada, Extremadura, Carlos III and Alcalá de Henares, all in Spain. It ranks journals and countries using such citation metrics as the popular, if controversial, Hirsch Index. It also includes a new metric: the SCImago Journal Rank (SJR).

The familiar impact factor created by industry leader Thomson Scientific, based in Philadelphia, Pennsylvania, is calculated as the average number of citations received by the papers that each journal contains. The SJR, by contrast, analyses the citation links between journals in a series of iterative cycles, in the same way as the Google PageRank algorithm. This means not all citations are considered equal; those coming from journals with higher SJRs are given more weight. The main difference between SJR and Google's PageRank is that SJR uses a citation window of three years.

It will take time to assess the SJR properly, experts say. It is difficult to compare the results of SJR journal analyses directly with those based on impact factors, because the underlying databases differ. Thomson's Web of Science abstracts database covers around 9,000 journals and Scopus more than 15,000; in the years covered by the SCImago database, 1996 to 2007, Scopus contains 20-45% more records, says Félix de Moya Anegón, a researcher at the SCImago group.
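The iterative, citation-weighted idea behind SJR, and the Hirsch Index the database also reports, can be sketched in a few lines. This is a minimal illustration using a made-up citation matrix and a PageRank-style damping factor; it is not SCImago's actual algorithm, which among other things restricts citations to a three-year window.

```python
# Toy sketch of a PageRank-style journal ranking and an h-index calculation.
# Journal names, citation counts and the damping factor are illustrative
# assumptions, not SCImago's real data or parameters.

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(ranked, start=1) if c >= i)

def rank_journals(cites, damping=0.85, iterations=50):
    """Iteratively score journals so that citations arriving from
    high-scoring journals count for more, as in PageRank."""
    journals = list(cites)
    n = len(journals)
    score = {j: 1.0 / n for j in journals}  # start from a uniform score
    for _ in range(iterations):
        new = {}
        for j in journals:
            # Each citing journal passes on a share of its own score,
            # proportional to the fraction of its citations that go to j.
            weighted = sum(
                score[src] * cites[src].get(j, 0) / sum(cites[src].values())
                for src in journals
                if src != j
            )
            new[j] = (1 - damping) / n + damping * weighted
        score = new
    return score

# cites[a][b] = number of citations from journal a to journal b (made up)
cites = {
    "J1": {"J2": 30, "J3": 10},
    "J2": {"J1": 5, "J3": 5},
    "J3": {"J1": 20, "J2": 20},
}
scores = rank_journals(cites)
```

Because each journal redistributes its full score every cycle, the scores stay normalized (they sum to 1), and a citation from a highly ranked journal is worth more than one from a lowly ranked journal, which is the property the article describes.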
The top journals in SJR rankings by discipline are often broadly similar to those generated by impact factors, but there are also large differences in position. Immunity (SJR of 9.34) scores higher than The Lancet (1.65), for example, but The Lancet's 2006 impact factor of 25.8 is higher than Immunity's 18.31. Such differences can be understood in terms of popularity versus prestige, says de Moya Anegón: popular journals cited frequently by journals of low prestige have high impact factors and low SJRs, whereas prestigious journals may be cited less often but by more prestigious journals, giving them high SJRs but lower impact factors.

Thomson under fire

The new rankings are welcomed by Carl Bergstrom of the University of Washington in Seattle, who works on a similar citation index, the Eigenfactor, using Thomson data. "It's yet one more confirmation of the importance and timeliness of a new generation of journal ranking systems to take us beyond the impact factor," says Bergstrom, "and another vote in favour of the basic idea of ranking journals using the sorts of eigenvector centrality methods that Google's PageRank uses."

Thomson has enjoyed a monopoly on citation numbers for years; its subscription products include the Web of Science, the Journal Citation Reports and Essential Science Indicators. "Given the dominance of Thomson in this field it is very welcome to have journal indicators based on an alternative source, Scopus," says Anne-Wil Harzing of the University of Melbourne in Australia, who is developing citation metrics based on Google Scholar.

Jim Pringle, vice-president for development at Thomson, says metrics similar to PageRank, such as SJR and Eigenfactor, have proven their utility on the web, but their use for evaluating science is less well understood. "Both employ complex algorithms to create relative measures and may seem opaque to the user and difficult to interpret," he says.
Thomson is also under fire from researchers who want greater transparency over how citation metrics are calculated and the data sets used. In a hard-hitting editorial published in the Journal of Cell Biology in December, Mike Rossner, head of Rockefeller University Press, and colleagues say their analyses of databases supplied by Thomson yielded different values for metrics from those published by the company (M. Rossner et al. J. Cell Biol. 179, 1091-1092; 2007). Moreover, Thomson, they claim, was unable to supply data to support its published impact factors. "Just as scientists would not accept the findings in a scientific paper without seeing the primary data," states the editorial, "so should they not rely on Thomson Scientific's impact factor, which is based on hidden data."

Citation metrics produced by both academics and companies are often challenged, says Pringle. The editorial, he claims, "misunderstands much, and misstates several matters", including the authors' exchanges with Thomson on the affair. On 1 January, the company launched a web forum to respond formally to the editorial (see http://scientific.thomson.com/citationimpactforum).

Nature, Volume 451 (7174) – Jan 2, 2008

Publisher: Springer Journals
Copyright: © 2008 Nature Publishing Group
ISSN: 0028-0836
eISSN: 1476-4687
DOI: 10.1038/451006a


