Louise Watson (2008)
Developing Indicators for a New ERA: Should we Measure the Policy Impact of Education Research? Australian Journal of Education, 52
B. Cronin, Lokman Meho (2006)
Using the h-index to rank influential information scientists. J. Assoc. Inf. Sci. Technol., 57
Michael Norris, C. Oppenheim (2010)
Peer review and the h-index: Two studies. J. Informetrics, 4
P. Jacsó (2008)
The pros and cons of computing the h-index using Web of Science. Online Information Review, 32
P. Jacsó (2009)
Database source coverage: hypes, vital signs and reality checks. Online Information Review, 33
Lokman Meho, Kiduk Yang (2007)
Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar. J. Assoc. Inf. Sci. Technol., 58
C. Oppenheim (1996)
Do Citations Count? Citation Indexing and the Research Assessment Exercise (RAE). Serials: The Journal for the Serials Community, 9
T. Braun, W. Glänzel, A. Schubert (2006)
A Hirsch-type index for journals. Scientometrics, 69
P. Jacsó (2009)
Errors of omission and their implications for computing scientometric measures in evaluating the publishing productivity and impact of countries. Online Information Review, 33
G. Prathap
Hirsch-type indices for ranking institutions' scientific research output
H. Moed (2008)
UK Research Assessment Exercises: Informed judgments on research quality or quantity? Scientometrics, 74
P. Jacsó (1997)
Content Evaluation of Databases, 32
T. Lazaridis (2010)
Ranking university departments using the mean h-index. Scientometrics, 82
L. Egghe (2006)
An improvement of the h-index: the g-index, 2
J. Hirsch (2005)
An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences of the United States of America, 102(46)
P. Jacsó (2009)
Calculating the h-index and other bibliometric and scientometric indicators from Google Scholar with the Publish or Perish software. Online Information Review, 33
P. Jacsó (2008)
The pros and cons of computing the h-index using Google Scholar. Online Information Review, 32
P. Jacsó (2007)
How big is a database versus how is a database big. Online Information Review, 31
L.I. Meho, Y. Rogers
Citation counting, citation ranking, and h-index of human-computer interaction researchers: a comparison of Scopus and Web of Science
J. Vanclay (2007)
On the robustness of the h-index. J. Assoc. Inf. Sci. Technol., 58
R. Rousseau, F.Y. Ye (2008)
A proposal for a dynamic h-type index. J. Assoc. Inf. Sci. Technol., 59
M. García-Pérez (2010)
Accuracy and completeness of publication and citation records in the Web of Science, PsycINFO, and Google Scholar: A case study for the computation of h indices in Psychology. J. Assoc. Inf. Sci. Technol., 61
P. Jacsó (2009)
The h-index for countries in Web of Science and Scopus. Online Information Review, 33
P. Jacsó (2008)
Testing the Calculation of a Realistic h-index in Google Scholar, Scopus, and Web of Science for F. W. Lancaster. Library Trends, 56
P. Jacsó
Google Scholar's ghost authors and lost authors
P. Jacsó (2010)
Pragmatic issues in calculating and comparing the quantity and quality of research through rating and ranking of researchers based on peer reviews and bibliometric indicators from Web of Science, Scopus and Google Scholar. Online Information Review, 34
C. Oppenheim (2007)
Using the h-index to rank influential British researchers in information science and librarianship. J. Assoc. Inf. Sci. Technol., 58
P. Jacsó (2010)
Metadata mega mess in Google Scholar. Online Information Review, 34
M. Henzinger, Jacob Suñol, Ingmar Weber (2009)
The stability of the h-index. Scientometrics, 84
E. Garfield (2006)
Citation indexes for science. A new dimension in documentation through association of ideas. International Journal of Epidemiology, 35
Massimo Franceschet (2010)
A comparison of bibliometric indicators for computer science scholars and journals on Web of Science and Google Scholar. Scientometrics, 83
P. Jacsó (2008)
The plausibility of computing the h-index of scholarly productivity and impact using reference-enhanced databases. Online Information Review, 32
Jonathan Levitt, M. Thelwall (2007)
The most highly cited Library and Information Science articles: Interdisciplinarity, first authors and citation patterns. Scientometrics, 78
P. Jacsó (2012)
Google Scholar Metrics for Publications. Online Information Review, 36
J. Bar-Ilan (2010)
Rankings of information and library science journals by JIF and by h-type indices. J. Informetrics, 4
P. Jacsó (2011)
The h-index, h-core citation rate and the bibliometric profile of the Scopus database. Online Information Review, 35
P. Jacsó (2008)
The pros and cons of computing the h-index using Scopus. Online Information Review, 32
Jiang Li, M. Sanderson, P. Willett, Michael Norris, C. Oppenheim (2010)
Ranking of library and information science researchers: Comparison of data sources for correlating citation data, and expert judgments. J. Informetrics, 4
P. Jacsó (2007)
The dimensions of cited reference enhanced database subsets. Online Information Review, 31
Purpose – The purpose of this paper is to examine the new version of the Web of Science (WoS) software released in mid-2011 and its implications for bibliometric analysis.
Design/methodology/approach – The paper profiles the bibliometric traits of WoS in three different configurations (in terms of the composition and time span of the components licensed), using traceable and reproducible quantitative measures.
Findings – The new version of the WoS software eliminated the 100,000-record limit in the search results. This, in turn, makes it possible to study the bibliometric profile of the entire WoS database (which consists of 50 million unique records), and/or any subset licensed by a library. In addition, the maximum record set for the automatic production of the informative citation report was doubled from 5,000 to 10,000 records. These are important developments for getting a realistic picture of WoS, that is, for gauging the most widely used gauge. They also help in comparing WoS with the Scopus database through quantitative measures including the h-index and its variants, the citation rate of the documents making up the h-core (the set of records that contribute to the h-index), and additional bibliometric indicators that can be used as proxies in evaluating the research performance of individuals, research groups, educational and research institutions, and serial publications for the broadest subject areas and time spans, although with some limitations and reservations.
Originality/value – This paper complements the one published in a previous issue of Online Information Review profiling the Scopus database.
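The two central indicators in the abstract, the h-index and the h-core citation rate, follow directly from their standard definitions (Hirsch, 2005). As a minimal illustration, not code from the paper, they can be computed from a list of per-document citation counts like this:

```python
def h_index(citations):
    """Largest h such that h documents each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank       # this document still qualifies for the h-core
        else:
            break          # counts are sorted, so no later document can qualify
    return h

def h_core_citation_rate(citations):
    """Mean citations of the h-core (the h documents defining the h-index)."""
    counts = sorted(citations, reverse=True)
    h = h_index(counts)
    return sum(counts[:h]) / h if h else 0.0
```

For example, a set of documents cited 10, 8, 5, 4 and 3 times has h = 4 (four documents with at least 4 citations each), and its h-core citation rate is (10 + 8 + 5 + 4) / 4 = 6.75.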
Online Information Review – Emerald Publishing
Published: Sep 27, 2011
Keywords: Databases; Research