A framework for designing retrieval effectiveness studies of library information systems using human relevance assessments


Journal of Documentation, Volume 73 (3): 19 – May 8, 2017



Publisher: Emerald Publishing
Copyright: Copyright © Emerald Group Publishing Limited
ISSN: 0022-0418
DOI: 10.1108/JD-08-2016-0099

Abstract

Purpose – The purpose of this paper is to demonstrate how to apply traditional information retrieval (IR) evaluation methods, based on standards from the Text REtrieval Conference (TREC) and web search evaluation, to all types of modern library information systems (LISs), including online public access catalogues, discovery systems, and digital libraries that provide web search features to gather information from heterogeneous sources.

Design/methodology/approach – The authors apply conventional procedures from IR evaluation to the LIS context, considering the specific characteristics of modern library materials.

Findings – The authors introduce a framework consisting of five parts: search queries, search results, assessors, testing, and data analysis. The authors show how to deal with comparability problems resulting from diverse document types (e.g. electronic articles vs printed monographs) and what issues need to be considered for retrieval tests in the library context.

Practical implications – The framework can be used as a guideline for conducting retrieval effectiveness studies in the library context.

Originality/value – Although a considerable amount of research has been done on IR evaluation, and standards for conducting retrieval effectiveness studies do exist, to the authors' knowledge this is the first attempt to provide a systematic framework for evaluating the retrieval effectiveness of twenty-first-century LISs. The authors demonstrate which issues must be considered and what decisions must be made by researchers prior to a retrieval test.
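As an illustration of the framework's data analysis step, the following is a minimal sketch in Python of how graded human relevance assessments for a single query might be turned into standard retrieval effectiveness measures such as precision at k and nDCG, two metrics commonly reported in TREC-style evaluations. The function names, grade scale, and example data are illustrative assumptions, not taken from the paper.

```python
import math

def precision_at_k(grades, k):
    """Fraction of the top-k results judged relevant (grade > 0)."""
    return sum(1 for g in grades[:k] if g > 0) / k

def dcg_at_k(grades, k):
    """Discounted cumulative gain over the top-k graded assessments."""
    return sum(g / math.log2(rank + 2) for rank, g in enumerate(grades[:k]))

def ndcg_at_k(grades, k):
    """DCG normalized by the ideal (descending) ordering of the same grades."""
    ideal = dcg_at_k(sorted(grades, reverse=True), k)
    return dcg_at_k(grades, k) / ideal if ideal > 0 else 0.0

# Hypothetical assessor grades for the top five results of one query:
# 0 = not relevant, 1 = partially relevant, 2 = relevant.
grades = [2, 0, 1, 2, 0]
print(precision_at_k(grades, 5))        # 0.6
print(round(ndcg_at_k(grades, 5), 3))   # ~0.894
```

Evaluating all compared systems to the same ranking depth k, as these cutoff-based measures do, is one way to address the comparability problems the framework raises for heterogeneous result lists.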

Journal

Journal of Documentation, Emerald Publishing

Published: May 8, 2017
