Purpose – The purpose of this paper is to examine the way in which library and information science (LIS) issues have been handled in the formulation of recent UK Higher Education policy on research quality evaluation.

Design/methodology/approach – A chronological review of decision making about digital rights arrangements for the 2008 Research Assessment Exercise (RAE), and of recent announcements about the new shape of metrics‐based assessment in the Research Excellence Framework, which supersedes the RAE. Against this chronological framework, the likely nature of LIS practitioner reactions to the flow of decision making is suggested.

Findings – It was found that a weak grasp of LIS issues by decision makers undermines the process whereby effective research evaluation models are created. LIS professional opinion should be sampled before key decisions are made.

Research limitations/implications – This paper makes no sophisticated comment on the complex research issues underlying advanced bibliometric research evaluation models. It does point out that sophisticated and expensive bibliometric consultancies arrive at many conclusions about metrics‐based research assessment that are common knowledge amongst LIS practitioners.

Practical implications – Practical difficulties arise when a decision to move to a new and specific type of research evaluation indicator is announced before anything specific about that indicator has been worked out.

Originality/value – This paper underlines the importance of information management issues to the mainstream concerns of government and public administration. Its most valuable conclusion is that, because LIS issues are now at the heart of democratic decision making, LIS practitioners and professionals should be given some sort of role in advising on such matters.
Library Review – Emerald Publishing
Published: May 23, 2008
Keywords: Information operations; Research; Performance measures; Quality assessment; United Kingdom; Universities