The power of ‘evidence’: Reliable science or
a set of blunt tools?
Terry Wrigley*
Northumbria University, UK
In response to the increasing emphasis on ‘evidence-based teaching’, this article examines the privileging of randomised controlled trials and their statistical synthesis (meta-analysis). It also pays particular attention to two third-level statistical syntheses: John Hattie’s Visible learning project and the EEF’s Teaching and learning toolkit. The article examines some of the technical shortcomings, philosophical implications and ideological effects of this approach to ‘evidence’ at all three levels. At various points in the article, aspects of critical realism are referenced in order to highlight ontological and epistemological shortcomings of ‘evidence-based teaching’ and its implicit empiricism. Given the invocation of the medical field in this debate, the article points to critiques within that field, including the need to pay attention to professional experience and clinical diagnosis in specific situations. Finally, it briefly locates the appeal to ‘evidence’ within a neoliberal policy framework.
Keywords: evidence; evidence-based teaching; EBM; randomised controlled trials; meta-analysis;
empiricism; critical realism
The need for professionals to draw on evidence rather than political authority or custom and practice is not difficult to argue. However, as with other feelgood words (‘Intelligence’, ‘School Effectiveness’, ‘Leadership’, ‘Accountability’, ‘Standards’), it is important to interrogate the meanings such terms carry, specifically within the current policy framework. Such keywords help produce ideological effects precisely because they appear beyond question, making it harder to investigate their inflection and deployment.
In the case of ‘evidence’—more precisely, evidence-based practice—the ideological
effect is reinforced by the cultural status of numbers in the modern era; numerical
data are presented as objective, unmediated, unbiased and scientific carriers of facts
(Poovey, 1998). In education in particular, under the sway of audit culture (Power,
1997), we have seen an escalation from assessment-based accountability to ‘policy as
numbers’ and ‘governing by numbers’ (Ozga & Lingard, 2007) and now to demands
for ‘evidence-based teaching’.
What now stands proxy for a breadth of evidence is statistical averaging. This mathematical abstraction neglects the contribution of the practitioner’s accumulated experience, a sense of the students’ needs and wishes, and an understanding of social and cultural context. We see the attempted displacement of a rich array of research by the
*School of Health, Community and Education Studies, Northumbria University, Newcastle-upon-
Tyne NE1 8ST, UK. E-mail: email@example.com. Twitter: @terrywrigley1.
© 2018 British Educational Research Association
British Educational Research Journal
Vol. 44, No. 3, June 2018, pp. 359–376