TY - JOUR AU - Rossi, Federica AB - Abstract Using data from the UK, this study explores the institutional and environmental factors that influence universities’ efficiency in knowledge transfer. While studies of universities’ knowledge transfer performance have so far focused on patent commercialisation and research contracting with industry, it is increasingly acknowledged that universities engage in a broader range of knowledge transfer activities, including consulting, public engagement and provision of knowledge-intensive services. When these are taken into account, less research-intensive universities, and those with a greater share of staff in the arts and humanities, improve their relative efficiency. More specialised, older and larger institutions are more efficient performers, while research intensity is no longer a strong predictor of efficiency. 1. Introduction Universities are increasingly encouraged to transfer knowledge to stakeholders in business, government and society more generally (Grady and Pratt, 2000; Vorley and Lawton Smith, 2007; Nelles and Vorley, 2010) in order to facilitate spillovers of knowledge, and thus more effectively contribute to economic growth. Engagement in knowledge transfer has been institutionalised as a third mission for universities (Lockett et al., 2014; Pinheiro et al., 2015), as important as their long-standing commitment to teaching and research (Etzkowitz, 2002; Lawton Smith, 2007). Institutionalisation has largely been driven by policy incentives (Sánchez-Barrioluengo, 2014; Pinheiro et al., 2015). Since the late 1990s, policymakers in many countries have encouraged universities to commercialise research results through patents and spinoff companies (Mowery and Sampat, 2005; Geuna and Rossi, 2011), and to collaborate with industry (Bozeman, 2000; Perkmann et al., 2011), through measures such as the provision of knowledge transfer support infrastructures (as in Denmark, Norway and Sweden; Bourelos et al., 2012), the benchmarking of universities’ knowledge transfer performance through surveys (as in the USA, Canada and Australia; Jensen et al., 2009; Association of University Technology Managers, 2014), and the allocation of funds based on universities’ knowledge transfer performance (as in the UK; Rossi and Rosli, 2015). Consequently, clearer understanding of the factors that drive good knowledge transfer performance is important both to university managers, who can appropriately adjust institutional policies, practices and structures (Berbegal Mirabent and Solé Parellada, 2012; Curi et al., 2012), and to policymakers, who can devise ways to encourage universities’ performance, for example through the introduction of suitable incentive systems (Ranga and Etzkowitz, 2013). Knowledge transfer performance has many dimensions. While the current policy debate mainly concerns the development of metrics to precisely and comprehensively capture knowledge transfer outputs (Guerrero et al., 2015), another important performance dimension is what management research calls ‘efficiency’ (Drucker, 1977; Griffin, 1987): ‘how well’ universities use their resources to produce outputs, rather than ‘how much’ of these outputs they produce. Universities largely finance their knowledge transfer activities with the same resources used to support their research and teaching missions:1 although a substantial literature has investigated efficiency in research and teaching, there is still limited understanding of how to measure efficiency in knowledge transfer and of what factors underpin universities’ efficient performance in this domain. Moreover, most studies of knowledge transfer efficiency focus on the embedding of university research findings into intellectual property (IP) protection instruments such as patents, or on their commercialisation through licenses (Grady and Pratt, 2000). International empirical evidence has shown, however, that patenting and licensing are highly concentrated (mainly in chemistry, pharmacy, biotechnology, information technology and engineering: Levin, 1986; Harabi, 1995), and that only a few institutions file many patents (Henderson et al., 1998) and profit from their commercialisation (Bulut and Moschini, 2006). Most universities instead engage with external stakeholders through other channels (Laursen and Salter, 2004; D’Este and Patel, 2007; Bekkers and Bodas Freitas, 2008). Hughes and Kitson (2012) distinguish between knowledge transfer interactions that are people-based (student placements, employee training, standard-setting forums, network participation), community based (public exhibitions, community lectures), or aimed at problem-solving (contract research, use of physical facilities, personal and informal advice), commercialisation (patents, licenses, spinoff companies) or consulting. Montesinos et al. (2008) categorise knowledge transfer activities according to their objectives and intended customers (generating income by providing services; transferring technology to businesses; benefitting society). Other classifications distinguish between the provision of research outputs, consulting or education services (Jones-Evans and Klofsten, 2000; Arundel and Geuna, 2004; De Silva et al., 2012). Relying upon data from the UK, this study explores the institutional and environmental determinants of universities’ efficiency in performing a broad range of knowledge transfer activities, and analyses the implications of such broader approach to efficiency measurement. It contributes to a multifaceted debate on what ‘good’ knowledge transfer performance means; how universities’ relative performance can be assessed; and what factors help universities achieve better performance. The UK provides an interesting case for several reasons. First, this is one of the few countries where extensive data on universities’ knowledge transfer activities is collected systematically. The annual Higher Education Business and Community Interaction survey (HEBCI), managed by the UK’s Higher Education Statistics Agency (HESA), collects data on universities’ engagement in numerous knowledge transfer activities, which are comparable across institutions and over time. Second, the UK has pioneered several national policy instruments that aim to encourage knowledge transfer by rewarding universities’ performance. The Higher Education Innovation Fund, introduced in 2001, allocates funds to universities based on the systematic assessment of their knowledge transfer performance, relying on data from the HEBCI survey. Since 2014, the Research Excellence Framework distributes research funds to universities partly on the basis of an assessment of the socioeconomic impact of their academic research; while primarily directed at improving research quality, this exercise may incentivise universities to more intensively engage in knowledge transfer as a means to achieve broader impact. Policymakers in other countries may look to the UK’s innovative policies for guidance in the development of their own approaches (Jensen et al., 2009). The paper is structured as follows. The next section briefly reviews the literature on universities’ efficiency in knowledge transfer, and explores some of the issues that arise when extending the analysis to a broader range of activities. Section 3 introduces the data and the methodology. The results of the empirical analysis are presented in Section 4, while Section 5 draws some conclusions and implications for management and policy. 2. The efficiency of universities’ knowledge transfer activities: broadening the framework 2.1. Current approaches to measuring knowledge transfer efficiency In the last two decades, a growing literature has investigated the efficiency of university institutions: Berbegal Mirabent and Solé Parellada (2012), limiting their search to those studies that measured efficiency using non-parametric techniques, identified 45 articles produced between 2001 and 2010. Within this area of investigation, most studies focus on the efficiency of teaching and/or research activities, and only a subset analyse efficient knowledge transfer performance. The latter stream of literature is fragmented and empirically oriented; data are sourced from different countries, and efficiency is measured with different techniques (Siegel et al., 2007). Most of these contributions seek to understand the factors driving the efficient performance of knowledge transfer offices (KTOs). In particular, they examine which KTOs best convert the inputs they work with (e.g. new knowledge, new discoveries) into outputs (e.g. licenses) given their resources (staff, funding), and the impact of several institutional and environmental factors on the KTOs’ efficiency. Efficiency studies are usually based on a production function framework, where a frontier of efficient combinations of inputs and outputs is constructed empirically and an organisational unit’s technical inefficiency (inability to produce the maximum amount of output given one’s inputs, or inability to minimise the use of inputs given one’s output) is measured in terms of distance from the frontier. Such a frontier can be estimated parametrically using stochastic frontier estimation (SFE; Aigner et al., 1977; Meeusen and Van den Broeck, 1977) or non-parametrically using data envelopment analysis (DEA; Charnes et al., 1978). Table 1 summarises the approaches adopted by studies that investigate the efficiency of universities’ KTOs.2 The methods used include SFE, DEA and regression analyses on several knowledge transfer outputs. The transformation process that is modelled comprises either the commercialisation of research results (Siegel et al., 2003; Chapple et al., 2005; Ken et al., 2009), the embedding of university research findings into IP (Curi et al., 2012), the engagement in research contracting or a combination of these (Rogers et al., 2000; Thursby and Kemp, 2002; Anderson et al., 2007; Caldera and Debande, 2010; Berbegal Mirabent et al., 2013; Ho et al., 2014). Hence, the range of knowledge transfer outputs considered includes invention disclosures, patents applied for and granted and licenses issued, with a few studies also adding research agreements and spinoff companies. Table 1. Studies on the efficiency of universities’ knowledge transfer operations Study  Focus  Method  Inputs  Outputs  Rogers, Yin and Hoffmann (2000)  131 US universities (1996)  Regression    Number of invention disclosures, number of U.S. patents filed, number of licenses, number of start-up companies, gross licensing income  Thursby and Kemp (2002)   112 US universities (1991-1996)  DEA  Number of KTO staff, amount of government funds received, number and quality of faculty in several subjects  Sponsored research agreements with industry, number of licenses to private sector firms, royalty payments received, number of invention disclosures, university patent applications  Siegel, Waldman and Link (2003)  113 US universities (1991–1996)  SFE  Number of invention disclosures, number of KTO employees, legal expenditures  Number of licences or licensing income  Chapple et al. (2005)  50 UK universities (2002)  SFE and DEA  Number of invention disclosures, total research income, number of KTO staff, external legal intellectual property expenditure  Number of licences or licensing income  Anderson, Daim and Lavoie (2007) and Kim, Anderson and Daim (2008)  54 US universities (2001-2004)  DEA  Total research spending  Licensing income, licenses and options executed, startup companies, patents filed, patents issued  Ken et al. (2009)  94 US universities (2006)  DEA  Research expenditures, published articles, patents issued and invention disclosures  Licensing income  Caldera and Debande (2010)  52 Spanish universities (2001-2005)  Regression    R&D contracts income, number of R&D contracts, licensing income, number of licensing agreements and number of spin-offs  Curi, Daraio and Llerena (2012)  51 French universities (2003-2007)  DEA  Number of full time equivalent KTO employees, number of publications of the university  Patent applications, software applications  Berbegal-Mirabent, Lafuente and Solé (2013)  44 Spanish universities (2006-2009)  DEA  Total faculty, administrative staff, administrative expenses, R&D income  Graduates, number of papers published, number of spin offs created  Ho et al. (2014)  US universities  DEA  Fundin g resources  Patenting activities, licensing and entrepreneurship  Study  Focus  Method  Inputs  Outputs  Rogers, Yin and Hoffmann (2000)  131 US universities (1996)  Regression    Number of invention disclosures, number of U.S. patents filed, number of licenses, number of start-up companies, gross licensing income  Thursby and Kemp (2002)   112 US universities (1991-1996)  DEA  Number of KTO staff, amount of government funds received, number and quality of faculty in several subjects  Sponsored research agreements with industry, number of licenses to private sector firms, royalty payments received, number of invention disclosures, university patent applications  Siegel, Waldman and Link (2003)  113 US universities (1991–1996)  SFE  Number of invention disclosures, number of KTO employees, legal expenditures  Number of licences or licensing income  Chapple et al. (2005)  50 UK universities (2002)  SFE and DEA  Number of invention disclosures, total research income, number of KTO staff, external legal intellectual property expenditure  Number of licences or licensing income  Anderson, Daim and Lavoie (2007) and Kim, Anderson and Daim (2008)  54 US universities (2001-2004)  DEA  Total research spending  Licensing income, licenses and options executed, startup companies, patents filed, patents issued  Ken et al. (2009)  94 US universities (2006)  DEA  Research expenditures, published articles, patents issued and invention disclosures  Licensing income  Caldera and Debande (2010)  52 Spanish universities (2001-2005)  Regression    R&D contracts income, number of R&D contracts, licensing income, number of licensing agreements and number of spin-offs  Curi, Daraio and Llerena (2012)  51 French universities (2003-2007)  DEA  Number of full time equivalent KTO employees, number of publications of the university  Patent applications, software applications  Berbegal-Mirabent, Lafuente and Solé (2013)  44 Spanish universities (2006-2009)  DEA  Total faculty, administrative staff, administrative expenses, R&D income  Graduates, number of papers published, number of spin offs created  Ho et al. (2014)  US universities  DEA  Fundin g resources  Patenting activities, licensing and entrepreneurship  View Large Table 1. Studies on the efficiency of universities’ knowledge transfer operations Study  Focus  Method  Inputs  Outputs  Rogers, Yin and Hoffmann (2000)  131 US universities (1996)  Regression    Number of invention disclosures, number of U.S. patents filed, number of licenses, number of start-up companies, gross licensing income  Thursby and Kemp (2002)   112 US universities (1991-1996)  DEA  Number of KTO staff, amount of government funds received, number and quality of faculty in several subjects  Sponsored research agreements with industry, number of licenses to private sector firms, royalty payments received, number of invention disclosures, university patent applications  Siegel, Waldman and Link (2003)  113 US universities (1991–1996)  SFE  Number of invention disclosures, number of KTO employees, legal expenditures  Number of licences or licensing income  Chapple et al. (2005)  50 UK universities (2002)  SFE and DEA  Number of invention disclosures, total research income, number of KTO staff, external legal intellectual property expenditure  Number of licences or licensing income  Anderson, Daim and Lavoie (2007) and Kim, Anderson and Daim (2008)  54 US universities (2001-2004)  DEA  Total research spending  Licensing income, licenses and options executed, startup companies, patents filed, patents issued  Ken et al. (2009)  94 US universities (2006)  DEA  Research expenditures, published articles, patents issued and invention disclosures  Licensing income  Caldera and Debande (2010)  52 Spanish universities (2001-2005)  Regression    R&D contracts income, number of R&D contracts, licensing income, number of licensing agreements and number of spin-offs  Curi, Daraio and Llerena (2012)  51 French universities (2003-2007)  DEA  Number of full time equivalent KTO employees, number of publications of the university  Patent applications, software applications  Berbegal-Mirabent, Lafuente and Solé (2013)  44 Spanish universities (2006-2009)  DEA  Total faculty, administrative staff, administrative expenses, R&D income  Graduates, number of papers published, number of spin offs created  Ho et al. (2014)  US universities  DEA  Fundin g resources  Patenting activities, licensing and entrepreneurship  Study  Focus  Method  Inputs  Outputs  Rogers, Yin and Hoffmann (2000)  131 US universities (1996)  Regression    Number of invention disclosures, number of U.S. patents filed, number of licenses, number of start-up companies, gross licensing income  Thursby and Kemp (2002)   112 US universities (1991-1996)  DEA  Number of KTO staff, amount of government funds received, number and quality of faculty in several subjects  Sponsored research agreements with industry, number of licenses to private sector firms, royalty payments received, number of invention disclosures, university patent applications  Siegel, Waldman and Link (2003)  113 US universities (1991–1996)  SFE  Number of invention disclosures, number of KTO employees, legal expenditures  Number of licences or licensing income  Chapple et al. (2005)  50 UK universities (2002)  SFE and DEA  Number of invention disclosures, total research income, number of KTO staff, external legal intellectual property expenditure  Number of licences or licensing income  Anderson, Daim and Lavoie (2007) and Kim, Anderson and Daim (2008)  54 US universities (2001-2004)  DEA  Total research spending  Licensing income, licenses and options executed, startup companies, patents filed, patents issued  Ken et al. (2009)  94 US universities (2006)  DEA  Research expenditures, published articles, patents issued and invention disclosures  Licensing income  Caldera and Debande (2010)  52 Spanish universities (2001-2005)  Regression    R&D contracts income, number of R&D contracts, licensing income, number of licensing agreements and number of spin-offs  Curi, Daraio and Llerena (2012)  51 French universities (2003-2007)  DEA  Number of full time equivalent KTO employees, number of publications of the university  Patent applications, software applications  Berbegal-Mirabent, Lafuente and Solé (2013)  44 Spanish universities (2006-2009)  DEA  Total faculty, administrative staff, administrative expenses, R&D income  Graduates, number of papers published, number of spin offs created  Ho et al. (2014)  US universities  DEA  Fundin g resources  Patenting activities, licensing and entrepreneurship  View Large Both Chapple et al. (2005) and Curi et al. (2012) find that KTOs exhibit low levels of absolute efficiency, and large inter-organisational variations. KTOs’ performance and efficiency are found to depend on the characteristics of the university institution, such as faculty quality, subject specialisation in biological sciences and engineering, private ownership, and presence of a medical school or university hospital (Thursby and Kemp, 2002; Anderson et al., 2007; Curi et al., 2012). University policies, including the definition of clear rules for academics (such as the regulation of potential conflicts of interest and the allocation of a larger proportion of royalties to the inventor), can improve performance by incentivising academics to engage in knowledge transfer (Caldera and Debande, 2010; Link and Siegel, 2005; Friedman and Silberman, 2003; Debackere and Veugelers, 2005; Belenzon and Schankerman, 2009; Lach and Schankerman, 2004). Also important are the economic characteristics of the region where the institution is based, and the characteristics of KTOs, including size, age, management practices and organisational structure (Bercovitz et al., 2001; Siegel et al., 2003; Debackere and Veugelers, 2005). However, some findings are contradictory. Some studies suggest that having a university hospital or medical school exerts a positive effect on efficiency (Siegel et al., 2003), while others find the opposite (Thursby and Kemp, 2002; Chapple et al., 2005; Anderson et al., 2007; Curi et al., 2012). Some find a positive effect of the KTO’s size on efficiency (Rogers et al., 2000; Thursby and Kemp, 2002; Caldera and Debande, 2010; Curi et al., 2012), while others find a negative effect (Chapple et al., 2005). Siegel et al. (2003) find that licensing revenues display increasing returns to scale and that licensing agreements display constant returns, while Chapple et al. (2005) find evidence of decreasing returns. Curi et al. (2012) find that patent filing exhibits variable returns to scale. While most of these studies aim to identify what determines the efficient performance of KTOs, it is increasingly acknowledged by academics and policymakers alike that knowledge transfer entails varied activities, not all of which are mediated by the university’s KTO (Research Councils UK, 2007; Hughes and Kitson, 2012). In order to explore universities’ efficiency in fulfilling their knowledge transfer mission, the analysis should include a broader range of outputs beyond patenting, licensing, spinoff creation and research contracting with industry. 2.2. Broadening the framework Broadening the analysis of knowledge transfer efficiency to include a larger set of knowledge transfer activities is fraught with problems. First, universities fulfil their knowledge transfer mission through many different activities, which may include: delivering knowledge-intensive services, such as consultancies, clinical tests, prototypes and professional development courses (CPDs); engaging in direct exchanges of knowledge with industry via informal networks and personnel exchanges; contributing to regeneration programmes impacting local communities; engaging with the public through different media; supporting academic spinoffs and graduate start-ups; and producing, protecting and licensing IP. Not only may this list may incomplete, but data on some of these activities are very unreliable, when they are collected at all. To complicate matters, it is sometimes difficult to distinguish between knowledge transfer, research and teaching activities. Second, the outputs of these varied activities cannot all be measured using the same metrics. While some can be measured in terms of the monetary income they produce, not all of them generate income. Moreover, outputs can be very different in terms of quality and impact. Third, varied types of universities, differing in research intensity, subject specialisation, resources and engagement with their environment, very often have very different knowledge transfer strategies (Deiaco et al., 2012), so it is difficult to identify appropriate terms of comparison. Hewitt-Dundas (2012) shows that British universities with high and low research intensity pursue different knowledge transfer objectives, strategies and activities. The former primarily emphasise collaborations with industry, while the latter prioritise supporting SMEs and developing human capital (widening access to education, retaining graduates in the region and contributing to local skills needs). These different priorities are reflected in their different activities and income sources. Highly research-intensive universities undertake more contract research, collaborative research and licensing, while less research-intensive ones undertake more consultancies and CPDs. The former accrue more income from research councils, international sources and large businesses, while the latter’s main sources of income are the national government, SMEs and non-commercial partners. Wright et al. (2008) suggest that, compared with top universities, mid-range universities in the UK, Belgium, Germany and Sweden engage in a broader range of knowledge transfer activities. Since university departments include staff specialised in closely linked subjects that tend to rely upon similar knowledge transfer channels, they may be more comparable than entire institutions. In fact, several studies of efficiency in research and/or teaching take departments as their units of analysis (Johnes and Johnes, 1993; Chiang and His-Tai, 2006; Agasisti and Bonomi, 2013). However, departmental data on knowledge transfer activities are rarely available, and all the studies that have analysed efficiency in knowledge transfer to date have used institution-level data. The following Figure 1 presents a conceptual model of knowledge transfer as a process of transformation of generic (teaching and research resources) and dedicated (internal knowledge transfer resources) university inputs, into knowledge transfer outputs. It extends the frameworks previously adopted in the analysis of knowledge transfer efficiency, such as those presented by Thursby and Thursby (2002) and Anderson et al. (2007), in two main ways. First, it considers a broader range of knowledge transfer outputs. While previous studies have focused on IP disclosures and sometimes their further commercial exploitation by means of licensing and spinning off, this model also includes other activities: stipulation of contracts and consultancies, provision of CPDs, public engagement. Second, the model acknowledges that some knowledge transfer outputs, such as spinoffs and licenses, result from a secondary transformation process that uses other knowledge transfer outputs (university IP) as inputs. Often, this secondary transformation process builds upon additional, dedicated inputs: for example, spinoffs rely on dedicated infrastructures like incubators and science parks, as well as on additional sources of seed funding; patenting and licensing operations are often outsourced to external agencies. In a study that similarly conceptualised knowledge transfer as a two-stage process of research ‘concretisation’ (resulting in patents) and research ‘commercialisation’ (leading to licensing and spinoffs), where the former are inputs for the latter, Ho et al. (2014) found that indeed these are two different transformation processes that build on different capabilities. Since this study aims to analyse the university’s efficiency in transforming internal resources into a broad range of outputs, rather than to track the KTO’s efficiency in protecting and commercialising IP, it focuses on the former process only. Figure 1. View largeDownload slide A conceptual model of the knowledge transfer transformation process Figure 1. View largeDownload slide A conceptual model of the knowledge transfer transformation process This simple conceptual model is used, in the next section, to explain the study’s methodological choices in order to measure knowledge transfer efficiency; it does not capture all the complex transformation processes occurring within universities, of which knowledge transfer constitutes only a part, nor all the possible knowledge transfer activities that universities can, in principle, perform. As a further step, the study also assesses what institutional and environmental factors drive greater efficiency. Institutional factors can affect the efficiency of the university’s knowledge transfer operations, because they constrain the availability of inputs to be deployed in knowledge transfer processes, the manner of their deployment and the opportunities to generate knowledge transfer outputs. The university’s research intensity matters: the greater the share of resources that a university dedicates to research activities, the more it can—at least according to the established ‘technology transfer’ model based on patenting, contract research and spinoff creation—generate knowledge outputs that can be transferred to external stakeholders (Antonelli, 2008). For a given amount of generic resources employed, institutions that are more research intensive can be expected to produce more patents and engage in more contract research with industry partners (Hewitt-Dundas, 2012) and hence enjoy greater efficiency in the production of these outputs. At the same time, opportunities for knowledge transfer also arise from teaching activities (for example, universities can offer CPDs, and support student entrepreneurship), so when a broad range of knowledge transfer outputs is considered, both research-intensive and teaching-intensive institutions can be expected to achieve efficiency, although possibly with different output mixes. The institutional support provided to knowledge transfer activities can also influence their efficiency. An institution that invests in more and better-trained knowledge transfer staff can generate more opportunities for knowledge transfer with the same generic inputs: it has been shown that having more competent and experienced knowledge transfer staff results in better performance (Friedman and Silberman, 2003; Lockett and Wright, 2005; Siegel et al., 2003). As knowledge transfer entails a variety of activities, universities that engage in a large range of academic subjects can be expected to better capture any opportunities that come along (Wright et al., 2008): given the same inputs, universities with a diversified subject profile should be able to generate more knowledge transfer outputs. The age, size and subject mix of the university may also matter. Since larger, older institutions often enjoy greater reputation, they can access better-quality inputs and exploit more opportunities to generate outputs. The size of the university is positively related to the level of knowledge transfer (Belenzon and Schankerman, 2009), measured in terms of private research funds (Von Tunzelmann and Kraemer Mbula, 2003), interactions with companies (Bruno and Orsenigo, 2003; Landry et al., 2007) and spinoff creation (O’Shea et al., 2005). Subject specialisation can play a role in efficiency, since industry collaborates more frequently with academics in applied sciences like engineering, computer science, biotechnology and medicine (Levin et al., 1987; Schartinger et al., 2001; Meyer-Krahmer and Schmoch, 1998; Cohen et al., 2002). This is not to say, however, that other academics do not transfer knowledge: academics in the social sciences and arts and humanities engage in public lectures and performances, media involvement, regeneration and community projects (Castro-Martinez et al., 2010; Olmos-Penuela et al., 2011; Hughes and Kitson, 2012). Environmental factors may affect both the availability of inputs and the opportunities to generate outputs. Since research-intensive firms are more likely to contract out research to universities and rely on academic consultancies, the research intensity of local firms may affect the production of knowledge transfer outputs. The literature has consistently found positive effects on efficiency of various measures of research intensity of local businesses, including level of industrial R&D in the region, rate of growth of private R&D, business R&D expenditure per capita and share of employment in high-tech sectors (Chapple et al., 2005; Curi et al., 2012; Berbegal Mirabent et al., 2013). Many studies also consider the general economic prosperity of the region, since more prosperous regions are more likely to include better-performing firms, and therefore offer more opportunities for knowledge transfer. Here the results have been mixed, with regional economic performance having either positive (Chapple et al., 2005), negative (Curi et al., 2012) or no effects (Siegel et al., 2003) on efficiency, suggesting that the composition of regional industry matters more than overall regional economic performance. Other environmental factors that can influence the efficiency of university institutions are legislation and regulations and public policies supporting knowledge transfer. 3. Methodology 3.1. Data Data for this study are drawn from three sources. The first is the HEBCI survey, which collects information about universities’ knowledge transfer infrastructures, strategies and engagement. The HEBCI survey has run since 1999, although it has changed over time so not all variables are available across all years. It currently covers all of the country’s 161 institutions, of which 131 are based in England, 16 in Scotland, 10 in Wales and 4 in Northern Ireland; 134 are universities that offer undergraduate, postgraduate and doctoral degrees, while 27 are higher-education colleges that do not grant doctoral degrees and are mainly specialised in music, visual and performing arts or agriculture. The survey includes a broad range of activities spanning collaborative research and regeneration programmes, contract research, consultancies, IP protection and licensing, spinoff activities, CPDs, equipment and facilities-related services and public engagement (public lectures, performances, exhibitions, museum education and other events). It measures a greater variety of knowledge transfer outputs than other well-known surveys, such as the AUTM survey in the USA or the NSRC in Australia (Rosli and Rossi, 2015). The second source is the database Heidi, which contains data on universities’ financial, human and capital resources, as well as their teaching and research activities. Finally, information on regional gross value added per capita and regional business R&D expenditure (BERD) has been collected from the UK’s Office for National Statistics. 3.2. Empirical strategy Efficiency comparisons between different organisations engaging in similar transformation processes are usually performed using either SFE (Aigner et al., 1977; Meeusen and Van den Broeck, 1977) or DEA (DEA; Charnes et al., 1978). SFE requires the estimation of a production function, where differences in performance across units are attributed to an error term, εi, which has two components (εi = Vi - Ui): statistical noise Vi (a symmetric, independently and normally distributed random error component) and an inefficiency component Ui. The latter is a non-negative error term which accounts for the failure to produce the maximum output, given the set of inputs used; it is assumed to be independently and half-normally distributed (as units are either on the frontier or below it). To test the determinants of inefficiency, the term Ui is then regressed onto a set of independent and control variables. In more recent models (Battese and Coelli, 1995), both the production function (including the inefficiency term) and the determinants of relative inefficiency are estimated simultaneously. While SFE, as a parametric approach, has several advantages (see Chapple et al., 2005, for a discussion), it can only model single-output production processes. The analysis of processes that involve the simultaneous production of different outputs require to either estimate alternative models (one for each different output, as in Chapple et al., 2005) or aggregate the outputs using a common metrics such as their monetary value, if market prices exist (Ray, 2004). The other widely used approach to efficiency computation, DEA, consists in numerically computing an efficiency frontier of the best-performing units, and positioning the other units in relation to this frontier. This method has been used extensively to compute the efficiency of production processes that generate multiple outputs, some of which may not have easily identifiable market prices, such as in the education sector (Charnes et al., 1994). It is therefore more suitable to computing the efficiency of knowledge transfer processes when not all relevant outputs can be expressed in monetary terms. Fitting the linear frontier requires identifying, for each combination of inputs used by the observed units, the maximum output that could be produced given that input combination: the set of maximum output/input ratios constitutes the efficient frontier, and the relative efficiency of each unit can be computed by comparing the unit’s actual output with the maximum output that could be produced using the same combination of inputs. In practice, this requires solving a linear program: finding the set of weights that maximise each unit’s average productivity (ratio of its weighted combination of outputs to its weighted combination of inputs) subject to the constraints that all weights are non-negative and all ratios are smaller or equal to one (maximum efficiency is imposed to be equal to 1). Once the efficient frontier has been computed, the efficiency of each unit relative to the frontier is measured using a distance function. Usually, inefficiency is presented in terms of a score 1λ≤1 (Shepherd distance) which identifies the fraction of the unit’s actual output to its corresponding optimal output (the maximum output obtainable given the combination of inputs used by the unit): correspondingly, the reciprocal score λ≥1 (Farrell-Debreu distance) identifies the increase in output that the unit would need to accomplish if it was to become technically efficient. DEA models come in different specifications. Efficiency can be computed in terms of maximum output that can be produced given a certain combination of inputs (output-oriented model), or of minimum inputs that can be used to produce a given output (input-oriented model). The model can accommodate constant or variable returns to scale. Different types of efficiency frontiers can be fitted: piecewise linear (‘convex hull’; Charnes et al., 1978), or staircase shaped (‘free disposal hull’; Deprins et al., 1984). This study uses DEA to measure the efficiency of universities’ knowledge transfer processes by positioning each university relative to the non-parametric frontier of the most efficient institutions. The choice of outputs and inputs used in the measurement of efficiency follows the model shown in Figure 1, taking into account the availability of reliable and comparable data. Income, when available, is considered as a reliable measure of output: engagement in research contracts, consultancies3 and CPDs is proxied by the income they accrued. The use of income, rather than just the number of contracts, also accounts for the fact that contracts can differ widely in quality and value. Two further outputs that are not measured in monetary terms are then considered. First, the number of IP disclosures, which include inventions, computer software and databases, literary and artistic works, educational software and multimedia, industrial designs, trademarks, integrated circuit topographies and new plant or animal varieties. This encompassing definition better captures the amount of new IP generated than the simple number of patents filed. Considering the number of disclosures, rather than the income generated from their licensing or sale, is not only consistent with the model presented in Figure 1, but also accounts for the fact that some disclosures (such as software released through open source licenses) do not produce income (Sorensen and Chambers, 2008). Second, outputs include the number of academic staff days employed in public engagement. Some knowledge transfer outputs are not included in the model. Income from services and equipment-related services is excluded, because this figure comprises room rentals, a service that does not entail transfer of knowledge. Incomes from collaborative research and regeneration and development programmes are also excluded. Both incomes usually derive from public funds allocated competitively, primarily by the research councils and by international and national governmental bodies. The reasons for excluding them are twofold. First, it is not possible to distinguish between income used to support research and income used for knowledge transfer activities. Second, the possibility to access these funds strongly depends on rules issued by the funding bodies (for example, only universities in certain regions may be eligible for regeneration funding), which we are not able to fully control for. It is possible that some forms of regeneration income may be recorded under different headings (such as consultancy or CPD income), although the amounts are likely to be small.4 Finally, consistent with the model presented in Figure 1, spinoffs are excluded as they generally build upon dedicated resources, hence their cost would not be accounted for in the generic inputs measured. Inputs comprise several general resources that universities use in the production and transfer of knowledge (research and teaching funds from funding council grants and from tuition fees, number of academic staff) as well as some specific resources dedicated to knowledge transfer (number of staff employed in knowledge transfer functions). Since there may be a lag between the use of inputs and the production of outputs, five-year averages of the period 2006–2011 are deployed for both inputs and outputs5 (averages over several years have also been used by Thursby and Kemp, 2002; Anderson et al., 2007; Curi et al., 2012). The dataset includes 160 institutions for which complete five-year data on inputs and outputs exist. The efficiency estimates are computed for a reduced sample of 97 universities that employ strictly positive quantities of all the inputs and outputs considered, to allow comparability. The 97 universities included in the sample do not have a significantly different geographical distribution than the 63 that have been excluded. However, they differ in respect to several institutional characteristics. Those that have been excluded are on average significantly younger, have younger KTOs and are less likely to have a university hospital. They have a significantly lower average share of academic staff in the natural sciences, medicine, engineering and technical subjects, and the social sciences, and a higher average share of academic staff in the arts and humanities. In fact, many institutions that have a high share of staff in the arts and humanities (which include most higher-education colleges) do not generate IP disclosures (this was the case for 52 out of these 63 institutions; of the remaining 11, 9 did not report any days spent on public engagement, 1 did not report any consultancy income and 1 did not report any income from research contracts). Table 2 describes the input and output variables used to compute the universities’ efficiency, and reports their main descriptive statistics. The correlation matrix between inputs and outputs is reported in Appendix A. Table 2. Inputs and outputs used in the computation of DEA efficiency scores, and their main descriptive statistics Variable name  Description  N  Mean  Standard Deviation  Minimum value  Maximum value  Inputs:              Research and teaching funding  Research and teaching grants from funding councils and tuition fees (£000)  97  136,506.1  77,010.2  16,709.9  400,291.5  Knowledge transfer staff  Number (headcount) of staff specifically employed in a knowledge transfer capacity  97  53.5  44.0  3.0  207.6  Academic staff  Number (full time equivalent, FTE) of academic staff  97  1,715.3  1,428.0  158.6  6,976.0  Outputs:              Income  Income from research and consultancy contracts and CPDs (£000)  97  16,044.4  18,687.2  95.6  106,833.4  Intellectual property disclosures  Number of intellectual property disclosures  97  39.7  56.7  0.2  315.6  Public engagement  Number of academic days employed in public engagement tasks  97  805.0  1,600.5  2.0  11,126.0  Variable name  Description  N  Mean  Standard Deviation  Minimum value  Maximum value  Inputs:              Research and teaching funding  Research and teaching grants from funding councils and tuition fees (£000)  97  136,506.1  77,010.2  16,709.9  400,291.5  Knowledge transfer staff  Number (headcount) of staff specifically employed in a knowledge transfer capacity  97  53.5  44.0  3.0  207.6  Academic staff  Number (full time equivalent, FTE) of academic staff  97  1,715.3  1,428.0  158.6  6,976.0  Outputs:              Income  Income from research and consultancy contracts and CPDs (£000)  97  16,044.4  18,687.2  95.6  106,833.4  Intellectual property disclosures  Number of intellectual property disclosures  97  39.7  56.7  0.2  315.6  Public engagement  Number of academic days employed in public engagement tasks  97  805.0  1,600.5  2.0  11,126.0  View Large Table 2. Inputs and outputs used in the computation of DEA efficiency scores, and their main descriptive statistics Variable name  Description  N  Mean  Standard Deviation  Minimum value  Maximum value  Inputs:              Research and teaching funding  Research and teaching grants from funding councils and tuition fees (£000)  97  136,506.1  77,010.2  16,709.9  400,291.5  Knowledge transfer staff  Number (headcount) of staff specifically employed in a knowledge transfer capacity  97  53.5  44.0  3.0  207.6  Academic staff  Number (full time equivalent, FTE) of academic staff  97  1,715.3  1,428.0  158.6  6,976.0  Outputs:              Income  Income from research and consultancy contracts and CPDs (£000)  97  16,044.4  18,687.2  95.6  106,833.4  Intellectual property disclosures  Number of intellectual property disclosures  97  39.7  56.7  0.2  315.6  Public engagement  Number of academic days employed in public engagement tasks  97  805.0  1,600.5  2.0  11,126.0  Variable name  Description  N  Mean  Standard Deviation  Minimum value  Maximum value  Inputs:              Research and teaching funding  Research and teaching grants from funding councils and tuition fees (£000)  97  136,506.1  77,010.2  16,709.9  400,291.5  Knowledge transfer staff  Number (headcount) of staff specifically employed in a knowledge transfer capacity  97  53.5  44.0  3.0  207.6  Academic staff  Number (full time equivalent, FTE) of academic staff  97  1,715.3  1,428.0  158.6  6,976.0  Outputs:              Income  Income from research and consultancy contracts and CPDs (£000)  97  16,044.4  18,687.2  95.6  106,833.4  Intellectual property disclosures  Number of intellectual property disclosures  97  39.7  56.7  0.2  315.6  Public engagement  Number of academic days employed in public engagement tasks  97  805.0  1,600.5  2.0  11,126.0  View Large The empirical strategy involves, first, checking whether the adoption of a broad approach to measuring the efficiency of the knowledge transfer transformation process produces sensibly different results compared with the adoption of a narrow approach that only considers the creation of new IP and contracts with industry. Several knowledge transfer inputs are used, at the same time, for research and teaching—for example, the time of academic staff and the funds from government grants and tuition fees. Therefore, it is not possible to precisely identify how much of these inputs goes in the production of knowledge transfer outputs; as Thursby and Kemp (2002) note, we would not be able to say whether a university that has a higher number of patent applications than another, vis-à-vis a given amount of inputs, is more efficient than the latter or is simply allocating more of its inputs to activities that are more likely to produce patentable outputs. However, using a broad definition of knowledge transfer should mitigate this problem: since the range of knowledge transfer outputs considered draws upon a wide variety of university resources, universities could allocate their inputs differently among teaching and research, or among the social and natural sciences, and still enjoy similar opportunities for knowledge transfer. It should then be possible to observe an increase in the relative efficiency of universities that allocate more inputs to activities that do not fit well with the model of technology transfer based on the commercialisation of IP and the performance of contract research. To check whether this is the case, the DEA efficiency scores of universities are computed under two different model specifications: a narrow model which includes only two outputs capturing universities’ efficiency in technology transfer (number of IP disclosures and income from research contracts) and a broad model which includes five outputs (number of IP disclosures, income from research contracts, consultancies and CPDs, and number of academic staff days employed in public engagement). The efficiency scores, in both models, are computed using the output-oriented6 DEA linear program with non-increasing returns to scale7 implemented in the rDEA package for R Cran. Having ranked universities according to their efficiency scores in each model, it is then possible to examine the characteristics of the universities that improve their relative rank positions8 in the broad model of knowledge transfer, compared to the narrow one. The second step involves exploring the institutional and environmental determinants of efficiency, focusing on the broad model of knowledge transfer. Two alternative dependent variables are used. The first dependent variable is an indicator variable called INEFFICIENT that takes on the value one if the university is inefficient and zero otherwise (as in Thursby and Kemp, 2002).9 The regressors used in this logit model (Model 1) capture a varied range of institutional and environmental factors: research intensity is measured as the ratio of public research funding grants to academic staff, Public research funds per academic staff); teaching intensity is measured as the number of Students per academic staff (including undergraduate, postgraduate and research students); the age of the institution’s KTO (KTO age) is used as a proxy for the experience of the institution with knowledge transfer; the subject diversification of academic staff (measured as the inverse of the Herfindahl index on the shares of academic staff in each of four main subject areas—natural sciences and medicine, engineering and technology, social sciences and business, arts and humanities) quantifies how broad is the range of subjects that the university offers, and the relative weight of each subject: the more ‘equal’ the subject shares, the higher the index. A low value implies that the university is more specialised, while a higher value implies that the university is more diversified (Subject diversification); regional gross value added per capita in 2006 (variable Regional GVA) and level of business R&D spending in the region in 2006 (variable Regional BERD) capture the opportunities offered by the university’s economic environment; the number of other universities in the same city is used to proxy the intensity of competition among them (variable City Universities); several controls are also included: the age of the institution (Age), its size in terms of total staff (Size), whether a university hospital is present (University hospital), the shares of academic staff in the natural sciences and medicine (Science and medicine), technical and engineering subjects (Technical), the social sciences and business (Social) and the arts and humanities (Arts and Humanities) and the country (England, Scotland, Wales or Northern Ireland). As the universities in the sample are all public, ownership is not controlled for. The regressors refer to 2006/07 because the aim is to test the effect of institutional and environmental variables, which affect both inputs and outputs, on efficiency measured with respect to the subsequent five years. The second dependent variable is the efficiency score (in the Farrell-Debreu distance formulation; higher score indicates greater distance from the efficient frontier, therefore greater inefficiency) obtained by the institution (Model 2). In order to perform inference on the determinants of the efficiency of different institutions, numerous studies have used two-stage estimations, whereby efficiency scores are estimated in the first stage and then regressed using OLS or Tobit models on several institutional and environmental variables in the second stage (an overview is presented in Simar and Wilson [2007]). However, it has been pointed out that the DEA efficiency scores are serially correlated, which invalidates standard approaches regarding statistical inference (Simar and Wilson, 2000). The bias correction procedure suggested by Simar and Wilson (2007)10 is therefore used to perform inference on the efficiency scores. 4. Empirical results 4.1. Comparison of universities’ rank positions under the narrow and broad models of knowledge transfer Universities have been ranked according to their efficiency scores computed under the narrow and the broad models of knowledge transfer. Thirty-one universities improve their rank position in the broad model compared to the narrow model, while 66 universities either maintain or worsen their rank position. In the broad model, more universities achieve efficiency (ten instead of five), mean efficiency is higher (0.49 instead of 0.33) and the distribution of efficiency scores is less skewed (skewness is 0.67 instead of 1.07). Universities that improve their rank position in the broad model have a lower share of staff in medicine and natural sciences and a higher share of staff in the arts and humanities, and are less likely to have a university hospital, than those that do not improve their position (Table 3). This supports the conjecture that universities that allocate more inputs to activities that do not fit well with the established technology transfer model based on patents and research contracts (which is particularly suited to medicine, biotechnology and a few other technical fields, but not relevant to all subject areas) do better when a broader set of outputs is considered. Also, universities that improve their rank position in the broad model have fewer public research funds per academic and more students per academic, consistently with literature suggesting that less research-intensive universities have a broader knowledge transfer portfolio (Hewitt-Dundas, 2012; Wright et al., 2008). Considering the differences across mean amounts of inputs and outputs in the two groups, the universities that improve their rank position deliver significantly more days of public events and receive significantly more income from CPDs (but significantly less income from research contracts) than those that do not improve their position. This confirms that their improved efficiency is due to their ability to engage in a varied portfolio of activities that is not taken into account when considering only a narrow range of outputs. Table 3. Characteristics of universities that have improved or not improved their rank position when moving from a narrow to a broad model of knowledge transfer     Not improved (66 universities)  Improved (31 universities)  Statistic  p-value  Mean share of academic staff  Medicine and natural sciences  28.3%  20.3%  2.621  0.011  Technical subjects / engineering  10.9%  9.7%  0.704  0.484  Social sciences and business  11.9%  11.5%  0.353  0.725  Arts and humanities  11.7%  16.9%  -2.232  0.031    Age of institution  143.7  148.0  -0.134  0.894    Age of KTO  15.34  13.27  1.035  0.305    University hospital  25.8%  6.5%  2.743  0.007    Public research funds per academic  7.605  4.594  2.766  0.007    Students per academic  9.706  12.323  -2.525  0.015    Subject specialization  8.226  9.242  1.283  0.205  Mean amounts of inputs and outputs  Research and teaching funding  136,906.4  135,654.0  0.068  0.946  Knowledge transfer staff  53.1  54.5  -0.150  0.881  Academic staff  1,790.4  1,555.4  0.744  0.460  Intellectual property disclosures  45.7  26.9  1.493  0.141  Income from research contracts  10,204.7  5,101.1  1.901  0.060  Income from consultancy contracts  2,615.9  3,858.8  -1.371  0.177  Income from CPDs  3,779.8  5,900.9  -1.820  0.076  Public engagement  444.5  1,572.5  -2.405  0.022      Not improved (66 universities)  Improved (31 universities)  Statistic  p-value  Mean share of academic staff  Medicine and natural sciences  28.3%  20.3%  2.621  0.011  Technical subjects / engineering  10.9%  9.7%  0.704  0.484  Social sciences and business  11.9%  11.5%  0.353  0.725  Arts and humanities  11.7%  16.9%  -2.232  0.031    Age of institution  143.7  148.0  -0.134  0.894    Age of KTO  15.34  13.27  1.035  0.305    University hospital  25.8%  6.5%  2.743  0.007    Public research funds per academic  7.605  4.594  2.766  0.007    Students per academic  9.706  12.323  -2.525  0.015    Subject specialization  8.226  9.242  1.283  0.205  Mean amounts of inputs and outputs  Research and teaching funding  136,906.4  135,654.0  0.068  0.946  Knowledge transfer staff  53.1  54.5  -0.150  0.881  Academic staff  1,790.4  1,555.4  0.744  0.460  Intellectual property disclosures  45.7  26.9  1.493  0.141  Income from research contracts  10,204.7  5,101.1  1.901  0.060  Income from consultancy contracts  2,615.9  3,858.8  -1.371  0.177  Income from CPDs  3,779.8  5,900.9  -1.820  0.076  Public engagement  444.5  1,572.5  -2.405  0.022  View Large Table 3. Characteristics of universities that have improved or not improved their rank position when moving from a narrow to a broad model of knowledge transfer     Not improved (66 universities)  Improved (31 universities)  Statistic  p-value  Mean share of academic staff  Medicine and natural sciences  28.3%  20.3%  2.621  0.011  Technical subjects / engineering  10.9%  9.7%  0.704  0.484  Social sciences and business  11.9%  11.5%  0.353  0.725  Arts and humanities  11.7%  16.9%  -2.232  0.031    Age of institution  143.7  148.0  -0.134  0.894    Age of KTO  15.34  13.27  1.035  0.305    University hospital  25.8%  6.5%  2.743  0.007    Public research funds per academic  7.605  4.594  2.766  0.007    Students per academic  9.706  12.323  -2.525  0.015    Subject specialization  8.226  9.242  1.283  0.205  Mean amounts of inputs and outputs  Research and teaching funding  136,906.4  135,654.0  0.068  0.946  Knowledge transfer staff  53.1  54.5  -0.150  0.881  Academic staff  1,790.4  1,555.4  0.744  0.460  Intellectual property disclosures  45.7  26.9  1.493  0.141  Income from research contracts  10,204.7  5,101.1  1.901  0.060  Income from consultancy contracts  2,615.9  3,858.8  -1.371  0.177  Income from CPDs  3,779.8  5,900.9  -1.820  0.076  Public engagement  444.5  1,572.5  -2.405  0.022      Not improved (66 universities)  Improved (31 universities)  Statistic  p-value  Mean share of academic staff  Medicine and natural sciences  28.3%  20.3%  2.621  0.011  Technical subjects / engineering  10.9%  9.7%  0.704  0.484  Social sciences and business  11.9%  11.5%  0.353  0.725  Arts and humanities  11.7%  16.9%  -2.232  0.031    Age of institution  143.7  148.0  -0.134  0.894    Age of KTO  15.34  13.27  1.035  0.305    University hospital  25.8%  6.5%  2.743  0.007    Public research funds per academic  7.605  4.594  2.766  0.007    Students per academic  9.706  12.323  -2.525  0.015    Subject specialization  8.226  9.242  1.283  0.205  Mean amounts of inputs and outputs  Research and teaching funding  136,906.4  135,654.0  0.068  0.946  Knowledge transfer staff  53.1  54.5  -0.150  0.881  Academic staff  1,790.4  1,555.4  0.744  0.460  Intellectual property disclosures  45.7  26.9  1.493  0.141  Income from research contracts  10,204.7  5,101.1  1.901  0.060  Income from consultancy contracts  2,615.9  3,858.8  -1.371  0.177  Income from CPDs  3,779.8  5,900.9  -1.820  0.076  Public engagement  444.5  1,572.5  -2.405  0.022  View Large 4.2. Inputs, outputs and their relationship with efficiency The remaining part of the analysis explores the institutional and environmental determinants of efficiency considering the broad model of knowledge transfer. There are ten efficient universities and 87 inefficient ones. Consistently with findings by Chapple et al. (2005) and Curi et al. (2012), the analysis of the efficiency scores shows low levels of absolute efficiency (the mean efficiency score is 0.49) and marked variability between universities (standard deviation is 0.22). To explore the relationship between inputs, outputs and efficiency, inputs and outputs are regressed on the indicator variable INEFFICIENT. The logit regression in Table 4 (column (1)) shows that inputs have, as expected, a significantly positive relationship with inefficiency (except for the number of academic staff, which is not significant). Outputs have, as expected, a negative relationship with inefficiency (although income is not significant). As a robustness check, the same regression is run only using the significant variables in column (1) (column (2)), which confirms their signs and significance. Therefore, inefficient universities produce significantly less outputs (especially IP disclosures and public engagement) than efficient ones, while using relatively more generic funds for research and teaching and more dedicated knowledge transfer staff. Table 4. Regression of inputs and outputs on the INEFFICIENT variable VARIABLES  (1)  (2)  Research and teaching funding  0.000**  0.000**    (0.000)  (0.000)  Knowledge transfer staff  0.042*  0.035*    (0.026)  (0.021)  Academic staff  -0.000      (0.002)    Income  -0.000      (0.000)    Intellectual property disclosures  -0.034*  -0.048***    (0.019)  (0.015)  Public engagement  -0.003*  -0.003***    (0.001)  (0.001)  Constant  0.663  1.121    (1.074)  (0.967)  Observations  97  97  Wald Chi-Square  37.23  35.43  Prob>Chi-Square  0.000  0.000  Pseudo R2  0.657  0.631  VARIABLES  (1)  (2)  Research and teaching funding  0.000**  0.000**    (0.000)  (0.000)  Knowledge transfer staff  0.042*  0.035*    (0.026)  (0.021)  Academic staff  -0.000      (0.002)    Income  -0.000      (0.000)    Intellectual property disclosures  -0.034*  -0.048***    (0.019)  (0.015)  Public engagement  -0.003*  -0.003***    (0.001)  (0.001)  Constant  0.663  1.121    (1.074)  (0.967)  Observations  97  97  Wald Chi-Square  37.23  35.43  Prob>Chi-Square  0.000  0.000  Pseudo R2  0.657  0.631  Standard errors in parentheses + p<0.15, *** p<0.01, ** p<0.05, * p<0.1 View Large Table 4. Regression of inputs and outputs on the INEFFICIENT variable VARIABLES  (1)  (2)  Research and teaching funding  0.000**  0.000**    (0.000)  (0.000)  Knowledge transfer staff  0.042*  0.035*    (0.026)  (0.021)  Academic staff  -0.000      (0.002)    Income  -0.000      (0.000)    Intellectual property disclosures  -0.034*  -0.048***    (0.019)  (0.015)  Public engagement  -0.003*  -0.003***    (0.001)  (0.001)  Constant  0.663  1.121    (1.074)  (0.967)  Observations  97  97  Wald Chi-Square  37.23  35.43  Prob>Chi-Square  0.000  0.000  Pseudo R2  0.657  0.631  VARIABLES  (1)  (2)  Research and teaching funding  0.000**  0.000**    (0.000)  (0.000)  Knowledge transfer staff  0.042*  0.035*    (0.026)  (0.021)  Academic staff  -0.000      (0.002)    Income  -0.000      (0.000)    Intellectual property disclosures  -0.034*  -0.048***    (0.019)  (0.015)  Public engagement  -0.003*  -0.003***    (0.001)  (0.001)  Constant  0.663  1.121    (1.074)  (0.967)  Observations  97  97  Wald Chi-Square  37.23  35.43  Prob>Chi-Square  0.000  0.000  Pseudo R2  0.657  0.631  Standard errors in parentheses + p<0.15, *** p<0.01, ** p<0.05, * p<0.1 View Large 4.3. The institutional and environmental drivers of efficiency The institutional and environmental variables, as well as several control variables, are then regressed on, separately, the INEFFICIENT variable (Model 1) and the efficiency scores (Model 2). Table 5 reports the main descriptive statistics on the variables used in the regressions. The correlations between the regressors are reported in Appendix B. Table 5. Descriptive statistics on the variables used in the regressions Variable name  Description  N  Mean  Standard Deviation  Minimum value  Maximum value  Public research funds per academic  Ratio of public funds for research to academic staff  97  6.64  5.39  0.34  18.99  Students per academic  Number of undergraduate and postgraduate students per academic  97  10.54  4.66  1.51  20.48  KTO age  Age of the KTO in years  97  14.68  8.84  0.00  41.95  Subject diversification  Subject diversity of academic staff  97  8.55  3.64  1.56  21.78  Regional GVA  Regional gross value added per capita  97  20919.87  6580.56  14842.00  34200.00  Regional BERD  Regional business expenditure on R&D  97  1306.88  1128.21  157.00  3948.00  City universities  Number of universities in the same city  97  9.88  15.47  1.00  40.00  Age  Age of the institution in years  97  145.05  137.53  19.00  915.00  Size  Total number of staff  97  2581.80  1867.65  302.00  9968.00  University hospital  Presence of a university hospital  97  0.20  0.40  0.00  1.00  Science and medicine  Share of academic staff in the natural sciences and medicine  97  0.26  0.16  0.00  0.80  Social  Share of academic staff in the social sciences and business  97  0.11  0.07  0.00  0.47  Technical  Share of academic staff in technical and engineering subjects  97  0.12  0.06  0.00  0.31  Arts and humanities  Share of academic staff in the arts and humanities  97  0.13  0.10  0.00  0.58  England    97  0.80  0.40  0.00  1.00  Scotland    97  0.11  0.32  0.00  1.00  Northern Ireland    97  0.02  0.14  0.00  1.00  Variable name  Description  N  Mean  Standard Deviation  Minimum value  Maximum value  Public research funds per academic  Ratio of public funds for research to academic staff  97  6.64  5.39  0.34  18.99  Students per academic  Number of undergraduate and postgraduate students per academic  97  10.54  4.66  1.51  20.48  KTO age  Age of the KTO in years  97  14.68  8.84  0.00  41.95  Subject diversification  Subject diversity of academic staff  97  8.55  3.64  1.56  21.78  Regional GVA  Regional gross value added per capita  97  20919.87  6580.56  14842.00  34200.00  Regional BERD  Regional business expenditure on R&D  97  1306.88  1128.21  157.00  3948.00  City universities  Number of universities in the same city  97  9.88  15.47  1.00  40.00  Age  Age of the institution in years  97  145.05  137.53  19.00  915.00  Size  Total number of staff  97  2581.80  1867.65  302.00  9968.00  University hospital  Presence of a university hospital  97  0.20  0.40  0.00  1.00  Science and medicine  Share of academic staff in the natural sciences and medicine  97  0.26  0.16  0.00  0.80  Social  Share of academic staff in the social sciences and business  97  0.11  0.07  0.00  0.47  Technical  Share of academic staff in technical and engineering subjects  97  0.12  0.06  0.00  0.31  Arts and humanities  Share of academic staff in the arts and humanities  97  0.13  0.10  0.00  0.58  England    97  0.80  0.40  0.00  1.00  Scotland    97  0.11  0.32  0.00  1.00  Northern Ireland    97  0.02  0.14  0.00  1.00  View Large Table 5. Descriptive statistics on the variables used in the regressions Variable name  Description  N  Mean  Standard Deviation  Minimum value  Maximum value  Public research funds per academic  Ratio of public funds for research to academic staff  97  6.64  5.39  0.34  18.99  Students per academic  Number of undergraduate and postgraduate students per academic  97  10.54  4.66  1.51  20.48  KTO age  Age of the KTO in years  97  14.68  8.84  0.00  41.95  Subject diversification  Subject diversity of academic staff  97  8.55  3.64  1.56  21.78  Regional GVA  Regional gross value added per capita  97  20919.87  6580.56  14842.00  34200.00  Regional BERD  Regional business expenditure on R&D  97  1306.88  1128.21  157.00  3948.00  City universities  Number of universities in the same city  97  9.88  15.47  1.00  40.00  Age  Age of the institution in years  97  145.05  137.53  19.00  915.00  Size  Total number of staff  97  2581.80  1867.65  302.00  9968.00  University hospital  Presence of a university hospital  97  0.20  0.40  0.00  1.00  Science and medicine  Share of academic staff in the natural sciences and medicine  97  0.26  0.16  0.00  0.80  Social  Share of academic staff in the social sciences and business  97  0.11  0.07  0.00  0.47  Technical  Share of academic staff in technical and engineering subjects  97  0.12  0.06  0.00  0.31  Arts and humanities  Share of academic staff in the arts and humanities  97  0.13  0.10  0.00  0.58  England    97  0.80  0.40  0.00  1.00  Scotland    97  0.11  0.32  0.00  1.00  Northern Ireland    97  0.02  0.14  0.00  1.00  Variable name  Description  N  Mean  Standard Deviation  Minimum value  Maximum value  Public research funds per academic  Ratio of public funds for research to academic staff  97  6.64  5.39  0.34  18.99  Students per academic  Number of undergraduate and postgraduate students per academic  97  10.54  4.66  1.51  20.48  KTO age  Age of the KTO in years  97  14.68  8.84  0.00  41.95  Subject diversification  Subject diversity of academic staff  97  8.55  3.64  1.56  21.78  Regional GVA  Regional gross value added per capita  97  20919.87  6580.56  14842.00  34200.00  Regional BERD  Regional business expenditure on R&D  97  1306.88  1128.21  157.00  3948.00  City universities  Number of universities in the same city  97  9.88  15.47  1.00  40.00  Age  Age of the institution in years  97  145.05  137.53  19.00  915.00  Size  Total number of staff  97  2581.80  1867.65  302.00  9968.00  University hospital  Presence of a university hospital  97  0.20  0.40  0.00  1.00  Science and medicine  Share of academic staff in the natural sciences and medicine  97  0.26  0.16  0.00  0.80  Social  Share of academic staff in the social sciences and business  97  0.11  0.07  0.00  0.47  Technical  Share of academic staff in technical and engineering subjects  97  0.12  0.06  0.00  0.31  Arts and humanities  Share of academic staff in the arts and humanities  97  0.13  0.10  0.00  0.58  England    97  0.80  0.40  0.00  1.00  Scotland    97  0.11  0.32  0.00  1.00  Northern Ireland    97  0.02  0.14  0.00  1.00  View Large The results of the regressions on the INEFFICIENT variable are shown in columns (1a) and (1b) of Table 6. The results of the regressions on the efficiency scores are shown in columns (2a) and (2b) of Table 6. Table 6. Regressions on the INEFFICIENT indicator variable (columns 1a, 1b) and on the bias-corrected efficiency scores (columns 2a, 2b) using institutional and environmental factors Variables  (1a)  (1b)  (2a)  (2b)  Public research funds per academic  0.060  0.038  -0.092  -0.119    (0.219)  (0.261)  (0.159)  (0.133)  Students per academic  -0.159  -0.629  0.138  0.033    (0.347)  (0.626)  (0.230)  (0.184)  KTO age  0.132+  0.131  0.076*  0.059    (0.091)  (0.131)  (0.052)  (0.049)  Subject diversification  0.648+  4.794*  0.315*  1.796***    (0.416)  (2.497)  (0.197)  (0.743)  Subject diversification    -0.171**    -0.063**      (0.085)    (0.031)  Regional GVA  0.000  0.000  0.000  0.000    (0.000)  (0.000)  (0.000)  (0.000)  Regional BERD  0.000  -0.001  0.001***  0.001***    (0.001)  (0.001)  (0.000)  (0.000)  City universities  0.073  0.105  0.089+  0.099*    (0.116)  (0.147)  (0.076)  (0.066)  Age  -0.036**  -0.064**  -0.009*  -0.008**    (0.016)  (0.030)  (0.006)  (0.005)  Size  0.000  -0.001+  -0.002***  -0.001***    (0.000)  (0.001)  (0.000)  (0.000)  University hospital  -0.497  0.431  1.375+  1.388+    (1.450)  (3.018)  (1.233)  (1.095)  Science and medicine  8.153*  16.377**  5.597+  10.292**    (4.573)  (7.569)  (4.399)  (5.234)  Technical  -9.517  -4.161  -0.374  2.018    (7.846)  (8.655)  (5.932)  (6.070)  Social science  2.667  -19.420  21.412***  15.345**    (9.253)  (15.315)  (8.054)  (6.712)  England  4.137**  7.028*  -2.255*  -1.756+    (2.081)  (4.160)  (1.639)  (1.525)  Scotland  23.889  36.507  2.843*  2.645**    (28.540)  (42.723)  (1.802)  (1.564)  Northern Ireland  11.564  14.103  0.576  0.182    (105.421)  (158.325)  (4.743)  (7.573)  Intercept  -0.379  -13.548  -0.553  -7.920*    (6.630)  (9.507)  (4.707)  (5.796)  Observations  97  97  97  97  LR  31.68  41.34  63.85  71.02  Prob > chi2  0.011  0.001  0.000  0.000  Pseudo R2  0.574  0.715  0.649  0.670  Variables  (1a)  (1b)  (2a)  (2b)  Public research funds per academic  0.060  0.038  -0.092  -0.119    (0.219)  (0.261)  (0.159)  (0.133)  Students per academic  -0.159  -0.629  0.138  0.033    (0.347)  (0.626)  (0.230)  (0.184)  KTO age  0.132+  0.131  0.076*  0.059    (0.091)  (0.131)  (0.052)  (0.049)  Subject diversification  0.648+  4.794*  0.315*  1.796***    (0.416)  (2.497)  (0.197)  (0.743)  Subject diversification    -0.171**    -0.063**      (0.085)    (0.031)  Regional GVA  0.000  0.000  0.000  0.000    (0.000)  (0.000)  (0.000)  (0.000)  Regional BERD  0.000  -0.001  0.001***  0.001***    (0.001)  (0.001)  (0.000)  (0.000)  City universities  0.073  0.105  0.089+  0.099*    (0.116)  (0.147)  (0.076)  (0.066)  Age  -0.036**  -0.064**  -0.009*  -0.008**    (0.016)  (0.030)  (0.006)  (0.005)  Size  0.000  -0.001+  -0.002***  -0.001***    (0.000)  (0.001)  (0.000)  (0.000)  University hospital  -0.497  0.431  1.375+  1.388+    (1.450)  (3.018)  (1.233)  (1.095)  Science and medicine  8.153*  16.377**  5.597+  10.292**    (4.573)  (7.569)  (4.399)  (5.234)  Technical  -9.517  -4.161  -0.374  2.018    (7.846)  (8.655)  (5.932)  (6.070)  Social science  2.667  -19.420  21.412***  15.345**    (9.253)  (15.315)  (8.054)  (6.712)  England  4.137**  7.028*  -2.255*  -1.756+    (2.081)  (4.160)  (1.639)  (1.525)  Scotland  23.889  36.507  2.843*  2.645**    (28.540)  (42.723)  (1.802)  (1.564)  Northern Ireland  11.564  14.103  0.576  0.182    (105.421)  (158.325)  (4.743)  (7.573)  Intercept  -0.379  -13.548  -0.553  -7.920*    (6.630)  (9.507)  (4.707)  (5.796)  Observations  97  97  97  97  LR  31.68  41.34  63.85  71.02  Prob > chi2  0.011  0.001  0.000  0.000  Pseudo R2  0.574  0.715  0.649  0.670  Standard errors in parentheses *** p<0.01, ** p<0.05, * p<0.1, + p<0.15 View Large Table 6. Regressions on the INEFFICIENT indicator variable (columns 1a, 1b) and on the bias-corrected efficiency scores (columns 2a, 2b) using institutional and environmental factors Variables  (1a)  (1b)  (2a)  (2b)  Public research funds per academic  0.060  0.038  -0.092  -0.119    (0.219)  (0.261)  (0.159)  (0.133)  Students per academic  -0.159  -0.629  0.138  0.033    (0.347)  (0.626)  (0.230)  (0.184)  KTO age  0.132+  0.131  0.076*  0.059    (0.091)  (0.131)  (0.052)  (0.049)  Subject diversification  0.648+  4.794*  0.315*  1.796***    (0.416)  (2.497)  (0.197)  (0.743)  Subject diversification    -0.171**    -0.063**      (0.085)    (0.031)  Regional GVA  0.000  0.000  0.000  0.000    (0.000)  (0.000)  (0.000)  (0.000)  Regional BERD  0.000  -0.001  0.001***  0.001***    (0.001)  (0.001)  (0.000)  (0.000)  City universities  0.073  0.105  0.089+  0.099*    (0.116)  (0.147)  (0.076)  (0.066)  Age  -0.036**  -0.064**  -0.009*  -0.008**    (0.016)  (0.030)  (0.006)  (0.005)  Size  0.000  -0.001+  -0.002***  -0.001***    (0.000)  (0.001)  (0.000)  (0.000)  University hospital  -0.497  0.431  1.375+  1.388+    (1.450)  (3.018)  (1.233)  (1.095)  Science and medicine  8.153*  16.377**  5.597+  10.292**    (4.573)  (7.569)  (4.399)  (5.234)  Technical  -9.517  -4.161  -0.374  2.018    (7.846)  (8.655)  (5.932)  (6.070)  Social science  2.667  -19.420  21.412***  15.345**    (9.253)  (15.315)  (8.054)  (6.712)  England  4.137**  7.028*  -2.255*  -1.756+    (2.081)  (4.160)  (1.639)  (1.525)  Scotland  23.889  36.507  2.843*  2.645**    (28.540)  (42.723)  (1.802)  (1.564)  Northern Ireland  11.564  14.103  0.576  0.182    (105.421)  (158.325)  (4.743)  (7.573)  Intercept  -0.379  -13.548  -0.553  -7.920*    (6.630)  (9.507)  (4.707)  (5.796)  Observations  97  97  97  97  LR  31.68  41.34  63.85  71.02  Prob > chi2  0.011  0.001  0.000  0.000  Pseudo R2  0.574  0.715  0.649  0.670  Variables  (1a)  (1b)  (2a)  (2b)  Public research funds per academic  0.060  0.038  -0.092  -0.119    (0.219)  (0.261)  (0.159)  (0.133)  Students per academic  -0.159  -0.629  0.138  0.033    (0.347)  (0.626)  (0.230)  (0.184)  KTO age  0.132+  0.131  0.076*  0.059    (0.091)  (0.131)  (0.052)  (0.049)  Subject diversification  0.648+  4.794*  0.315*  1.796***    (0.416)  (2.497)  (0.197)  (0.743)  Subject diversification    -0.171**    -0.063**      (0.085)    (0.031)  Regional GVA  0.000  0.000  0.000  0.000    (0.000)  (0.000)  (0.000)  (0.000)  Regional BERD  0.000  -0.001  0.001***  0.001***    (0.001)  (0.001)  (0.000)  (0.000)  City universities  0.073  0.105  0.089+  0.099*    (0.116)  (0.147)  (0.076)  (0.066)  Age  -0.036**  -0.064**  -0.009*  -0.008**    (0.016)  (0.030)  (0.006)  (0.005)  Size  0.000  -0.001+  -0.002***  -0.001***    (0.000)  (0.001)  (0.000)  (0.000)  University hospital  -0.497  0.431  1.375+  1.388+    (1.450)  (3.018)  (1.233)  (1.095)  Science and medicine  8.153*  16.377**  5.597+  10.292**    (4.573)  (7.569)  (4.399)  (5.234)  Technical  -9.517  -4.161  -0.374  2.018    (7.846)  (8.655)  (5.932)  (6.070)  Social science  2.667  -19.420  21.412***  15.345**    (9.253)  (15.315)  (8.054)  (6.712)  England  4.137**  7.028*  -2.255*  -1.756+    (2.081)  (4.160)  (1.639)  (1.525)  Scotland  23.889  36.507  2.843*  2.645**    (28.540)  (42.723)  (1.802)  (1.564)  Northern Ireland  11.564  14.103  0.576  0.182    (105.421)  (158.325)  (4.743)  (7.573)  Intercept  -0.379  -13.548  -0.553  -7.920*    (6.630)  (9.507)  (4.707)  (5.796)  Observations  97  97  97  97  LR  31.68  41.34  63.85  71.02  Prob > chi2  0.011  0.001  0.000  0.000  Pseudo R2  0.574  0.715  0.649  0.670  Standard errors in parentheses *** p<0.01, ** p<0.05, * p<0.1, + p<0.15 View Large The variables’ signs and significance are broadly similar across both models. The regressions under Model 1 report generally weaker correlations, due to the relatively lower variability in the dependent variable and particularly the fact that only a few universities are efficient. The variables Public research funds per academic staff and Students per academic staff are not significant. This supports the previous conjecture that, when the broader model of knowledge transfer is considered, opportunities for knowledge transfer arise from both research and teaching activities, and research intensity is not a strong requirement to achieve efficiency.11 Universities that have lower research intensity can achieve efficient performance in delivering a broad set of knowledge transfer outputs, while this would not be the case if only a narrow set of outputs was considered. Having an older KTO increases inefficiency (contrary to the findings by Curi et al. [2012] on the French university system). Empirical studies have suggested that what improves the performance of KTOs are the professionalisation and qualifications of its staff as well as the adoption of supportive policies and procedures (Siegel et al., 2003; Debackere and Veugelers, 2005; Caldera and Debande, 2010), rather than the simple establishment of a KTO. Universities whose KTOs are older may even be disadvantaged to the extent that the presence of an older structure may discourage experimentation with new, more appropriate policies and practices. Although universities with a diversified subject profile were expected to enjoy greater opportunities for knowledge transfer, the results instead suggest that a higher degree of subject diversification is associated with greater likelihood to be inefficient. This may be due to more specialised universities enjoying greater prestige in their subject areas and thus being able to command higher income from (and greater opportunities for) knowledge transfer. It may also suggest that the skills and structures needed to support knowledge transfer are subject specific, so that supporting knowledge transfer in a range of subjects simultaneously does not generate positive synergies, while being able to focus resources in one subject area generates greater returns. To shed further light on this relationship, in models (1b) and (2b) we include the square of Subject diversification. The sign and significance of the coefficient suggest that the relationship between subject diversification and inefficiency is inverse-U-shaped, with both very specialised and very diversified universities being less inefficient. The level of BERD and number of universities in the same city significantly increase inefficiency. The positive sign of the latter variable is probably due to increased competition, both for good-quality inputs and for knowledge transfer opportunities, among institutions that are close to each other. The effect of the level of BERD contrasts with findings from other studies that have found that higher BERD increases efficiency. This suggests that when knowledge transfer is not considered only in terms of patent commercialisation, the external stakeholders involved go beyond R&D-intensive firms, and regions with a different composition of economic activity may also offer opportunities for knowledge transfer. In terms of control variables, older and larger universities are less likely to be inefficient, in line with other studies. These universities may enjoy greater opportunities for knowledge transfer thanks to their prestige, visibility and reputation. The presence of a university hospital has a weakly positive effect on the university’s inefficiency, consistent with findings from other studies that use European data (Chapple et al., 2005; Anderson et al., 2007; Curi et al., 2012). A possible interpretation is that knowledge transfer outputs arising from medical research performed in university hospitals are assigned to the hospital directly, and go unreported by the university. 5. Conclusions This study has several implications for university management and for policymakers in the science, technology, innovation and education policy domains. In terms of management implications, the findings suggest that universities with different production models can be equally efficient in generating knowledge transfer outputs, and that research intensity is not a prerequisite for efficiency. Universities can achieve efficiency by adopting a model of knowledge transfer engagement that is consistent with their resources (their subject specialisation, their research and teaching intensity, their knowledge transfer capabilities and infrastructures) without needing to replicate the knowledge transfer strategies of prominent institutions whose resources may be very different. Managers should therefore invest effort in identifying the knowledge transfer engagement strategies that best fit the institution’s resources, and aim to increase their performance in those activities. This approach may also improve the university’s reputation for excellence in specific knowledge transfer activities, which in turn may increase their ability to generate further knowledge transfer outputs. In fact, institutional reputation appears to increase knowledge transfer opportunities, with more reputable older, larger and very diversified institutions achieving greater efficiency. Another management implication is that, rather than having an established KTO, what matters for efficiency are its practices and policies, and the professionalisation of its staff. KTOs therefore need to invest in staff training and in the development of best practices. Developing specialised, subject-specific skills and structures to support knowledge transfer, rather than generic ones, may also pay off, as suggested by the finding that more specialised universities are more efficient. In terms of policy implications, the findings suggest that performance assessment systems need to adopt a broader definition of knowledge transfer. With a broader approach, more universities achieve efficiency, mean efficiency is higher and the distribution of efficiency scores is less skewed. The universities that improve their efficiency typically have more arts and humanities staff, and less staff in the natural sciences and medicine; they also tend to be less research and more teaching intensive. Therefore, systems of performance measurement that rely on a narrow definition of knowledge transfer tend to underestimate the efficiency of universities that do not follow the standard technology transfer model. In this vein, a criticism moved to the UK’s approach to knowledge transfer funding concerns the use of income as a measure of performance. This approach rewards universities whose knowledge transfer outputs are best measured in monetary terms, but this is unlikely to suit all university institutions, for example those whose prevailing form of knowledge transfer is public engagement. As our analysis shows, measuring performance with different outputs produces different efficiency rankings. As the understanding of the complexity of universities’ knowledge transfer processes grows, the development of systems to assess their performance should also increasingly reflect this complexity, and allow for a more comprehensive assessment of knowledge transfer engagement. This is not easy to achieve. Although adopting a broader model of knowledge transfer increases comparability among universities with different knowledge transfer strategies, methodological and data availability issues impose several constraints: the efficiency analysis could be performed only on the 97 universities that use positive quantities of all the inputs and outputs considered (which led to the exclusion of certain types of institutions, such as most higher-education colleges), and some knowledge transfer outputs were left out for data availability reasons. One way to achieve greater comparability of the assessed units would be to carry out efficiency analyses at departmental level, but few knowledge transfer data are systematically collected from university departments. Another approach might be to group universities according to the model of knowledge transfer they adopt, and then compare the relative performance of universities within each group (considering appropriate input and output mixes), rather than rate all universities against all others. Finally, while performance is often measured by looking at outputs, thinking about performance in terms of efficiency helps us recognise that universities work with very different resource mixes, which affects the nature of their knowledge transfer engagement. Changes in the resources available to universities, for example because of changes in the rules governing the allocation of public funds, will also change their ability to engage in knowledge transfer. Therefore, policymakers need to think systematically about the effect of changes in funding for research and teaching (for example, the replacement of recurrent grants with competitive funding) on the universities’ ability to engage in knowledge transfer. The relationship between nature of funding sources and knowledge transfer strategies, which so far has been largely unexplored, would merit greater attention from both researchers and policymakers. Bibliography Agasisti, T. and Bonomi, F. 2013. Benchmarking universities’ efficiency indicators in the presence of internal heterogeneity, Studies in Higher Education , doi: 10.1080/03075079.2013.801423 Aigner, D. J., Lovell, C. A. K. and Schmidt, P. 1977. Formulation and estimation of stochastic frontier production function models, Journal of Econometrics , vol. 6, 21– 37 Google Scholar CrossRef Search ADS   Anderson, T. R., Daim, T. U. and Lavoie, F. F. 2007. Measuring the efficiency of university technology transfer, Technovation , vol. 27, 306– 18 Google Scholar CrossRef Search ADS   Antonelli, C. 2008. The new economics of the university: a knowledge governance approach, Journal of Technology Transfer , vol. 33, 1– 22 Google Scholar CrossRef Search ADS   Arundel, A. and Geuna, A. 2004. Proximity and the use of public science by innovative European firms, Economics of Innovation and New Technology , vol. 13, 559– 80 Google Scholar CrossRef Search ADS   Association of University Technology Managers. 2014. AUTM Licensing Activity Survey: FY2014, http://www.autm.net/resources-surveys/research-reports-databases/licensing-surveys/fy-2014-licensing-survey/, last accessed October 2016 Battese, G. E. and Coelli, T. J. 1995. A model for technical inefficiency effects in a stochastic production function for panel data, Empirical Economics , vol. 20, 325– 32 Google Scholar CrossRef Search ADS   Bekkers, R. and Bodas Freitas, I. M. 2008. Analysing preferences for knowledge transfer channels between universities and industry: to what degree do sectors also matter?, Research Policy , vol. 37, 1837– 53 Google Scholar CrossRef Search ADS   Belenzon, S. and Schankerman, M. 2009. University knowledge transfer: private ownership, incentives, and local development objectives, Journal of Law and Economics , vol. 52, no. 1, 111– 44 Google Scholar CrossRef Search ADS   Berbegal Mirabent, J., Lafuente, E. and Solé Parellada, F. 2013. The pursuit of knowledge transfer activities: an efficiency analysis of Spanish universities, Journal of Business Research , vol. 66, no. 10, 2051– 9 Berbegal Mirabent, J. and Solé Parellada, F. 2012. What are we measuring when evaluating universities’ efficiency?, Regional and Sectoral Economics Studies  vol. 12, no. 3, 31– 46 Bercovitz, J., Feldman, M., Feller, I. and Burton, R. 2001. Organisational structure as determinants of academic patent and licensing behavior: an exploratory study of Duke, Johns Hopkins, and Pennsylvania State Universities, Journal of Technology Transfer , vol. 26, no. 1–2, 21– 35 Google Scholar CrossRef Search ADS   Bourelos, E., Magnusson, M. and McKelvey, M. 2012. Investigating the complexity facing academic entrepreneurs in science and engineering: the complementarities of research performance, networks and support structures in commercialisation, Cambridge Journal of Economics , vol. 36, 751– 80 Google Scholar CrossRef Search ADS   Bozeman, B. 2000. Technology transfer and public policy: a review of research and theory, Research Policy , vol. 29, 627– 55 Google Scholar CrossRef Search ADS   Bruno, G. S. F. and Orsenigo, L. 2003. Variables influencing industrial funding of academic research in Italy: an empirical analysis, International Journal of Technology Management , vol. 26, 277– 302 Google Scholar CrossRef Search ADS   Bulut, H. and Moschini, G. 2006. ‘US Universities’ Net Returns from Patenting and Licensing: A Quantile Regression Analysis’ , Working Paper 06-WP 432, Center for Agricultural and Rural Development, Iowa State University, September Caldera, A. and Debande, O. 2010. Performance of Spanish universities in technology transfer: an empirical analysis, Research Policy , vol. 39, no. 9, 1160– 73 Google Scholar CrossRef Search ADS   Castro-Martínez, E., Molas-Gallart, J. and Olmos-Peñuela, J. 2010. ‘Knowledge Transfer in the Social Sciences and the Humanities: Informal Links in a Public Research Organisation’ , INGENIO (CSIC-UPV) Working Papers Series 2010– 12 Chapple, W., Lockett, A., Siegel, D. S. and Wright, M. 2005. Assessing the relative performance of university technology transfer office in UK: parametric and non-parametric evidence, Research Policy , vol. 34, no. 3, 369– 84 Google Scholar CrossRef Search ADS   Charnes, A., Cooper, W., Lewin, A. and Seiford, L. 1994. Data Envelopment Analysis: Theory, Methodology and Applications , Boston, Kluwer Academic Publishers Google Scholar CrossRef Search ADS   Charnes, A., Cooper, W. W. and Rhodes, E. 1978. Measuring the inefficiency of decision making units, European Journal of Operational Research , vol. 2, 429– 44 Google Scholar CrossRef Search ADS   Chiang, K. and His-Tai, H. 2006. Efficiency analysis of university departments: an empirical study, Omega—The International Journal of Management Science , vol. 36, 653– 64 Cohen, W. M., Nelson, R. R. and Walsh, J. P. 2002. Links and impacts: the influence of public research on industrial R&D, Management Science , vol. 48, 1– 23 Google Scholar CrossRef Search ADS   Curi, C., Daraio, C. and Llerena, P. 2012. University technology transfer: how, (in)efficient are French universities?, Cambridge Journal of Economics , vol. 36, 629– 54 Google Scholar CrossRef Search ADS   D’Este, P. and Patel, P. 2007. University–industry linkages in the UK: what are the factors underlying the variety of interactions with industry?, Research Policy , vol. 36, 1295– 1313 Google Scholar CrossRef Search ADS   De Silva, L. R., Uyarra, E. and Oakey, R. 2012. Academic entrepreneurship in a resource constrained environment: diversification and synergistic effects, in Audretsch, D. B., Lehmann, E. E., Link, A. N., Starnecker, A. and Audretsch, D. (eds.), Technology Transfer in a Global Economy , New York, Springer Debackere, K. and Veugelers, R. 2005. The role of academic technology transfer organisations in improving industry science links, Research Policy , vol. 34, no. 3, 321– 42 Google Scholar CrossRef Search ADS   Deiaco, E., Hughes, A. and McKelvey, M. 2012. Universities as strategic actors in the knowledge economy, Cambridge Journal of Economics , vol. 36, 525– 41 Google Scholar CrossRef Search ADS   Deprins, D., Simar, L. and Tulkens, H. 1984. Measuring labour inefficiency in post offices, pp. 243– 67 in Marchand, M., Pestieau, P. and Tulkens, H. (eds.), The Performance of Public Enterprises: Concepts and Measurements , Amsterdam, North-Holland Drucker, P. 1977. An Introductory View of Management , New York, Harper College Press Etzkowitz, H. 2002. Incubation of incubators: innovation as a triple helix of university–industry–government networks, Science and Public Policy , vol. 29, 115– 28 Google Scholar CrossRef Search ADS   Friedman, J. and Silberman, J. 2003. University technology transfer: do incentives, management and location matter?, Journal of Technology Transfer , vol. 28, 17– 30 Google Scholar CrossRef Search ADS   Geuna, A. and Rossi, F. 2011. Changes to university IPR regulations in Europe and the impact on academic patenting, Research Policy , vol. 40, 1068– 76 Google Scholar CrossRef Search ADS   Grady, R. and Pratt, J. 2000. The UK technology transfer system: calls for stronger links between higher education and industry, Journal of Technology Transfer , vol. 25, no. 2, 205– 11 Google Scholar CrossRef Search ADS   Griffin, R. W. 1987. Management  ( 2nd ed.), Boston, Houghton Mifflin Co. Guerrero, M., Cunningham, J. A. and Urbano, D. 2015. Economic impact of entrepreneurial universities’ activities: an exploratory study of the United Kingdom, Research Policy , vol. 44, 748– 64 Google Scholar CrossRef Search ADS   Harabi, N. 1995. Appropriability of technical innovations: an empirical analysis, Research Policy , vol. 24, 981– 92 Google Scholar CrossRef Search ADS   Henderson, R., Jaffe, A. and Trajtenberg, M. 1998. Universities as a source of commercial technology: a detailed analysis of university patenting, 1965–1988, Review of Economics and Statistics , vol. 80, 119– 27 Google Scholar CrossRef Search ADS   Hewitt-Dundas, N. 2012. Research intensity and knowledge transfer activity in UK universities, Research Policy , vol. 41, 262– 75 Google Scholar CrossRef Search ADS   Higher Education Funding Council for England. 2015. Guide to Funding 2015-6: How HEFCE Allocates its Funds, HEFCE, March 2015/04. Ho, M. H.-C., Liu, J. S., Lu, W.-M. and Huang, C.-C. 2014. A new perspective to explore the technology transfer efficiencies in US universities, Journal of Technology Transfer , vol. 39, no. 2, 247– 75 Google Scholar CrossRef Search ADS   Hughes, A. and Kitson, M. 2012. Pathways to impact and the strategic role of universities: new evidence on the breadth and depth of university knowledge exchange in the UK and the factors constraining its development, Cambridge Journal of Economics , vol. 36, 723– 50 Google Scholar CrossRef Search ADS   Jensen, P. H., Palangkaraya, A. and Webster, E. 2009. ‘A Guide to Metrics on Knowledge Transfer from Universities to Businesses and Industry in Australia’ , Intellectual Property Research Institute of Australia Occasional Paper No. 03/09 Johnes, G. and Johnes, J. 1993. ‘Measuring the Research Performance of UK Economics Departments: An Application of Data Envelopment Analysis’ , Oxford Economic Papers 45. Jones-Evans, J. and Klofsten, M. 2000. Comparing academic entrepreneurship in Europe—the case of Sweden and Ireland, Small Business Economics , vol. 14, 299– 309 Google Scholar CrossRef Search ADS   Ken, Y., Huang, T., Wu, C. H. and Shiu, S. H. 2009. Super-Efficiency DEA Model for Evaluating Technology Transfer Performance of US University , paper presented at the Portland International Conference on Management of Engineering & Technology (PICMET), August 2–6, in Portland, OR, USA Kim, J., Anderson, T. and Daim, T. 2008. Assessing university technology transfer: a measure of efficiency patterns, International Journal of Innovation and Technology Management , vol. 5, no. 4, 495– 526 Google Scholar CrossRef Search ADS   Lach, S. and Schankerman, M. 2004. Royalty sharing and technology licensing in universities, Journal of the European Economic Association , vol. 2, no. 2–3, 252– 64 Google Scholar CrossRef Search ADS   Landry, R., Amara, N. and Ouimet, M. 2007. Determinants of knowledge transfer: evidence from Canadian university researchers in natural sciences and engineering, Journal of Technology Transfer , vol. 32, no. 6, 561– 92 Google Scholar CrossRef Search ADS   Laursen, K. and Salter, A. 2004. Searching low and high: what types of firms use universities as a source of innovation?, Research Policy , vol. 33, 1201– 15 Google Scholar CrossRef Search ADS   Lawton Smith, H. 2007. Universities, innovation, and territorial development: a review of the evidence, Environment and Planning C: Government and Policy , vol. 25, no. 1, 98– 114 Google Scholar CrossRef Search ADS   Levin, R. 1986. A new look at the patent system, American Economic Review , vol. 76, no. 2, 199– 202 Levin, R. C., Klevorick, A. K., Nelson, R. R. and Winter, S. G. 1987. Appropriating the returns from industrial research and development, Brookings Papers on Economic Activity , vol. 3, 783– 831 Google Scholar CrossRef Search ADS   Link, A. N. and Siegel, D. S. 2005. Generating science-based growth: an econometric analysis of the impact of organisational incentives on university–industry technology transfer, European Journal of Finance , vol. 11, no. 3, 169– 82 Google Scholar CrossRef Search ADS   Lockett, A. and Wright, M. 2005. Resources, capabilities, risk capital and the creation of university spin-out companies, Research Policy , vol. 34, 1043– 57 Google Scholar CrossRef Search ADS   Lockett, A., Wright, M. and Wild, A. 2014. The institutionalisation of third stream activities in UK higher education: the role of discourse and metrics, British Journal of Management , vol. 26, 78– 92 Google Scholar CrossRef Search ADS   Meeusen, W. and Van den Broeck, J. 1977. Efficiency estimation from Cobb–Douglas production functions with composed errors, International Economic Review , vol. 18, 435– 44 Google Scholar CrossRef Search ADS   Meyer-Krahmer, F. and Schmock, U. 1998. Science-based technologies: university–industry interactions in four fields, Research Policy , vol. 27, 835– 51 Google Scholar CrossRef Search ADS   Montesinos, P., Carot, J. M., Martinez, J. M. and Mora, F. 2008. Third mission ranking for world class universities: beyond teaching and research, Higher Education in Europe , vol. 33, no. 2, 259– 71 Google Scholar CrossRef Search ADS   Mowery, D. and Sampat, B. 2005. Universities in national innovation systems, pp. 209– 39 in Fagerberg, J., Mowery, D. and Nelson, R. (eds.), The Oxford Handbook of Innovation , Oxford, Oxford University Press Google Scholar CrossRef Search ADS   Nelles, J. and Vorley, T. 2010. From policy to practice: engaging and embedding the third mission in contemporary universities, International Journal of Sociology and Social Policy , vol. 30, no. 7–8, 341– 53 Google Scholar CrossRef Search ADS   O’Shea, R. P., Allen, T. J., Chevalier, A. and Roche, F. 2005. Entrepreneurial orientation, technology transfer and spinoff performance of U.S. universities, Research Policy , vol. 34, no. 7, 994– 1009 Google Scholar CrossRef Search ADS   Oliveira, M. D. and Teixeira, A. C. 2010. ‘The Determinants of Technology Transfer Efficiency and the Role of Innovation Policies: A Survey’ , FEP Working Papers 375, Universidade do Porto, Faculdade de Economia do Porto Olmos-Penuela, J., Castro-Martinez, E. and D’Este, P. 2011. Knowledge Transfer Activities in Humanities and Social Sciences: Which Determinants Explain Research Group Interactions with Non-Academic Agents? , paper presented at the Dime-Druid Academy Winter Conference Perkmann, M., Neely, A. and Walsh, K. 2011. How should firms evaluate success in university-industry alliances? A performance measurement system, R&D Management , vol. 41, 202– 16 Google Scholar CrossRef Search ADS   Perkmann, M. and Walsh, K. 2007. University–industry relationships and open innovation: towards a research agenda, International Journal of Management Reviews , vol. 9, no. 4, 259– 80 Google Scholar CrossRef Search ADS   Pinheiro, R., Langa, P. V. and Pausits, A. 2015. The institutionalisation of universities’ third mission: introduction to the special issue, European Journal of Higher Education , vol. 5, no. 3, 227– 32 Google Scholar CrossRef Search ADS   Ranga, M. and Etzkowitz, H. 2013. Triple Helix systems: an analytical framework for innovation policy and practice in the Knowledge Society, Industry and Higher Education , vol. 27, 237– 62 Google Scholar CrossRef Search ADS   Ray, S. C. 2004. Data Envelopment Analysis: Theory and Techniques for Economics and Operations Research , Cambridge University Press, Cambridge Google Scholar CrossRef Search ADS   Research Councils UK. 2007. ‘Knowledge Transfer Categorisation and Harmonisation Project’ , Final Report, Swindon, Research Councils UK Rogers, E., Yin, J. and Hoffmann, J. 2000. Assessing the effectiveness of technology transfer offices at U.S. research universities, Journal of the Association of University Technology Managers , vol. 12, 47– 80 Rosli, A. and Rossi, F. 2015. Assessing the impact of knowledge transfer policies: an international comparison of models and indicators of universities’ knowledge transfer performance, in Hilpert, U. (ed.), Handbook on Politics and Technology , London, Routledge Rossi, F. and Rosli, A. 2015. Indicators of university-industry knowledge transfer performance and their implications for universities: evidence from the United Kingdom, Studies in Higher Education , vol. 40, no. 10, 1970– 91 Google Scholar CrossRef Search ADS   Sánchez-Barrioluengo, M. 2014. Articulating the ‘three-missions’ in Spanish universities, Research Policy , vol. 43, no. 10, 1760– 73 Google Scholar CrossRef Search ADS   Schartinger, D., Schibany, A. and Gassler, H. 2001. Interactive relations between universities and firms: empirical evidence for Austria, Journal of Technology Transfer , vol. 26, no. 3, 255– 69 Google Scholar CrossRef Search ADS   Siegel, D. S., Veugelers, R. and Wright, M. 2007. Technology transfer offices and commercialisation of university intellectual property: performance and policy implications, Oxford Review of Economic Policy , vol. 23, 640– 60 Google Scholar CrossRef Search ADS   Siegel, D. S., Waldman, D. and Link, A. 2003. Assessing the impact of organisational practices on the relative productivity of university technology transfer offices: an exploratory study, Research Policy , vol. 32, 27– 48 Google Scholar CrossRef Search ADS   Simar, L. and Wilson, P. W. 2000. A general methodology for bootstrapping in nonparametric frontier models, Journal of Applied Statistics , vol. 27, 779– 802 Google Scholar CrossRef Search ADS   Simar, L. and Wilson, P. W. 2007. Estimation and inference in two-stage, semi-parametric models of production processes, Journal of Econometrics , vol. 136, 31– 64 Google Scholar CrossRef Search ADS   Simar, L. and Wilson, P. W. 2011. Inference by the m out of n bootstrap in nonparametric frontier models, Journal of Productivity Analysis , vol. 36, 33– 53 Google Scholar CrossRef Search ADS   Sorensen, J. and Chambers, D. 2008. Evaluating academic technology transfer performance by how well access to knowledge is facilitated—defining an access metric, Journal of Technology Transfer , vol. 33, no. 5, 534– 47 Google Scholar CrossRef Search ADS   Thursby, J. G. and Kemp, S. 2002. Growth and productive efficiency of university intellectual property licensing, Research Policy , vol. 31, 109– 24 Google Scholar CrossRef Search ADS   Thursby, J. G. and Thursby, M. C. 2002. Who is selling the Ivory Tower? Sources of growth in university licensing, Management Science , vol. 48, no. 1, 90– 104 Google Scholar CrossRef Search ADS   Von Tunzelmann, N. and Kraemer Mbula, E. 2003. ‘ Changes in Research Assessment Practices in Other Countries since 1999’, HEFCE commissioned report. SPRU University of Sussex Vorley, T. and Lawton Smith, H. 2007. Universities and the knowledge-based economy, Environment and Planning C: Government and Policy , vol. 25, no. 6, 775– 78 Google Scholar CrossRef Search ADS   Wright, M., Clarysse, B., Lockett, A. and Knockaert, M. 2008. Mid-range universities’ linkages with industry: knowledge types and the role of intermediaries, Research Policy , vol. 37, no. 8, 1205– 23 Google Scholar CrossRef Search ADS   Footnotes 1 Dedicated public funds for knowledge transfer engagement, where they exist, are usually small compared to public funding for research and teaching; for example, in England in 2014/15, funding allocated to support universities’ knowledge transfer activities (160 million GBP) amounted to about 5% of the recurrent grants allocated for teaching and research (3.1 billion GDP) (Higher Education Funding Council for England, 2015). 2 These studies have been identified based on the reviews by Berbegal Mirabent and Solé Parellada (2012), Oliveira and Teixeira (2010) and Siegel et al. (2007), and a further search for articles on Google Scholar using the keywords ‘university’ AND ‘efficiency’ AND ‘technology transfer’ OR ‘knowledge transfer’. 3 Consultancy income is a problematic variable since university policies vary: when academics are allowed to engage in private consulting, at least to some extent, the related incomes are unreported, leading to an underestimation of the real amount of consulting performed. This problem is mitigated by the fact that, in recent years, most universities have simultaneously tightened the rules on academic consulting and increased the incentives for academics to declare their consulting activities to the university. Providing consulting activities through the universities rather than privately has been made safer (most universities offer free indemnity insurance policies) and easier (most universities now have a contracting system for staff consulting activities; see Perkmann and Walsh, 2007). 4 I am grateful to an anonymous reviewer for this remark. 5 Since DEA scores are robust with respect to the inclusion of highly correlated inputs (as it is the case with all the inputs used in this study), the results would not change substantially if lagged inputs were used instead (Anderson et al., 2007). 6 The output-oriented approach is appropriate because universities are more interested in maximizing knowledge transfer outputs than in minimizing the inputs used in the knowledge transfer production process: in fact, most inputs are concurrently deployed in the production of research and teaching, so, for the purpose of knowledge transfer, they can be considered as exogenously determined and (almost) fixed in the short term. 7 For both the narrow and the broad models, the null hypothesis of constant returns to scale versus the alternative hypothesis of variable returns to scale was rejected at the 5% significance level (according to the test statistic proposed by Simar and Wilson [2011]), while the null hypothesis of non-increasing returns to scale versus the alternative hypothesis of variable returns to scale could not be rejected at this significance level. These tests were implemented using the rts.test routine included in the package rDEA for R-Cran. 8 It is not possible to directly compare the universities’ efficiency scores under the two models, as their magnitude is only meaningful in a relative sense. 9 This variable has been constructed to represent inefficiency, rather than efficiency, for consistency with our analysis of efficiency scores, whose bias-correction procedure has been developed using the Farrell-Debreu distance measure (Simar and Wilson, 2007). 10 This has been implemented using the dea.env.robust routine included in the package rDEA for R-Cran. 11 As a robustness check, the same variables in models (2a) and (2b) have been regressed on the efficiency scores computed using the narrower set of inputs (IP disclosures and research contracts): in this case, research intensity significantly reduces inefficiency (in line with previous studies that have considered a narrower model of knowledge transfer), while teaching intensity has a positive, but not significant, relationship with inefficiency. Appendix Appendix A. Correlation matrix between inputs and outputs   Research and teaching funding  Knowledge transfer staff  Academic staff  Income  intellectual property disclosures  Public engagement  Research and teaching funding  1.000            Knowledge transfer staff  0.292  1.000          Academic staff  0.913  0.255  1.000        Income  0.767  0.299  0.893  1.000      Intellectual property disclosures  0.677  0.193  0.806  0.821  1.000    Public engagement  0.489  0.074  0.541  0.341  0.473  1.000    Research and teaching funding  Knowledge transfer staff  Academic staff  Income  intellectual property disclosures  Public engagement  Research and teaching funding  1.000            Knowledge transfer staff  0.292  1.000          Academic staff  0.913  0.255  1.000        Income  0.767  0.299  0.893  1.000      Intellectual property disclosures  0.677  0.193  0.806  0.821  1.000    Public engagement  0.489  0.074  0.541  0.341  0.473  1.000  View Large Appendix A. Correlation matrix between inputs and outputs   Research and teaching funding  Knowledge transfer staff  Academic staff  Income  intellectual property disclosures  Public engagement  Research and teaching funding  1.000            Knowledge transfer staff  0.292  1.000          Academic staff  0.913  0.255  1.000        Income  0.767  0.299  0.893  1.000      Intellectual property disclosures  0.677  0.193  0.806  0.821  1.000    Public engagement  0.489  0.074  0.541  0.341  0.473  1.000    Research and teaching funding  Knowledge transfer staff  Academic staff  Income  intellectual property disclosures  Public engagement  Research and teaching funding  1.000            Knowledge transfer staff  0.292  1.000          Academic staff  0.913  0.255  1.000        Income  0.767  0.299  0.893  1.000      Intellectual property disclosures  0.677  0.193  0.806  0.821  1.000    Public engagement  0.489  0.074  0.541  0.341  0.473  1.000  View Large Appendix B. Correlation matrix between regressors   Public research funds per academic  Students per academic  TTO age  Subject diversification  Regional GVA  Regional BERD  City universities  Age  Size  University hospital  Science and medicine  Technical  Social sciences  Arts and humanities  England  Scotland  Northern Ireland  Wales  Public research funds per academic  1.00                                    Students per academic  -0.85  1.00                                  TTO age  0.12  -0.19  1.00                                Subject diversification  -0.65  0.80  -0.14  1.00                              Regional GVA  0.14  -0.11  -0.11  -0.31  1.00                            Regional BERD  0.08  -0.16  0.15  -0.02  0.01  1.00                          City universities  0.18  -0.11  -0.13  -0.31  0.92  -0.11  1.00                        Age  0.41  -0.38  0.16  -0.32  0.01  0.14  -0.03  1.00                      Size  0.51  -0.48  0.19  -0.36  -0.06  0.14  -0.09  0.44  1.00                    University hospital  0.38  -0.41  0.20  -0.38  0.15  0.09  0.14  0.25  0.41  1.00                  Science and medicine  0.56  -0.65  0.10  -0.65  0.21  0.00  0.20  0.35  0.46  0.53  1.00                Technical  -0.12  0.02  0.29  0.11  -0.11  0.08  -0.15  -0.15  0.01  -0.16  -0.29  1.00              Social sciences  -0.44  0.41  -0.03  0.35  -0.06  0.00  -0.04  -0.23  -0.23  -0.30  -0.47  0.22  1.00            Arts and humanities  -0.13  0.19  -0.17  0.02  0.04  -0.02  0.10  -0.04  -0.23  -0.19  -0.54  -0.34  -0.09  1.00          England  -0.04  0.08  0.02  0.07  0.24  0.42  0.24  -0.11  0.09  0.18  0.00  -0.04  0.10  0.00  1.00        Scotland  0.06  -0.09  -0.05  -0.07  -0.06  -0.27  -0.17  0.21  -0.07  -0.18  0.02  0.08  -0.07  -0.08  -0.§72  1.00      Northern Ireland  0.11  -0.05  -0.05  -0.02  -0.11  -0.15  -0.06  -0.04  0.05  -0.07  0.04  0.07  -0.02  -0.06  -0.29  -0.05  1.00    Wales  -0.05  0.22  0.06  -0.00  -0.24  -0.25  -0.14  -0.06  -0.09  -0.02  -0.04  -0.09  -0.05  0.14  -0.52  -0.09  -0.04  1.00    Public research funds per academic  Students per academic  TTO age  Subject diversification  Regional GVA  Regional BERD  City universities  Age  Size  University hospital  Science and medicine  Technical  Social sciences  Arts and humanities  England  Scotland  Northern Ireland  Wales  Public research funds per academic  1.00                                    Students per academic  -0.85  1.00                                  TTO age  0.12  -0.19  1.00                                Subject diversification  -0.65  0.80  -0.14  1.00                              Regional GVA  0.14  -0.11  -0.11  -0.31  1.00                            Regional BERD  0.08  -0.16  0.15  -0.02  0.01  1.00                          City universities  0.18  -0.11  -0.13  -0.31  0.92  -0.11  1.00                        Age  0.41  -0.38  0.16  -0.32  0.01  0.14  -0.03  1.00                      Size  0.51  -0.48  0.19  -0.36  -0.06  0.14  -0.09  0.44  1.00                    University hospital  0.38  -0.41  0.20  -0.38  0.15  0.09  0.14  0.25  0.41  1.00                  Science and medicine  0.56  -0.65  0.10  -0.65  0.21  0.00  0.20  0.35  0.46  0.53  1.00                Technical  -0.12  0.02  0.29  0.11  -0.11  0.08  -0.15  -0.15  0.01  -0.16  -0.29  1.00              Social sciences  -0.44  0.41  -0.03  0.35  -0.06  0.00  -0.04  -0.23  -0.23  -0.30  -0.47  0.22  1.00            Arts and humanities  -0.13  0.19  -0.17  0.02  0.04  -0.02  0.10  -0.04  -0.23  -0.19  -0.54  -0.34  -0.09  1.00          England  -0.04  0.08  0.02  0.07  0.24  0.42  0.24  -0.11  0.09  0.18  0.00  -0.04  0.10  0.00  1.00        Scotland  0.06  -0.09  -0.05  -0.07  -0.06  -0.27  -0.17  0.21  -0.07  -0.18  0.02  0.08  -0.07  -0.08  -0.§72  1.00      Northern Ireland  0.11  -0.05  -0.05  -0.02  -0.11  -0.15  -0.06  -0.04  0.05  -0.07  0.04  0.07  -0.02  -0.06  -0.29  -0.05  1.00    Wales  -0.05  0.22  0.06  -0.00  -0.24  -0.25  -0.14  -0.06  -0.09  -0.02  -0.04  -0.09  -0.05  0.14  -0.52  -0.09  -0.04  1.00  View Large Appendix B. Correlation matrix between regressors   Public research funds per academic  Students per academic  TTO age  Subject diversification  Regional GVA  Regional BERD  City universities  Age  Size  University hospital  Science and medicine  Technical  Social sciences  Arts and humanities  England  Scotland  Northern Ireland  Wales  Public research funds per academic  1.00                                    Students per academic  -0.85  1.00                                  TTO age  0.12  -0.19  1.00                                Subject diversification  -0.65  0.80  -0.14  1.00                              Regional GVA  0.14  -0.11  -0.11  -0.31  1.00                            Regional BERD  0.08  -0.16  0.15  -0.02  0.01  1.00                          City universities  0.18  -0.11  -0.13  -0.31  0.92  -0.11  1.00                        Age  0.41  -0.38  0.16  -0.32  0.01  0.14  -0.03  1.00                      Size  0.51  -0.48  0.19  -0.36  -0.06  0.14  -0.09  0.44  1.00                    University hospital  0.38  -0.41  0.20  -0.38  0.15  0.09  0.14  0.25  0.41  1.00                  Science and medicine  0.56  -0.65  0.10  -0.65  0.21  0.00  0.20  0.35  0.46  0.53  1.00                Technical  -0.12  0.02  0.29  0.11  -0.11  0.08  -0.15  -0.15  0.01  -0.16  -0.29  1.00              Social sciences  -0.44  0.41  -0.03  0.35  -0.06  0.00  -0.04  -0.23  -0.23  -0.30  -0.47  0.22  1.00            Arts and humanities  -0.13  0.19  -0.17  0.02  0.04  -0.02  0.10  -0.04  -0.23  -0.19  -0.54  -0.34  -0.09  1.00          England  -0.04  0.08  0.02  0.07  0.24  0.42  0.24  -0.11  0.09  0.18  0.00  -0.04  0.10  0.00  1.00        Scotland  0.06  -0.09  -0.05  -0.07  -0.06  -0.27  -0.17  0.21  -0.07  -0.18  0.02  0.08  -0.07  -0.08  -0.§72  1.00      Northern Ireland  0.11  -0.05  -0.05  -0.02  -0.11  -0.15  -0.06  -0.04  0.05  -0.07  0.04  0.07  -0.02  -0.06  -0.29  -0.05  1.00    Wales  -0.05  0.22  0.06  -0.00  -0.24  -0.25  -0.14  -0.06  -0.09  -0.02  -0.04  -0.09  -0.05  0.14  -0.52  -0.09  -0.04  1.00    Public research funds per academic  Students per academic  TTO age  Subject diversification  Regional GVA  Regional BERD  City universities  Age  Size  University hospital  Science and medicine  Technical  Social sciences  Arts and humanities  England  Scotland  Northern Ireland  Wales  Public research funds per academic  1.00                                    Students per academic  -0.85  1.00                                  TTO age  0.12  -0.19  1.00                                Subject diversification  -0.65  0.80  -0.14  1.00                              Regional GVA  0.14  -0.11  -0.11  -0.31  1.00                            Regional BERD  0.08  -0.16  0.15  -0.02  0.01  1.00                          City universities  0.18  -0.11  -0.13  -0.31  0.92  -0.11  1.00                        Age  0.41  -0.38  0.16  -0.32  0.01  0.14  -0.03  1.00                      Size  0.51  -0.48  0.19  -0.36  -0.06  0.14  -0.09  0.44  1.00                    University hospital  0.38  -0.41  0.20  -0.38  0.15  0.09  0.14  0.25  0.41  1.00                  Science and medicine  0.56  -0.65  0.10  -0.65  0.21  0.00  0.20  0.35  0.46  0.53  1.00                Technical  -0.12  0.02  0.29  0.11  -0.11  0.08  -0.15  -0.15  0.01  -0.16  -0.29  1.00              Social sciences  -0.44  0.41  -0.03  0.35  -0.06  0.00  -0.04  -0.23  -0.23  -0.30  -0.47  0.22  1.00            Arts and humanities  -0.13  0.19  -0.17  0.02  0.04  -0.02  0.10  -0.04  -0.23  -0.19  -0.54  -0.34  -0.09  1.00          England  -0.04  0.08  0.02  0.07  0.24  0.42  0.24  -0.11  0.09  0.18  0.00  -0.04  0.10  0.00  1.00        Scotland  0.06  -0.09  -0.05  -0.07  -0.06  -0.27  -0.17  0.21  -0.07  -0.18  0.02  0.08  -0.07  -0.08  -0.§72  1.00      Northern Ireland  0.11  -0.05  -0.05  -0.02  -0.11  -0.15  -0.06  -0.04  0.05  -0.07  0.04  0.07  -0.02  -0.06  -0.29  -0.05  1.00    Wales  -0.05  0.22  0.06  -0.00  -0.24  -0.25  -0.14  -0.06  -0.09  -0.02  -0.04  -0.09  -0.05  0.14  -0.52  -0.09  -0.04  1.00  View Large © The Author(s) 2017. Published by Oxford University Press on behalf of the Cambridge Political Economy Society. All rights reserved. This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/about_us/legal/notices) TI - The drivers of efficient knowledge transfer performance: evidence from British universities JF - Cambridge Journal of Economics DO - 10.1093/cje/bex054 DA - 2018-05-01 UR - https://www.deepdyve.com/lp/oxford-university-press/the-drivers-of-efficient-knowledge-transfer-performance-evidence-from-DKBRAzJYE0 SP - 729 EP - 755 VL - 42 IS - 3 DP - DeepDyve ER -