Co-authorship as a proxy for collaboration: a cautionary tale

Abstract

International scientific collaboration has risen in the last two decades. Publications using mega-science data must conform to ‘rules of use’ that have emerged to protect the intellectual property of the project staff. These rules enhance co-publication counts and citations and distort the use of co-publication data as a proxy for collaboration. The distorting effects are illustrated by a case study of the BRICS countries, which recently issued a declaration on scientific and technological cooperation with specific thematic areas allocated to each country. It is found that, with a single exception, the designated research areas of collaboration differ from individual country specializations. Such ‘collaboration’ manifests instead as participation in mega-science astronomy and high-energy physics projects, especially those of the Planck 2013 telescope and at CERN, Geneva. This raises questions of import to science policy, for the BRICS in particular, and for the measurement of scientific collaboration more generally.

1. Introduction

Collaboration in science has risen considerably in the last two decades, with internationally co-authored publications now accounting for up to 60% of all scientific publications (UNESCO 2010; NSB 2016). In the same period, collaborations have proliferated in mega-science Physics, Astronomy, Astrophysics, Health and Genomics projects, to name a few fields. Publications emanating from mega-science projects arise from the joint efforts of hundreds of researchers, engineers, software developers, statisticians and technicians, all of whom invest personal commitment and intellectual property in the venture. The way that this corps is acknowledged in scientific publications has given rise to a varied set of ‘rules of use’ that may have the unintended consequence of enhancing co-publication counts. This in turn distorts the use of co-publication data as a proxy for collaboration. Such distortion may be reduced through the use of fractional counting (Plume and van Weijen 2014; NSB 2016), but this approach is hampered by the heavy workload entailed in allocating fractional counts by individual, institution and country.

This article offers an exploratory study of the distorting effect that participation in mega-science projects exerts on the standard measure of scientific collaboration, namely co-authorship counts. The focus of this study is scientific collaboration among the five ‘BRICS’ countries. Meeting at their first summit in Yekaterinburg in 2009, the then four BRICs countries—Brazil, Russia, India and China—declared the BRICs a geopolitical organization. In 2011, South Africa joined the BRICs, thereby creating a new acronym, BRICS. The BRICS deliberations have produced two concrete results: the founding of the New Development Bank, and the Cape Town/Brasilia Declarations on collaboration in science, technology and innovation (STI). These Declarations lay out an agenda for collaboration, with each of the BRICS countries accorded leadership of a thematic area of science and technology. Accordingly, Brazil will take the lead on climate change and disaster mitigation, Russia on water resources and pollution treatment, India on geospatial technology and applications, China on new and renewable energy and energy efficiency, and South Africa on Astronomy.
These choices raise questions concerning the concentration of S&T effort among the BRICS, the state of S&T collaboration, and the potential for enlarging it. Why the specific thematic area allocations, and how do these reflect prowess and existing collaboration in the varied fields of science? Some of these issues have received attention through a growing body of literature on the nature of BRICS scientific activity that investigates co-publication. The associated methods range from analysis of whole counts through to more complex statistical measures. For the most part, the bibliometric literature examines the macro level. Given the recent emergence of the BRICS entity, little attention has been given to the detailed disciplinary strengths of the BRICS, or to the reasons behind the choices embodied in the Declaration. This article will consider these issues, and in particular examine the nature of scientific collaboration at the micro-level of research areas. In so doing it will address a gap in the literature. The article is structured as follows: following the Introduction, the second section provides a literature review of studies on BRICS S&T co-publication. The third section sets out the methodology and presents the main data. The results are considered in the fourth section, after which the disjunctures and policy implications are discussed in the fifth and final section.

2. Literature review

The usual approach to measuring scientific collaboration is to consider the occurrence of co-authored journal publications. The quantification of co-publication has long been recognized to be problematic (Garfield 1979). Part of the difficulty lies in deciding what co-publication actually means, as well as acknowledging that co-publication is prone to double and over-counting. Katz and Martin (1997) estimated the latter error to be in the range of 5% to 10%. Other contributions emphasize the importance of scale effects on co-authorship (Ronda-Pupo and Katz 2016). Methods of data analysis of co-publications range across whole counts, field-specific counts, citation analysis, and more complex statistical measures; the latter are naturally dependent on assembling a data set that is sufficiently large to permit statistical validity and reliability. While the problem of multiple authorship is acknowledged, correction through fractional attribution is generally not applied. Finardi (2015) provides a comprehensive overview of the contemporary literature, and there is little to be gained by re-treading that journey. What is important, however, is to draw out those features of the reviewed literature that bear upon the arguments presented further below. Kumar and Asheulova (2011) use Scopus data to show that Russian scientists exhibit a low propensity for international co-authorship; the country also displays the highest concentration in Physics and Astronomy among the then four BRICs. Yang et al. (2012) worked with Web of Science Science Citation Index-Expanded data for 1991, 2000, and 2009 and applied cluster analysis to determine the extent of disciplinary concentration of the then BRICs in comparison with the G-7 countries. Their findings echo Kumar and Asheulova (2011), except that both Russia and China are now shown to be highly heterogeneous across subject areas, with India mid-way between the more homogeneous Brazil and South Africa.
Wagner and Wong (2012) studied the visibility of BRIC science, namely the extent to which the Science Citation Index-Expanded includes journals from these countries, finding that they were in fact not under-represented: ‘High quality science from the BRICs appears to be represented at the same level as more advanced countries’ (idem: 1009). Their study covered the year 2011. It is noteworthy that over 2007–2009 Thomson Reuters (2015) set out to improve the coverage of journals from emerging economies as well as languages other than English, so the above finding is perhaps to be expected. Yi et al. (2013) examined publications over the period 2001–2011, using standard bibliometric tools with the addition of the disciplinary specialization index (DSI), which measures the relative spread among scientific fields. They obtained relative citation counts as follows: Brazil 6.29, Russia 4.75, India 5.78, China 6.09, and South Africa 8.27, all of which fall below the then world average of 10.53. As to the DSI, the values are Brazil 0.28, Russia 0.73, India 0.33, China 0.40, and South Africa 0.22, meaning that South Africa is the least specialized and Russia the most, a finding in line with Kumar and Asheulova (2011). The country research area h-index and the h-index of country pair research areas may also illuminate the discussion. It is useful to recognize that the h-index is scale dependent, being linked to an author’s total publication count (N); Yong (2014) suggests a power law relationship h ≈ 0.54 N^(1/2). More recently, Bornmann et al. (2015) investigated the incidence over the period 1990–2010 of highly cited papers recorded by the Web of Science Core Collection as a proxy for S&T excellence. South Africa’s score is on a par with China and Taiwan on the production of papers in the top 10% and 1% most cited, and stands ahead of BRICS peers Brazil, Russia, and India, and also ahead of Korea, Turkey, Mexico, and Poland. China shows the sharpest growth in the proportion of highly cited papers. They conclude that ‘an exceedingly robust global science system has emerged, one that is open to new entrants from the BRICS countries, based upon merit’ (idem: 1510). Finardi (2015) considered the period 1980–2012, which allowed the application of three statistical measures, namely the Salton and Jaccard indices and the computation of a probability association index that takes the geographic proximity of collaborators into account. For ease of computation, Finardi clustered research areas into four macro fields: natural science, medicine, engineering, and the social sciences and humanities. His conclusion is that collaboration among BRICS pairs is weak compared with their collaborations with non-BRICS countries, namely the USA, the United Kingdom, Germany and France, which may be expected. In seeking drivers of collaboration, he finds that geographic distance does not appear to have a strong bearing, but there would appear to be some causality linked to the choice of shared problems in infectious disease, as in the case of HIV/AIDS for Brazil-South Africa collaboration. Additional contributions beyond Finardi’s coverage include Waltman et al. (2011), who study geography as a factor, showing that the mean distance between parties collaborating in science rose from 334 km in 1980 to 1553 km in 2009. In a subsequent paper, the same group of authors (Tijssen et al.
2012: 1) show that ‘collaboration distances and growth rates differ significantly between countries and between fields of science’. Pouris and Ho (2014) used whole counts to examine the co-publishing behaviour of African countries (South Africa included), finding rates of foreign collaboration well above the global average, to the extent that ‘Single-author articles appear to be on the verge of extinction on the continent’ (Pouris and Ho 2014: 2183), with the risk of stifling individualism. While noting the very high rate of foreign co-authorship, they did not delve into the nature of the collaboration. Of particular interest to this article is their observation of very high activity indices in medical fields, but much lower values (<1.0) for Astronomy and Physics.

In summary, one finds bibliometric techniques ranging from simple counts to advanced statistical analysis applied to data sets that in some cases span a 30-year period. In the main, the results of the above contributions leave a sense of disappointment, offering little by way of explanation. Treating the four BRICs as stable, if not static, entities over long periods of time is necessary if one is to assemble a large enough data set to apply statistical methods. Yet, by doing so, one neglects the considerable changes that have occurred in the four BRICs and on the world stage over this period. Up to the early 1990s, the BRICs countries were relatively closed, with limited mobility of staff. In addition, the Internet had not yet proliferated worldwide, with its fundamental changes in the mode of doing science and its promotion of what Wagner (2008) terms a ‘new invisible college of science’ (cf. Waltman et al. 2011). This suggests that a study of BRICS collaboration might better be restricted to a shorter time period, say from the mid-2000s to the present. Accordingly, this contribution considers data over the period 2009–2014.

3. Methodology and data

Bibliometric analysis remains the standard tool for examining scientific output (De Bellis 2009), with co-publication counts understood as the best proxy for scientific collaboration. Without exception, the country address of each contributing author serves as the search keyword used to identify co-publication occurrences across countries. For practical reasons, BRICS co-publication analysis is restricted to country pairs. Accordingly, the five BRICS countries generate 10 country pairs. Analysis variously draws on the Web of Science or Scopus. This contribution will draw on the Web of Science Core Collection (Science Citation Index-Expanded, Social Science Citation Index and Arts and Humanities Citation Index), since these provide deeper coverage of the natural sciences and engineering, which are the focus of the two Declarations; Scopus is stronger in fields such as health, the humanities and the social sciences. Various choices arise in the selection of fields used to demarcate concentration. These choices form the heart of bibliometrics, since they rest upon the granularity specified for the fields of science, the scope of a given journal, and the allocation of a journal and its published articles to a specific field; the finer the granularity, the greater the number of fields or areas. Moreover, the categorizations vary among databases, and over time (Bartol et al. 2014).
Thus, the Essential Science Indicators span twenty-two research fields, to which 11,855 journals are uniquely allocated, while the narrowest categorization of the Web of Science is that of the 252 ‘subject categories’. This analysis employs 151 Web of Science ‘research areas’, which suffice for comparative purposes.1 The search is restricted to the three standard databases, for all document types and all languages. Table 1 presents the BRICS national S&T thrusts, the thematic areas of the Cape Town/Brasilia Declarations, country research area concentration (%), and revealed comparative advantage (RCA). Research areas shown in italic overlap with the BRICS thrusts. The period 2009–2014 is chosen as it fits with the formal existence of the BRICs grouping.

Table 1. Thematic areas, thrusts, top 10 research area concentrations (%), and RCA (2009–14). Figures are concentration (%) with RCA in parentheses.

Brazil. Thematic area: Climate Change and Disaster Mitigation. Thrusts: Bio & Nanotech; Energy; ICT; Health; Biodiversity & Amazon; Climate change; Space science; National security. Top research areas: Agriculture 8.2 (4.7); Chemistry 7.0 (0.7); Physics 6.1 (0.9); Engineering 5.4 (0.7); Biochemistry & Molecular Biology 4.3 (1.0); Neurosciences & Neurology 4.1 (0.9); Pharmacology & Pharmacy 3.6 (1.2); Public Env & Occ Health 3.6 (1.9); Veterinary Sciences 3.4 (3.4); Material Science 3.2 (0.7).

Russia. Thematic area: Water Resources & Pollution Treatment. Thrusts: Energy; Nuclear, Strategic ICT; Health; Space Science. Top research areas: Physics 25.5 (3.7); Chemistry 16.9 (1.8); Mathematics 6.2 (2.1); Material Science 6.0 (1.2); Engineering 6.0 (0.8); Biochemistry & Molecular Biology 4.5 (1.0); Astronomy 4.3 (4.3); Optics 3.4 (2.5); Geology 3.2 (2.8); Instruments & Instrumentation 2.5 (3.6).

India. Thematic area: Geospatial Technologies/Applications. Thrusts: Agric, Health, Energy, Transport & Infrastructure; Environment; Inclusion; Space science. Top research areas: Chemistry 18.6 (2.0); Physics 11.8 (1.7); Engineering 10.1 (1.3); Material Science 9.6 (2.0); S&T Other Topics 5.8 (1.5); Pharmacology & Pharmacy 5.7 (1.8); Biochemistry & Molecular Biology 4.6 (1.0); Agriculture 3.8 (2.2); Environmental Sciences & Ecology 3.4 (1.1); Biotech & Applied Microbiology 3.1 (1.8).

China. Thematic area: New & Renewable Energy/Efficiency. Thrusts: Biotechnology; Food security; Energy sources/Materials; Clean vehicles; Climate change/Environment. Top research areas: Chemistry 18.9 (2.0); Physics 12.9 (1.9); Engineering 12.9 (1.7); Material Science 11.8 (2.4); S&T Other Topics 6.3 (1.6); Mathematics 5.5 (1.9); Biochemistry & Molecular Biology 5.0 (1.1); Computer Science 3.7 (1.6); Environmental Sciences & Ecology 3.4 (1.1); Pharmacology & Pharmacy 3.3 (1.0).

South Africa. Thematic area: Astronomy. Thrusts: Biotechnology; Renewable energy; Climate change; Poverty alleviation; Space S&T. Top research areas: Chemistry 5.9 (0.6); Environmental Sciences & Ecology 5.8 (2.0); Engineering 5.0 (0.7); Physics 4.6 (0.7); Infectious Diseases 4.6 (4.9); Plant Sciences 4.5 (3.5); S&T Other Topics 4.0 (1.1); General/Internal Medicine 3.7 (1.2); Public Env & Occ Health 3.5 (1.9); Psychology 3.5 (1.4).

Table 2 provides a narrower view, using 2014 data for publication counts, concentration in the top five research areas plus Astronomy, RCA, and research area h-index. Where the research area count exceeds 10,000 publications, the h-index is obtained by listing all country publications in the research area ranked by citation count from highest to lowest; the country h-index is then determined by inspection of this ranked list.
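This inspection amounts to the standard h-index computation on a descending citation list. A minimal sketch in Python, using made-up citation counts rather than any of the Web of Science data reported here:

```python
def h_index(citations):
    """Largest h such that at least h publications each have at least h citations."""
    ranked = sorted(citations, reverse=True)  # highest-cited first, as in the manual inspection
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one country's output in a single research area.
sample = [120, 60, 45, 33, 20, 12, 9, 5, 2, 0]
print(h_index(sample))  # -> 7: seven papers each cited at least seven times
```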
Table 2. Whole count, concentration, RCA, and h-index (2014). For each country, figures are concentration (%), RCA, and research area h-index.

Brazil (N = 48498): Agriculture 8, 4.76, h 22; Chemistry 8, 0.77, h 38; Physics 6, 0.92, h 52; Engineering 6, 0.71, h 30; Biochemistry 4, 1.05, h 30; Astronomy 2, 1.76, h 46.
Russia (N = 34680): Physics 24, 3.60, h 59; Chemistry 17, 1.70, h 44; Material science 7, 1.25, h 36; Engineering 6, 0.81, h 28; Mathematics 6, 2.07, h 18; Astronomy 4, 4.46, h 51.
India (N = 67517): Chemistry 19, 1.98, h 68; Physics 12, 1.75, h 54; Material science 10, 1.83, h 51; Engineering 10, 1.31, h 47; S&T other 7, 1.32, h 54; Astronomy 2, 1.72, h 51.
China (N = 275205): Chemistry 18, 1.86, h 169; Engineering 14, 1.74, h 102; Material science 12, 2.25, h 138; Physics 11, 1.68, h 126; S&T other 8, 1.63, h 151; Astronomy 1, 0.78, h 50.
South Africa (N = 14326): Infectious diseases 6, 6.46, h 32; Chemistry 6, 0.60, h 30; Environment 6, 1.84, h 31; Engineering 5, 0.64, h 27; Physics 5, 0.75, h 38; Astronomy 3, 3.60, h 54.

The third data set comprises country co-publication rates (Table 3). The co-publication subject count (CSC) is obtained as: author address = ‘country X’ AND author address = ‘country Y’ AND publication period = ‘Z’. The two leading research areas for each country pair are shown as percentages of country pair collaboration. The relevant h-index is also appended for the leading subject areas.
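The CSC queries for the 10 BRICS pairs can be generated mechanically. The sketch below is a Python illustration of the definition given above; the CU= (country) and PY= (year) field tags follow Web of Science advanced-search conventions, and the address forms (e.g. 'Peoples R China') are assumptions that a given interface may render differently.

```python
from itertools import combinations

# Country address forms are assumptions, not verified against the database.
BRICS = ["Brazil", "Russia", "India", "Peoples R China", "South Africa"]

def csc_query(country_x, country_y, year):
    # Mirrors: author address = 'country X' AND author address = 'country Y' AND publication period = 'Z'
    return f"CU=({country_x}) AND CU=({country_y}) AND PY={year}"

# Five countries yield the 10 unordered country pairs reported in Table 3.
for x, y in combinations(BRICS, 2):
    print(csc_query(x, y, 2014))
```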
Table 3. Co-publication count, concentration, and country h-index (2014). For each country pair, N is the co-publication count; figures are the share (%) of pair co-publications in the research area, with the pair h-index for that area in parentheses where recorded.

Brazil-Russia (N = 609): Physics 65 (h 48); Astronomy 23 (h 34).
Brazil-India (N = 625): Physics 43 (h 45); Astronomy 16 (h 31).
Brazil-China (N = 833): Physics 43 (h 46); Astronomy 15 (h 35).
Brazil-South Africa (N = 349): Physics 32 (h 32); Astronomy 13 (h 22).
Russia-India (N = 576): Physics 57 (h 47); Astronomy 31 (h 48).
Russia-China (N = 1127): Physics 47 (h 50); Astronomy 16 (h 36).
Russia-South Africa (N = 295): Physics 41 (h 34); Astronomy 34 (h 44).
India-China (N = 1183): Physics 31.0; Astronomy 12.8.
India-South Africa: Physics 19.4; Astronomy 11.1.
China-South Africa: Physics 32.0; Astronomy 13.9.

All data were recorded in the first week of June 2017.

4. Data analysis

The data are now analysed table by table. It is noted that Kahn (2015) previously observed the lack of correspondence between country national S&T thrusts and the BRICS thematic areas. On the one hand, Brazil has a national thrust in climate change and is allocated the same BRICS thematic area, and China exhibits correspondence between its national thrust of ‘New energy sources & materials, clean energy vehicles’ and the BRICS thematic area of ‘New and renewable energy and energy efficiency’. On the other hand, for India one might have to stretch the correspondence between the national thrust ‘Space Science and Technology’ and the BRICS thematic area of ‘Geospatial technology and applications’. While important in and of itself, Russia’s thematic area ‘Water resources’ shows no obvious correspondence with her national thrusts. In the case of South Africa, one might very tentatively join the dots between the national thrust ‘Space Science and Technology’ and her BRICS thematic area of Astronomy. That prior analysis is now sharpened by extending the list of research areas and including the associated revealed comparative advantage (RCA) in Table 1. The value of considering data over 2009–14 is that it refers to the period immediately prior to the Cape Town declaration, and the country emphasis so recorded might have informed the choice of thematic areas. Even so, examination of the 10 leading country research area concentrations and RCAs sheds little further light on the basis by which the thematic areas were identified and allocated. Granted, with the exception of Astronomy, the thematic areas are both multi-disciplinary in nature and strongly applied in orientation. The thematic area of Astronomy corresponds to a unique research area, whereas the other four do not. Insofar as concentration goes, Astronomy is less studied except in the case of Russia. On the basis of high RCA, Brazilian agriculture and South Africa’s expertise in infectious diseases would be leading candidates, along with Russian Astronomy, yet these areas have not been prioritized for immediate action. Russia, India, and China show high concentration in Chemistry, Physics, and Material science, while for Brazil and South Africa concentration is more evenly spread.
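For reference, the RCA reported in Tables 1 and 2 is read here in the standard activity-index form: a country's share of its own output falling in a research area, divided by the corresponding world share, with values above 1 indicating relative specialization. The text does not spell the formula out, so this reading is an assumption; the sketch below uses placeholder counts.

```python
def rca(country_field_pubs, country_total_pubs, world_field_pubs, world_total_pubs):
    """Activity-index form of revealed comparative advantage:
    (country's share of its output in the field) / (world's share of output in the field)."""
    country_share = country_field_pubs / country_total_pubs
    world_share = world_field_pubs / world_total_pubs
    return country_share / world_share

# Placeholder figures: 500 of a country's 14,000 papers fall in a field holding 1% of world output.
print(round(rca(500, 14_000, 20_000, 2_000_000), 2))  # -> 3.57, i.e. strong relative specialization
```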
Table 2 presents 2014 data for the top five research areas in terms of concentration, RCA and country-level h-index. Except for South Africa, for which Infectious Diseases emerges as an important research area, the restricted study period does not yield anything new. What is striking, however, is the h-index information, which shows comparably high values for Physics and Astronomy. Noting Yong (2014), China, the BRICS country with the largest volume of publications, has much higher h-index scores for its top five research areas; China’s publication output is greater than that of the other four countries combined. Regarding Astronomy, the RCAs vary from Russia’s 4.46 and South Africa’s 3.60 down to China, below the world average at 0.78. By the measure of RCA, allocating Astronomy to South Africa ‘makes sense’.

In Table 3, one notes the collaborative research areas of highest concentration, namely Physics and Astronomy. For Brazil, Physics accounts for 6% of publications, but when it comes to Brazil’s co-publication in Physics with the other four countries the concentration leaps to 65, 43, 43, and 32%, respectively. Among the BRICS, the median level for co-authored papers in Physics is 32%. For Brazil, Astronomy accounts for 2% of its publications, but when it comes to co-publication with the other four countries the concentration leaps to 23, 16, 15, and 13%, respectively. For Astronomy, the median level is 16% of all co-publications. For Russia, which already exhibits high domestic concentration in Physics of 24%, co-authorship with the other four BRICS stands at 65, 57, 47 and 41%, respectively. Again, international co-authorship stands out above domestic activity.

What then explains this observation? One may offer the conjecture that the concentration arises through the participation of BRICS scientists in the mega-science projects of contemporary Physics and Astronomy, such as high-energy Physics research at CERN, Geneva, and the data analysis associated with the Planck satellite observatory through the Planck 2013 Collaboration. This conjecture may be verified by ranking the publications in Physics and Astronomy by citation frequency. In Physics, for example, the publications cited five times or more since 2014 are overwhelmingly attributed to mega-science projects such as ATLAS, CMS, STAR, ALICE, and LHCb, with those in Astronomy including the Planck 2013 data analysis in addition to the high-energy Physics projects already mentioned. This suggests that the bulk of BRICS collaboration in Physics and Astronomy takes place via the medium of international mega-science projects. The conjecture also holds for the co-authorship data obtained from Elsevier Scopus. In essence, the multi-author publications have high citation rates that pull up the h-index. The conjecture that collaboration in Physics and Astronomy/Astrophysics is dominated by mega-science projects is essentially supported.

5. Discussion and implications for policy

The BRICS countries have laid out a declaration for collaboration with five thematic areas highlighted for attention. It turns out, however, that ‘collaboration’ is currently dominated by the mechanism of mega-science projects, and has not been generated through the desire of the BRICS countries to collaborate in S&T. It is found that, with a single exception, the dominant research areas of collaboration differ from individual country specializations.
The difference arises from the participation of scientists in mega-science projects, where the rules of participation and the scale of authorship create distortions in the visibility of these projects in the indexed literature. The disjuncture between such ‘collaboration’ and the intent of the Declarations raises questions of import to science policy, for the BRICS in particular and for the measurement of scientific collaboration more generally.

How shall one understand the influence of participation in mega-science projects on co-publication, let alone cooperation? Hand (2010) and Ebrahim et al. (2013) point to the (desirable) increase in publication counts resulting from international collaboration, while Birnholtz (2006, 2008) gives specific attention to the problem of attribution in the large projects at CERN, such as the ATLAS detector, which involves technical, theoretical and experimental work. Such issues appear in all large research projects, including astronomy and the health sciences. Hogg et al. (2014) point to the emergence of well-crafted protocols that guide article writers in their task. Dance (2012) further discusses the issue of who should be the lead author. None of the above contributions considers the distortion of publication counts arising from the massification of research effort, as in the 900 student authors2 of a paper on the genome of the fruit fly. For an in-depth analysis of the problem one may turn to King (2012).

Participation in ATLAS, STAR, CMS, etc. requires competence in particle Physics, with participation coming at low cost and being available remotely. The same holds true for working on Planck satellite data or the Sloan Digital Sky Survey. One can work on ATLAS anywhere, anytime, provided that one can get online. The typical ATLAS publication thereby involves hundreds of ‘authors’, most of whom do not know one another, will never meet, and will never correspond. ATLAS authorship policy is carefully specified.3 Indeed, it is a condition of working with ATLAS data that all participants are named, so that massive co-authorship is designed into the governance protocol.4 Moreover, the official style guide requires that the word ‘ATLAS’ be included in the article title, hence the ease of search (Eisenhandler 2013).

The question this raises is what significance should be attached to co-publication as a proxy for cooperation. Can one really talk of BRICS collaboration in Physics and Astronomy, or are the co-authorship counts a bibliometric artefact? ‘Collaboration’ among the BRICS is an unintended outcome of international mega-science projects that predate the establishment of the grouping and that do not directly relate to the agenda of the Cape Town/Brasilia Declarations. The ‘high’ levels of intra-BRICS collaboration in Physics and Astronomy are in part driven by the protocols pertaining to working with mega-science data. As a case in point, for generations South African science policy has laid claim to the ‘southern geographic advantage’ as a motivator to fund research in astronomy and palaeontology. Yet the actual practice of Astronomy is overwhelmingly international, and quite independent of locality, the more so when a satellite observatory is used; palaeontology, by contrast, is obviously geographically specific. A next step would be to examine the effect of other mega-science projects on co-publication: large terrestrial telescopes, projects such as the Global Burden of Disease, multi-site clinical trials, and genomic studies immediately come to mind.
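One practical way to explore the adjustments discussed below is suggested by the ATLAS title convention just noted: flag records whose titles carry the names of the large collaborations, then recount country-pair co-publications with the flagged records removed, or with an author-fractionalised weight. The sketch below is purely illustrative, using hypothetical records; it is not the procedure applied in this article.

```python
# Hypothetical bibliographic records: (title, number of authors, countries on the byline).
RECORDS = [
    ("Measurement of jet production with the ATLAS detector", 2900,
     {"Brazil", "Russia", "China", "South Africa"}),
    ("Planck 2013 results: cosmological parameters", 260, {"India", "South Africa"}),
    ("Soil nitrogen dynamics in sugarcane rotations", 4, {"Brazil", "South Africa"}),
]

MEGA_MARKERS = ("ATLAS", "CMS", "ALICE", "LHCb", "STAR", "Planck")

def is_mega(title):
    """Flag a record whose title names one of the large collaborations."""
    return any(marker in title for marker in MEGA_MARKERS)

def pair_count(records, pair, exclude_mega=False, fractional=False):
    """Whole, mega-excluded, or author-fractionalised co-publication count for one country pair."""
    total = 0.0
    for title, n_authors, countries in records:
        if not pair <= countries:          # both countries must appear on the byline
            continue
        if exclude_mega and is_mega(title):
            continue
        total += 1.0 / n_authors if fractional else 1.0
    return total

pair = {"Brazil", "South Africa"}
print(pair_count(RECORDS, pair))                     # whole count: 2.0
print(pair_count(RECORDS, pair, exclude_mega=True))  # mega-science removed: 1.0
print(pair_count(RECORDS, pair, fractional=True))    # author-fractionalised: ~0.25
```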
How then to adjust bibliometric analysis to account for such distortions? Should large-scale collaborations be placed in a separate bibliometric category, lest their inclusion act to distort and exaggerate actual peer-to-peer interaction? Should it be mandatory to introduce fractional counts when an article involves more than a specified minimum number of contributors? Would some type of scale factor serve to normalize the effect of mega-science? Given the dominance of ATLAS, CMS, STAR, ALICE and other large-collaboration research in the revealed ‘collaboration’ among the five BRICS, it might be more realistic to remove such counts, and those associated with the Planck telescope, entirely. This radical change would of course depreciate the reported level of co-publication among the BRICS. For the present, it is cautioned that care has to be taken in interpreting co-publication counts lest one make claims for collaboration that are at best tenuous and at worst misleading.

This is not to suggest that the BRICS countries should limit their involvement in mega-science projects, let alone restrict the development of such projects in their own countries. One thinks, for example, of large-scale nuclear research at Dubna in Russia, new optical telescope projects in China, and the incipient Square Kilometre Array in South Africa. What these projects have in common is the generation of Big Data. The capacity of innovation systems large and small, advanced or emergent, in the BRICS or elsewhere, to clean, process, and analyse Big Data is now a core attribute of their success going forward. The development and sharing of the appropriate tools arise naturally through international collaboration, even where the collaborators operate remotely, attaining ‘satisfaction at a distance’. These raise many issues for future research. The evidence is that present BRICS S&T collaboration is dominated by Physics and Astronomy, and that this collaboration takes place through mega-science. Highly cited BRICS collaboration in Physics does not entail small lab-based groups tackling problems of immediate value to national thrusts or the BRICS themes. This lack of alignment presents serious challenges for BRICS S&T policy specifically, and for international S&T collaboration in general.

Acknowledgements

This work has taken shape over two years, during which the author benefitted from conference attendance supported by his host institution. The critical assessment of an external reviewer is gratefully acknowledged.

References

Bartol T., Budimir G., Dekleva-Smrekar D. et al. (2014) ‘Assessment of Research Fields in Scopus and Web of Science in the View of National Research Evaluation in Slovenia’. Scientometrics, 98/2: 1491–504.
Birnholtz J. (2006) ‘What Does it Mean to be an Author? The Intersection of Credit, Contribution, and Collaboration in Science’. Journal of the American Society for Information Science and Technology, 57/13: 1758–70.
Birnholtz J. (2008) ‘When Authorship Isn’t Enough: Lessons from CERN on the Implications of Formal and Informal Credit Attribution Mechanisms in Collaborative Research’. The Journal of Electronic Publishing, 11/1. DOI: 10.3998/3336451.0011.105.
Bornmann L., Wagner C., Leydesdorff L. (2015) ‘BRICS Countries and Scientific Excellence: A Bibliometric Analysis of Most Frequently-cited Papers’. Journal of the Association for Information Science and Technology, 66/7: 1507–13.
BRICS (2014) BRICS Science, Technology and Innovation Cooperation: A Strategic Partnership for Equitable Growth and Sustainable Development <http://www.brics.utoronto.ca/docs/140210-BRICS-STI.pdf> accessed 2 September 2015.
Dance A. (2012) ‘Authorship: Who’s on First?’ Nature, 489: 591–3.
De Bellis N. (2009) Bibliometrics and Citation Analysis. Lanham, MD: Scarecrow Press.
DST (2008) Ten-Year Plan for Innovation. Pretoria: Department of Science and Technology.
Ebrahim N.A., Salehi H., Embi M.A. et al. (2013) ‘Effective Strategies for Increasing Citation Frequency’. International Education Studies, 6/11: 93–9.
Eisenhandler E. (2013) ATLAS Style Guide Version 2.4. London: Queen Mary College.
Finardi U. (2015) ‘Scientific Collaboration between BRICS Countries’. Scientometrics, 102: 1139–66.
Garfield E. (1979) ‘Is Citation Analysis a Legitimate Evaluation Tool?’ Scientometrics, 1/4: 359–75.
Hand E. (2010) ‘Big Science Spurs Collaborative Trend’. Nature, 463: 282. DOI: 10.1038/463282a.
Hogg W., Donskov M., Russell G. et al. (2014) ‘Approach to Publishing for Large Health Services Research Projects’. Canadian Family Physician, 60/9: 854–5.
Kahn M.J. (2015) ‘Cooperation in Science, Technology and Innovation: Rhetoric and Realities’. Contexto Int. [Online], 37/1: 185–213.
Katz J.S., Martin B.R. (1997) ‘What is Research Collaboration?’ Research Policy, 26: 1–18.
King C. (2012) ‘Multiauthor Papers: Onward and Upward’. ScienceWatch Newsletter, 1–4. <http://archive.sciencewatch.com/newsletter/2012/201207/multiauthor_papers/>
Kumar N., Asheulova N. (2011) ‘Comparative Analysis of Scientific Output of BRIC Countries’. Annals of Library and Information Studies, 58: 228–36.
Meissner D., Gokhberg L., Sokolov A. (eds) (2013) Science, Technology and Innovation Policy for the Future. Berlin: Springer.
MST (2007) Science, Technology and Innovation for National Development Action Plan 2007-2010. Brasilia: Ministry of Science and Technology.
NSB (2016) Science and Engineering Indicators 2016. Washington, DC: National Science Board.
Plume A., van Weijen D. (2014) ‘Publish or Perish? The Rise of the Fractional Author…’. Research Trends, 38. <https://www.researchtrends.com/issue-38-september-2014/publish-or-perish-the-rise-of-the-fractional-author/>
Pouris A., Ho Y.-S. (2014) ‘Research Emphasis and Collaboration in Africa’. Scientometrics, 98: 2169–84.
Ronda-Pupo G.A., Sylvan Katz J. (2016) ‘The Scaling Relationship between Citation-based Performance and International Collaboration of Cuban Articles in Natural Sciences’. Scientometrics, 107/3: 1423–34.
Thomson Reuters (2015) <http://wokinfo.com/essays/globalization-of-web-of-science/>
Tijssen R., Waltman L., Van Eck N. (2012) ‘Research Collaboration and the Expanding Science Grid: Measuring Globalization Processes Worldwide’, pp. 1–12. <http://arxiv.org/abs/1203.4194>
UNESCO (2010) World Science Report 2010. Paris: UNESCO.
Wagner C. (2008) The New Invisible College: Science for Development. Washington, DC: Brookings Institution Press.
Wagner C., Wong S.K. (2012) ‘Unseen Science? Representation of BRICs in Global Science’. Scientometrics, 90: 1001–13.
Waltman L., Tijssen R., Van Eck N. (2011) ‘Globalisation of Science in Kilometres’. Journal of Informetrics, 5/4: 574–82.
Yi Y., Qi W., Wu D. (2013) ‘Are CIVETS the Next BRICs? A Comparative Analysis from Scientometrics Perspective’. Scientometrics, 94: 615–28.
Yang L.Y., Yue T., Ding J.L., Han T. (2012) ‘A Comparison of Disciplinary Structure in Science Between the G7 and the BRIC Countries by Bibliometric Methods’. Scientometrics, 93: 497–516.
Yong A. (2014) ‘Critique of Hirsch’s Citation Index: A Combinatorial Fermi Problem’. Notices of the American Mathematical Society, 61/9: 1040–50.

Footnotes

1 http://ipscience-help.thomsonreuters.com/inCites2Live/filterValuesGroup/researchAreaSchema.html
2 http://www.sciencedaily.com/releases/2015/05/150511095353.htm
3 https://twiki.cern.ch/twiki/bin/view/Main/ATLASAuthorshipPolicy
4 The author is grateful for these insights to an anonymous physicist with work experience at CERN.

© The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

Co-authorship as a proxy for collaboration: a cautionary tale

Loading next page...
 
/lp/ou_press/co-authorship-as-a-proxy-for-collaboration-a-cautionary-tale-SdVdY9Qf68
Publisher
Oxford University Press
Copyright
© The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
ISSN
0302-3427
eISSN
1471-5430
D.O.I.
10.1093/scipol/scx052
Publisher site
See Article on Publisher Site

Abstract

Abstract International scientific collaboration has risen in the last two decades. Publications using mega-science data must conform to ‘rules of use’ that have emerged to protect the intellectual property of the project staff. These rules enhance co-publication counts and citations and distort the use of co-publication data as a proxy for collaboration. The distorting effects are illustrated by a case study of the BRICS countries that recently issued a declaration on scientific and technological cooperation with specific thematic areas allocated to each country. It is found that with a single exception the designated research areas of collaboration are different to individual country specializations. The disjuncture between such ‘collaboration’ manifests as collaboration in mega-science astronomy and high-energy physics projects especially those of the Planck 2013 telescope and at CERN, Geneva. This raises questions of import to science policy, for the BRICS in particular, and the measurement of scientific collaboration more generally. 1. Introduction Collaboration in science has risen considerably in the last two decades, with internationally co-authored publications now accounting for up to 60% of all scientific publications (UNESCO 2010; NSB 2016). In the same period, to name a few fields, collaborations have proliferated in mega-science Physics, Astronomy, Astrophysics, Health and Genomics projects. Publications emanating from mega-science projects arise from the joint efforts of hundreds of researchers, engineers, software developers, statisticians and technicians, all of whom invest personal commitment and intellectual property in the venture. The way that this corps is acknowledged in scientific publications has given rise to a varied set of ‘rules of use’ that may have the unintended consequence of enhancing co-publication counts. This in turn distorts the use of co-publication data as a proxy for collaboration. Such distortion may be reduced through the use of fractional counting (Plume and van Weijen 2014; NSB 2016) but this approach is hampered by the heavy workload that is entailed in the process of allocating fractional counts by individual, institution and country. This article offers an exploratory study of the distorting effect that participation in mega-science projects exerts on the standard measure of scientific collaboration, namely co-authorship counts. The objective of this study is scientific collaboration among the five ‘BRICS’ countries. Meeting at their first summit in Yekaterinburg in 2009, the then four BRICs countries—Brazil, Russia, India and China—declared the BRICs as a geopolitical organization. In 2011, South Africa joined the BRICs thereby creating a new acronym, BRICS. The BRICS deliberations have produced two concrete results, the founding of the New Development Bank, and the Cape Town/Brasilia Declarations on collaboration in science, technology and innovation (STI). These Declarations lay out an agenda for collaboration with each of the BRICS countries accorded leadership of a thematic area of science and technology. Accordingly, Brazil will take the lead on climate change and disaster mitigation, Russia on water resources and pollution treatment, India on geospatial technology and applications, China on new and renewable energy and energy efficiency, and South Africa on Astronomy. These choices raise questions concerning the concentration of effort in S&T among the BRICS, the state of, and the potential for enlarging S&T collaboration. 
Why the specific thematic area allocations, and how do these reflect prowess and existing collaboration in the varied fields of science? Some of these issues have received attention through a growing body of literature on the nature of BRICS’s scientific activity that investigates co-publication. Associated methods range from analysis of whole counts through to more complex statistical measures. Mainly, the bibliometric literature examines the macro level. Given the recent emergence of the BRICS entity, little attention has been given to the detailed disciplinary strengths of the BRICS apart from the reasons behind the choices embodied in the Declaration. This article will consider these issues, and in particular examine the nature of scientific collaboration at the micro-level of research areas. In so doing it will address a gap in the literature. The article is structured as follows: following the Introduction, the second section provides a literature review of studies on BRICS S&T co-publication. The third section provides the methodology and presents the main data. The results are considered in the fourth section after which the disjunctures and policy implications are discussed in the fifth and final section. 2. Literature review The usual approach to measure scientific collaboration is by considering the occurrence of co-authored journal publications. The quantification of co-publication has long been recognized to be problematic (Garfield 1979). Part of the difficulty lies in deciding what co-publication actually means, as well as acknowledging that co-publication is prone to double and over-counting. Katz and Martin (1997) estimated the latter error to be in the range from 5% to 10%. Other contributions emphasize the importance of scale effects on co-authorship (Ronda-Pupo and Katz, 2016). Methods of data analysis of co-publications range across whole counts, field-specific counts, citation analysis, and more complex statistical measures, the latter is naturally dependent on assembling a data set that is sufficiently large to permit statistical validity and reliability. While the problem of multiple authorship is acknowledged, the correction through fractional attribution is generally not applied. Finardi (2015) provides a comprehensive overview of the contemporary literature, where there is little to be gained by re-treading that journey. What is important, however, is to draw out those features of the reviewed literature having a bearing upon the arguments that will be presented further below. Kumar and Asheulova (2011) use Scopus data to show that Russian scientists exhibit a low propensity for international co-authorship; the country also displays the highest concentration in Physics and Astronomy among the then four BRICs. Yang et al. (2012) worked with the Web of Science, Science Citation Index-Expanded data in 1991, 2000, and 2009 and applied cluster analysis to determine the extent of disciplinary concentration of the then BRICs in comparison with the G-7 countries. Their findings echo Kumar and Asheulova (2011) except that both Russia and China are now shown to be highly heterogeneous across subject areas, with India mid-way between the more homogeneous Brazil and South Africa. 
Wagner and Wong (2012) studied the visibility of BRIC science, namely the extent to which the Science Citation Index-Expanded includes journals from these countries, arriving at the finding that they were in fact not under-represented: ‘High quality science from the BRICs appears to be represented at the same level as more advanced countries’ (idem: 1009). Their study covered the year 2011. It is noteworthy that over 2007–2009, Thomson Reuters (2015) set out to improve the coverage of journals from emerging economies as well as languages other than English, so that the above finding is perhaps expected. Yi et al. (2013) examined publications over the 10-year period of 2001–2011, using standard bibliometric tools, with the addition of the disciplinary specialization index (DSI) that measures the relative spread among scientific fields. They obtained relative citation counts as follows: Brazil 6.29, Russia 4.75, India 5.78, China 6.09, and South Africa 8.27, all of which fall below the then world average of 10.53. As to the DSI, the values are—Brazil 0.28, Russia 0.73, India 0.33, China 0.40, and South Africa 0.22, meaning that South Africa is the least specialized and Russia the most, a finding which is in line with Kumar and Asheulova (2011). The country research area h-index and the h-index of country pair research areas may also illuminate the discussion. It is useful to recognize that the h-index is scale dependent, being linked to an author’s total publication count (N). Yong (2014) suggests a power law relationship h ≈ 0.54 N1/2. More recently, Bornmann et al. (2015) investigated the incidence over a period of 1990–2010 of highly cited papers recorded by the Web of Science core collection as a proxy for S&T excellence. South Africa score is on par with China and Taiwan on the production of papers in the top 10% and 1% most cited, and stands ahead of BRICS peers Brazil, Russia, and India, and also ahead of Korea, Turkey, Mexico, and Poland. China shows the sharpest growth in the proportion of highly cited papers. They conclude that ‘an exceedingly robust global science system has emerged, one that is open to new entrants from the BRICS countries, based upon merit’ (idem: 1510). Finardi (2015) considered the period of 1980–2012 that allowed the application of three statistical measures, namely, the Salton and Jaccard indices, and the computation of the probability association index that takes geographic proximity of collaborators into account. For ease of computation, Finardi clustered research areas into four macro fields: natural science, medicine, engineering, and the social sciences and humanities. His conclusions are that collaboration among BRICS pairs is weak compared with their collaborations with other non-BRICS countries namely the USA, United Kingdom, Germany and France, which may be expected. In seeking drivers of collaboration, he finds that geographic distance does not appear to have a strong bearing, but there would appear to be some causality linked to the choice of shared problems in infectious disease, as in the case of HIV/AIDS for Brazil-South Africa collaboration. Additional contributions beyond Finardi’s coverage include Waltman et al. (2011) who study geography as a factor, showing that the mean distance between parties collaborating in science rose from 334 km in 1980 to 1553 km in 2009. In a subsequent paper, the same group of authors (Tijssen et al. 
2012: 1) show that ‘collaboration distances and growth rates differ significantly between countries and between fields of science’. Pouris and Ho (2014) used whole counts to examine the co-publishing behaviour of African countries, (South Africa included) finding very high rates of foreign collaboration above the global average, to the extent that ‘Single-author articles appear to be on the verge of extinction on the continent’ (Pouris and Ho 2014: 2183) with the risk of stifling individualism. While noting the very high rate of foreign co-authorship, they did not delve into the nature of the collaboration. Of late, interest to this article is the observation of very high activity indices in medical fields, but much lower values (<1.0) for Astronomy and Physics. In summary, one finds bibliometric techniques ranging from simple counts to advanced statistical analysis applied to data sets that in some cases span a 30-year period. Mainly, the results of the above contributions leave a sense of disappointment due to less explanation. Treating the four BRICs as stable if not at least as static entities over long periods of time is necessary if one has to assemble a large enough data set to apply statistical methods. Yet, by doing so one neglects the considerable changes that have occurred in the four BRICs and on the world stage over this period. Up to the early 1990s, the BRICs countries were relatively closed with limited mobility of staff. In addition, the Internet had not yet proliferated worldwide, with its fundamental changes in the mode of doing science and promoting what Wagner (2008) terms a ‘new invisible college of science’ (cf. Waltman et al. 2011). A caution suggests that a study of BRICS collaboration might be better restricted to a shorter time period, say from the mid-2000s to the present. Accordingly, this contribution therefore considers data over the period of 2009–2014. 3. Methodology and data Bibliometric analysis remains to be the standard tool for examining scientific output (De Bellis, 2009), with co-publication counts understood as the best proxy for scientific collaboration. Without exception the country address of each contributing author serves as the search keyword to seek co-publication occurrences across countries. For practical reasons, BRICS co-publication analysis is restricted to country pairs. Accordingly, the five BRICS countries generate 10 two-country occurrences. Analysis variously draws on the Web of Science or Scopus. This contribution will draw on the Web of Science Core Collection – Science Citation Index-Expanded, Social Science Citation Index and Arts and Humanities Citation Index since these provide deeper coverage of the natural sciences and engineering that are the focus of the two Declarations. Scopus is stronger in the fields such as health, humanities and social sciences. Various choices are present in the selection of fields to demarcate concentration. These choices form heart of bibliometrics since they rest upon the choices made in specifying the granularity of the fields of science, the scope of a given journal, and the allocation of a journal and published article in a specific field; The finer the granularity, the more the number of fields or areas. Moreover, the categorizations vary among databases, and over time (Bartol et al. 2014). 
Various choices arise in the selection of fields used to demarcate concentration. These choices lie at the heart of bibliometrics since they rest upon the granularity of the fields of science, the scope of a given journal, and the allocation of a journal and published article to a specific field. The finer the granularity, the greater the number of fields or areas. Moreover, the categorizations vary among databases and over time (Bartol et al. 2014). Thus the Essential Science Indicators span twenty-two research fields, to which 11,855 journals are uniquely allocated, while the narrowest categorization of the Web of Science is that of its 252 ‘subject categories’. This analysis employs the 151 Web of Science ‘research areas’, which suffice for comparative purposes.1 The search is restricted to the three standard databases, for all document types and all languages. Table 1 presents the BRICS national S&T thrusts, the thematic areas of the Cape Town/Brasilia Declarations, country research area concentration (%), and revealed comparative advantage (RCA). Research areas shown in italic overlap with BRICS thrusts. The period 2009–2014 is chosen as it fits with the formal existence of the BRICs grouping.

Table 1. Thematic areas, thrusts, top 10 research area concentrations (%), RCA (2009–14). Each research area is listed with its share of national output (%) followed by its RCA in parentheses.

Brazil. Thematic area: Climate Change and Disaster Mitigation. Thrusts: Bio & Nanotech; Energy; ICT; Health; Biodiversity & Amazon; Climate change; Space science; National security. Top 10 research areas: Agriculture 8.2 (4.7); Chemistry 7.0 (0.7); Physics 6.1 (0.9); Engineering 5.4 (0.7); Biochemistry & Molecular Biology 4.3 (1.0); Neurosciences & Neurology 4.1 (0.9); Pharmacology & Pharmacy 3.6 (1.2); Public, Environmental & Occupational Health 3.6 (1.9); Veterinary Sciences 3.4 (3.4); Material Science 3.2 (0.7).

Russia. Thematic area: Water Resources & Pollution Treatment. Thrusts: Energy; Nuclear; Strategic ICT; Health; Space science. Top 10 research areas: Physics 25.5 (3.7); Chemistry 16.9 (1.8); Mathematics 6.2 (2.1); Material Science 6.0 (1.2); Engineering 6.0 (0.8); Biochemistry & Molecular Biology 4.5 (1.0); Astronomy 4.3 (4.3); Optics 3.4 (2.5); Geology 3.2 (2.8); Instruments & Instrumentation 2.5 (3.6).

India. Thematic area: Geospatial Technologies/Applications. Thrusts: Agriculture; Health; Energy; Transport & Infrastructure; Environment; Inclusion; Space science. Top 10 research areas: Chemistry 18.6 (2.0); Physics 11.8 (1.7); Engineering 10.1 (1.3); Material Science 9.6 (2.0); S&T Other Topics 5.8 (1.5); Pharmacology & Pharmacy 5.7 (1.8); Biochemistry & Molecular Biology 4.6 (1.0); Agriculture 3.8 (2.2); Environmental Sciences & Ecology 3.4 (1.1); Biotechnology & Applied Microbiology 3.1 (1.8).

China. Thematic area: New & Renewable Energy/Energy Efficiency. Thrusts: Biotechnology; Food security; Energy sources/Materials; Clean vehicles; Climate change/Environment. Top 10 research areas: Chemistry 18.9 (2.0); Physics 12.9 (1.9); Engineering 12.9 (1.7); Material Science 11.8 (2.4); S&T Other Topics 6.3 (1.6); Mathematics 5.5 (1.9); Biochemistry & Molecular Biology 5.0 (1.1); Computer Science 3.7 (1.6); Environmental Sciences & Ecology 3.4 (1.1); Pharmacology & Pharmacy 3.3 (1.0).

South Africa. Thematic area: Astronomy. Thrusts: Biotechnology; Renewable energy; Climate change; Poverty alleviation; Space S&T. Top 10 research areas: Chemistry 5.9 (0.6); Environmental Sciences & Ecology 5.8 (2.0); Engineering 5.0 (0.7); Physics 4.6 (0.7); Infectious Diseases 4.6 (4.9); Plant Sciences 4.5 (3.5); S&T Other Topics 4.0 (1.1); General/Internal Medicine 3.7 (1.2); Public, Environmental & Occupational Health 3.5 (1.9); Psychology 3.5 (1.4).

Table 2 provides a narrower view, using 2014 data for publication counts, concentration in the top five research areas plus Astronomy, RCA, and research area h-index. Where a research area contains more than 10,000 publications, the h-index is obtained by listing all of the country’s publications in that research area ranked by number of citations, from highest to lowest; the country h-index is then read off by inspection of this ranked list.
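Two derived measures recur in Tables 1 and 2: the revealed comparative advantage and the research-area h-index. The article does not spell out a formula for RCA, so the sketch below assumes the standard activity-index form (a country’s share of its own output in a research area divided by the world’s share of output in that area); the h-index routine simply mimics the inspection of a citation-ranked list described above.

```python
def rca(country_area_count, country_total, world_area_count, world_total):
    """Revealed comparative advantage in the activity-index sense (assumed form):
    the country's share of its own output in a research area divided by the
    world's share of output in that area. Values above 1 indicate specialization."""
    country_share = country_area_count / country_total
    world_share = world_area_count / world_total
    return country_share / world_share

def h_index(citation_counts):
    """h-index by inspection of a citation-ranked list: the largest h such that
    the h-th most cited publication has at least h citations."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical counts only, not the study's underlying data:
print(round(rca(580, 14_326, 20_000, 1_800_000), 2))  # ~3.64
print(h_index([120, 87, 44, 44, 10, 9, 3, 1]))        # 6
```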
Table 2. Whole count, concentration, RCA, and h-index (2014). Each research area is listed with its share of national output (%), RCA, and h-index.

Brazil (N = 48,498): Agriculture 8% (RCA 4.76, h = 22); Chemistry 8% (RCA 0.77, h = 38); Physics 6% (RCA 0.92, h = 52); Engineering 6% (RCA 0.71, h = 30); Biochemistry 4% (RCA 1.05, h = 30); Astronomy 2% (RCA 1.76, h = 46).

Russia (N = 34,680): Physics 24% (RCA 3.60, h = 59); Chemistry 17% (RCA 1.70, h = 44); Material Science 7% (RCA 1.25, h = 36); Engineering 6% (RCA 0.81, h = 28); Mathematics 6% (RCA 2.07, h = 18); Astronomy 4% (RCA 4.46, h = 51).

India (N = 67,517): Chemistry 19% (RCA 1.98, h = 68); Physics 12% (RCA 1.75, h = 54); Material Science 10% (RCA 1.83, h = 51); Engineering 10% (RCA 1.31, h = 47); S&T Other Topics 7% (RCA 1.32, h = 54); Astronomy 2% (RCA 1.72, h = 51).

China (N = 275,205): Chemistry 18% (RCA 1.86, h = 169); Engineering 14% (RCA 1.74, h = 102); Material Science 12% (RCA 2.25, h = 138); Physics 11% (RCA 1.68, h = 126); S&T Other Topics 8% (RCA 1.63, h = 151); Astronomy 1% (RCA 0.78, h = 50).

South Africa (N = 14,326): Infectious Diseases 6% (RCA 6.46, h = 32); Chemistry 6% (RCA 0.60, h = 30); Environment 6% (RCA 1.84, h = 31); Engineering 5% (RCA 0.64, h = 27); Physics 5% (RCA 0.75, h = 38); Astronomy 3% (RCA 3.60, h = 54).

The third data set comprises country co-publication rates (Table 3). The co-publication subject count (CSC) is obtained from the search: author address = ‘country X’ AND author address = ‘country Y’ AND publication period = ‘Z’. The two leading research areas for each country pair are shown as percentages of that pair’s co-publications, with the corresponding h-index given for these leading subject areas.
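Once the records for a country pair have been retrieved, the percentages reported in Table 3 below are simple shares of the pair’s co-publications falling in each research area. A minimal sketch follows, assuming each record carries a list of Web of Science research-area tags; the field name and the sample data are illustrative only.

```python
from collections import Counter

def leading_areas(records, top_n=2):
    """Share (%) of a country pair's co-publications in each research area,
    returning the top_n areas. `records` is a list of dicts with an assumed
    'research_areas' field holding Web of Science research-area tags."""
    counts = Counter()
    for rec in records:
        for area in rec["research_areas"]:
            counts[area] += 1
    total = len(records)
    return [(area, round(100 * n / total, 1)) for area, n in counts.most_common(top_n)]

# Hypothetical country-pair sample, not the study's data:
sample = (
    [{"research_areas": ["Physics"]}] * 30
    + [{"research_areas": ["Astronomy & Astrophysics", "Physics"]}] * 12
    + [{"research_areas": ["Chemistry"]}] * 8
)
print(leading_areas(sample))  # [('Physics', 84.0), ('Astronomy & Astrophysics', 24.0)]
```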
Table 3. Co-publication count, concentration, and country-pair h-index (2014). Each country pair is listed with its co-publication count N and the shares of the two leading research areas (with the h-index where available).

Brazil–Russia: N = 609; Physics 65% (h = 48); Astronomy 23% (h = 34).
Brazil–India: N = 625; Physics 43% (h = 45); Astronomy 16% (h = 31).
Brazil–China: N = 833; Physics 43% (h = 46); Astronomy 15% (h = 35).
Brazil–South Africa: N = 349; Physics 32% (h = 32); Astronomy 13% (h = 22).
Russia–India: N = 576; Physics 57% (h = 47); Astronomy 31% (h = 48).
Russia–China: N = 1,127; Physics 47% (h = 50); Astronomy 16% (h = 36).
Russia–South Africa: N = 295; Physics 41% (h = 34); Astronomy 34% (h = 44).
India–China: N = 1,183; Physics 31.0%; Astronomy 12.8%.
India–South Africa: Physics 19.4%; Astronomy 11.1%.
China–South Africa: Physics 32.0%; Astronomy 13.9%.

All data were recorded in the first week of June 2017.

4. Data analysis

The data are now analysed table by table. Kahn (2015) previously observed the lack of correspondence between the country national S&T thrusts and the italicized BRICS thematic areas. On the one hand, Brazil has a national thrust in climate change and is allocated the same BRICS thematic area, and China exhibits correspondence between its national thrust of ‘New energy sources & materials, Clean energy vehicles’ and the BRICS thematic area of ‘New and renewable energy; and energy efficiency’. On the other hand, for India one might have to stretch the correspondence between the national thrust ‘Space Science and Technology’ and the BRICS thematic area of ‘Geospatial technology and applications’. While important in and of itself, Russia’s thematic area ‘Water resources’ shows no obvious correspondence with her national thrusts. In the case of South Africa, one might very tentatively join the dots between the national thrust ‘Space Science and Technology’ and her BRICS thematic area of Astronomy.

That prior analysis is now sharpened by extending the list of research areas and including the associated revealed comparative advantage (RCA) in Table 1. The value of considering data over 2009–14 is that this period immediately precedes the Cape Town Declaration, so that the country emphasis recorded here might have informed the choice of thematic areas. Even so, examination of the 10 leading country research area concentrations and RCAs sheds little further light on the basis by which the thematic areas were identified and allocated. Granted, with the exception of Astronomy, the thematic areas are both multi-disciplinary in nature and strongly applied in orientation. The thematic area of Astronomy corresponds to a unique research area, whereas the other four do not. As far as concentration goes, Astronomy is less studied except in the case of Russia. On the basis of high RCA, Brazilian agriculture and South Africa’s expertise in infectious diseases would be leading candidates, along with Russian Astronomy, yet these areas have not been prioritized for immediate action. Russia, India, and China show high concentration in Chemistry, Physics, and Material Science, while for Brazil and South Africa concentration is more evenly spread.
Table 2 presents 2014 data for the top five research areas in terms of concentration, RCA and country-level h-index. Except for South Africa, for which Infectious Diseases emerges as an important research area, the restricted study period does not yield anything new. What is striking, however, is the h-index information, which shows comparably high values for Physics and Astronomy. Noting Yong (2014), China, the BRICS country with the largest volume of publications, has much higher h-index scores for its top five research areas; its publication output is greater than that of the other four countries combined. Regarding Astronomy, the RCAs vary from Russia’s 4.46 and South Africa’s 3.60 down to China’s 0.78, which is below the world average. By the measure of RCA, allocating Astronomy to South Africa ‘makes sense’.

In Table 3, one notes the collaborative research areas of highest concentration, namely Physics and Astronomy. For Brazil, Physics accounts for 6% of publications, but in Brazil’s co-publication with the other four countries the Physics concentration leaps to 65, 43, 43, and 32%, respectively. Among the BRICS, the median level for co-authored papers in Physics is 32%. Similarly, Astronomy accounts for 2% of Brazil’s publications, but in co-publication with the other four countries the concentration leaps to 23, 16, 15, and 13%, respectively; for Astronomy the median level is 16% of all co-publications. For Russia, which already exhibits a high domestic concentration in Physics of 24%, co-authorship in Physics with the other four BRICS stands at 65, 57, 47 and 41%, respectively. Again, international co-authorship stands out above domestic activity.

What then explains this observation? One may offer the conjecture that the concentration arises through the participation of BRICS scientists in the mega-science projects of contemporary Physics and Astronomy, such as high-energy Physics research at CERN, Geneva, and the data analysis associated with the Planck satellite observatory through the Planck 2013 Collaboration. This conjecture may be tested by ranking the publications in Physics and Astronomy by citation frequency. In Physics, for example, the publications cited at least five times since 2014 are overwhelmingly attributed to mega-science projects such as ATLAS, CMS, STAR, ALICE, and LHCb, while those in Astronomy include the Planck 2013 data analysis in addition to the high-energy Physics projects already mentioned. This suggests that the bulk of BRICS collaboration in Physics and Astronomy takes place via the medium of international mega-science projects. The same pattern holds for co-authorship data obtained from Elsevier’s Scopus. In essence, the multi-author publications have high citation rates that pull up the h-index. The conjecture that collaboration in Physics and Astronomy/Astrophysics is dominated by mega-science projects is thus essentially supported.
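The screen just described can be sketched as follows: rank the Physics and Astronomy records by citations, keep those cited at least five times, and flag records whose title or collaboration field mentions one of the mega-science projects. The record field names and the keyword list are assumptions for illustration, not the article’s actual procedure.

```python
MEGA_SCIENCE = ("ATLAS", "CMS", "STAR", "ALICE", "LHCb", "Planck")

def megascience_share(records, min_citations=5):
    """Among records cited at least `min_citations` times, return the fraction
    whose title or group-author field mentions a mega-science collaboration.
    Record fields ('title', 'group_author', 'citations') are assumed names."""
    cited = [r for r in records if r.get("citations", 0) >= min_citations]
    cited.sort(key=lambda r: r["citations"], reverse=True)  # citation-frequency ranking
    flagged = [
        r for r in cited
        if any(k in (r.get("title", "") + " " + r.get("group_author", "")) for k in MEGA_SCIENCE)
    ]
    return len(flagged) / len(cited) if cited else 0.0

# Hypothetical records, not the study's data:
recs = [
    {"title": "Measurement ... with the ATLAS detector", "group_author": "ATLAS Collaboration", "citations": 120},
    {"title": "Planck 2013 results. XVI.", "group_author": "Planck Collaboration", "citations": 300},
    {"title": "A table-top spin experiment", "group_author": "", "citations": 7},
    {"title": "A niche result", "group_author": "", "citations": 2},
]
print(megascience_share(recs))  # 2 of the 3 records cited at least 5 times -> 0.666...
```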
5. Discussion and implications for policy

The BRICS countries have laid out a declaration for collaboration with five thematic areas highlighted for attention. It turns out, however, that ‘collaboration’ is currently dominated by the mechanism of mega-science projects and has not been generated through the desire of the BRICS countries to collaborate in S&T. With a single exception, the dominant research areas of collaboration are different to the individual country specializations. The difference arises from the participation of scientists in mega-science projects, where the rules of participation and the scale of authorship create distortions in the visibility of these projects in the indexed literature. The disjuncture between such ‘collaboration’ and the intent of the Declarations raises questions of import to science policy, for the BRICS in particular and for the measurement of scientific collaboration more generally.

How shall one understand the influence of participation in mega-science projects on co-publication, let alone cooperation? Hand (2010) and Ebrahim et al. (2013) point to the (desirable) increase in publication counts resulting from international collaboration, while Birnholtz (2006, 2008) gives specific attention to the problem of attribution in the large projects at CERN, such as the ATLAS detector, which involves technical, theoretical and experimental work. Such issues appear in all large research projects, including astronomy and the health sciences. Hogg et al. (2014) point to the emergence of well-crafted protocols that guide article writers in their task, and Dance (2012) further discusses the issue of who should be the lead author. None of the above contributions, however, considers the distortion of publication counts arising from the massification of research effort, as in the 900 student authors2 of a paper on the genome of the fruit fly. For an in-depth analysis of the problem one may turn to King (2012).

Participation in ATLAS, STAR, CMS and the like requires competence in particle Physics, with participation coming at low cost and being available remotely. The same holds true for working on Planck satellite data or the Sloan Digital Sky Survey. One can work on ATLAS anywhere, anytime, provided that one can get online. The typical ATLAS publication thereby involves hundreds of ‘authors’, most of whom do not know one another, will never meet, and will never correspond. ATLAS authorship policy is carefully specified.3 Indeed, it is a condition of working with ATLAS data that all participants are named, so that massive co-authorship is designed into the governance protocol.4 Moreover, the official style guide requires that the word ‘ATLAS’ be included in the article title, hence the ease of search (Eisenhandler 2013).

The question this raises is what significance should be attached to co-publication as a proxy for cooperation. Can one really talk of BRICS collaboration in Physics and Astronomy, or are the co-authorship counts a bibliometric artefact? ‘Collaboration’ among the BRICS is an unintended outcome of international mega-science projects that predate the establishment of the grouping and that do not directly relate to the agenda of the Cape Town/Brasilia Declarations. The ‘high’ levels of intra-BRICS collaboration in Physics and Astronomy are in part driven by the protocols pertaining to working with mega-science data. As a case in point, for generations South African science policy has laid claim to the ‘southern geographic advantage’ as a motivation for funding research in astronomy and palaeontology. Yet the actual practice of Astronomy is overwhelmingly international and quite independent of locality, the more so when a satellite observatory is used; palaeontology, by contrast, is geographically specific. A next step would be to examine the effect of other mega-science projects on co-publication: large terrestrial telescopes, projects such as the Global Burden of Disease, multi-site clinical trials, and genomic studies immediately come to mind.
How then should bibliometric analysis be adjusted to account for such distortions? Should large-scale collaborations be placed in a separate bibliometric category, lest their inclusion distort and exaggerate actual peer-to-peer interaction? Should it be mandatory to introduce fractional counts when an article involves more than a specified minimum number of contributors? Would some type of scale factor serve to normalize the effect of mega-science? (One possible form of such adjustments is sketched at the end of this section.) Given the huge dominance of ATLAS, CMS, STAR, ALICE and other large-collaboration research in the revealed ‘collaboration’ among the five BRICS, it might be more realistic to remove such counts, and those associated with the Planck telescope, entirely. This radical change would of course depress the reported level of co-publication among the BRICS. For the present, it is cautioned that care must be taken in interpreting co-publication counts lest one make claims for collaboration that are at best tenuous and at worst misleading.

This is not to suggest that the BRICS countries should limit their involvement in mega-science projects, let alone restrict the development of such projects in their own countries. One thinks, for example, of large-scale nuclear research at Dubna in Russia, projects such as the new optical telescope in China, and the incipient Square Kilometre Array in South Africa. What these projects have in common is the generation of Big Data. The capacity of innovation systems large and small, advanced or emergent, in the BRICS or elsewhere, to clean, process and analyse Big Data is now a core attribute of their success going forward. The development and sharing of the appropriate tools arise naturally through international collaboration, even where the collaborators operate remotely, attaining ‘satisfaction at a distance’.

These are all issues for future research. The evidence is that present BRICS S&T collaboration is dominated by Physics and Astronomy, and that this collaboration takes place through mega-science. Highly cited BRICS collaboration in Physics does not entail small lab-based groups tackling problems of immediate value to national thrusts or the BRICS themes. This lack of alignment presents serious challenges for BRICS S&T policy specifically, and for international S&T collaboration in general.
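As flagged above, the fractional-count and scale-factor ideas can be made concrete. The sketch below is one possible scheme under stated assumptions, not a recommendation drawn from the article: each paper contributes 1/n to a country’s tally (fractional counting by author country), or is down-weighted only once its author count exceeds a chosen threshold.

```python
def fractional_count(author_countries, country):
    """Fractional counting: a paper with n author-country slots contributes
    count(country)/n to that country's tally (one of several possible rules)."""
    n = len(author_countries)
    return author_countries.count(country) / n if n else 0.0

def thresholded_weight(n_authors, threshold=100):
    """A possible scale factor: full weight up to `threshold` authors, then the
    weight decays as threshold/n so that mega-science papers count for less.
    The threshold value is purely illustrative."""
    return 1.0 if n_authors <= threshold else threshold / n_authors

# Hypothetical paper with authors from Brazil, Russia and two Swiss institutes:
countries = ["Brazil", "Russia", "Switzerland", "Switzerland"]
print(fractional_count(countries, "Brazil"))  # 0.25
print(thresholded_weight(12))                 # 1.0   (a small lab-based team)
print(thresholded_weight(2900))               # ~0.034 (an ATLAS-scale author list)
```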
Acknowledgements

This work has taken shape over two years during which the author benefitted from conference attendance supported by his host institution. The critical assessment of an external reviewer is gratefully acknowledged.

References

Bartol, T., Budimir, G., Dekleva-Smrekar, D. et al. (2014) ‘Assessment of Research Fields in Scopus and Web of Science in the View of National Research Evaluation in Slovenia’. Scientometrics, 98/2: 1491–504.
Birnholtz, J. (2006) ‘What Does it Mean to be an Author? The Intersection of Credit, Contribution, and Collaboration in Science’. Journal of the American Society for Information Science and Technology, 57/13: 1758–70.
Birnholtz, J. (2008) ‘When Authorship Isn’t Enough: Lessons from CERN on the Implications of Formal and Informal Credit Attribution Mechanisms in Collaborative Research’. The Journal of Electronic Publishing, 11/1. DOI: 10.3998/3336451.0011.105.
Bornmann, L., Wagner, C., Leydesdorff, L. (2015) ‘BRICS Countries and Scientific Excellence: A Bibliometric Analysis of Most Frequently-cited Papers’. Journal of the Association for Information Science and Technology, 66/7: 1507–13.
BRICS (2014) BRICS Science, Technology and Innovation Cooperation: A Strategic Partnership for Equitable Growth and Sustainable Development. <http://www.brics.utoronto.ca/docs/140210-BRICS-STI.pdf> accessed 2 September 2015.
Dance, A. (2012) ‘Authorship: Who’s on First?’. Nature, 489: 591–3.
De Bellis, N. (2009) Bibliometrics and Citation Analysis. Lanham, MD: Scarecrow Press.
DST (2008) Ten-Year Plan for Innovation. Pretoria: Department of Science and Technology.
Ebrahim, N. A., Salehi, H., Embi, M. A. et al. (2013) ‘Effective Strategies for Increasing Citation Frequency’. International Education Studies, 6/11: 93–9.
Eisenhandler, E. (2013) ATLAS Style Guide Version 2.4. London: Queen Mary College.
Finardi, U. (2015) ‘Scientific Collaboration between BRICS Countries’. Scientometrics, 102: 1139–66.
Garfield, E. (1979) ‘Is Citation Analysis a Legitimate Evaluation Tool?’. Scientometrics, 1/4: 359–75.
Hand, E. (2010) ‘Big Science Spurs Collaborative Trend’. Nature, 463: 282. DOI: 10.1038/463282a.
Hogg, W., Donskov, M., Russell, G. et al. (2014) ‘Approach to Publishing for Large Health Services Research Projects’. Canadian Family Physician, 60/9: 854–5.
Kahn, M. J. (2015) ‘Cooperation in Science, Technology and Innovation: Rhetoric and Realities’. Contexto Int. [Online], 37/1: 185–213.
Katz, J. S., Martin, B. R. (1997) ‘What is Research Collaboration?’. Research Policy, 26: 1–18.
King, C. (2012) ‘Multiauthor Papers: Onward and Upward’. ScienceWatch Newsletter, 1–4. <http://archive.sciencewatch.com/newsletter/2012/201207/multiauthor_papers/>
Kumar, N., Asheulova, N. (2011) ‘Comparative Analysis of Scientific Output of BRIC Countries’. Annals of Library and Information Studies, 58: 228–36.
Meissner, D., Gokhberg, L., Sokolov, A. (eds) (2013) Science, Technology and Innovation Policy for the Future. Berlin: Springer.
MST (2007) Science, Technology and Innovation for National Development Action Plan 2007–2010. Brasilia: Ministry of Science and Technology.
NSB (2016) Science and Engineering Indicators 2016. Washington, DC: National Science Board.
Plume, A., van Weijen, D. (2014) ‘Publish or Perish? The Rise of the Fractional Author…’. Research Trends, 38. <https://www.researchtrends.com/issue-38-september-2014/publish-or-perish-the-rise-of-the-fractional-author/>
Pouris, A., Ho, Y.-S. (2014) ‘Research Emphasis and Collaboration in Africa’. Scientometrics, 98: 2169–84.
Ronda-Pupo, G. A., Sylvan Katz, J. (2016) ‘The Scaling Relationship between Citation-based Performance and International Collaboration of Cuban Articles in Natural Sciences’. Scientometrics, 107/3: 1423–34.
Thomson Reuters (2015) <http://wokinfo.com/essays/globalization-of-web-of-science/>
Tijssen, R., Waltman, L., Van Eck, N. (2012) ‘Research Collaboration and the Expanding Science Grid: Measuring Globalization Processes Worldwide’, pp. 1–12. <http://arxiv.org/abs/1203.4194>
UNESCO (2010) World Science Report 2010. Paris: UNESCO.
Wagner, C. (2008) The New Invisible College: Science for Development. Washington, DC: Brookings Institution Press.
Wagner, C., Wong, S. K. (2012) ‘Unseen Science? Representation of BRICs in Global Science’. Scientometrics, 90: 1001–13.
Waltman, L., Tijssen, R., Van Eck, N. (2011) ‘Globalisation of Science in Kilometres’. Journal of Informetrics, 5/4: 574–82.
Yang, L. Y., Yue, T., Ding, J. L., Han, T. (2012) ‘A Comparison of Disciplinary Structure in Science Between the G7 and the BRIC Countries by Bibliometric Methods’. Scientometrics, 93: 497–516.
Yi, Y., Qi, W., Wu, D. (2013) ‘Are CIVETS the Next BRICs? A Comparative Analysis from Scientometrics Perspective’. Scientometrics, 94: 615–28.
Yong, A. (2014) ‘Critique of Hirsch’s Citation Index: A Combinatorial Fermi Problem’. Notices of the American Mathematical Society, 61/9: 1040–50.

Footnotes

1. <http://ipscience-help.thomsonreuters.com/inCites2Live/filterValuesGroup/researchAreaSchema.html>
2. <http://www.sciencedaily.com/releases/2015/05/150511095353.htm>
3. <https://twiki.cern.ch/twiki/bin/view/Main/ATLASAuthorshipPolicy>
4. The author is grateful for these insights to an anonymous physicist with work experience at CERN.

© The Author 2017. Published by Oxford University Press. All rights reserved.
