Get 20M+ Full-Text Papers For Less Than $1.50/day. Start a 14-Day Trial for You or Your Team.

Learn More →

Life scientists’ views and perspectives on the regulation of dual-use research of concern

Life scientists’ views and perspectives on the regulation of dual-use research of concern Abstract In the early 2000s, several publications initiated a debate about the potential misuse of academic life science research. The debate was refueled in 2012, when two studies describing the engineering of human transmissive H5N1 bird flu virus were published. To facilitate the debate on dual-use research of concern (DURC) and regulatory measures, we interviewed life science researchers working in Switzerland about their views on DURC. The results indicate that all scientists interviewed were aware of the debate, however, few had reflected about dual-use aspects with regard to their own work. Although all respondents believed in freedom of research, a majority was supportive of some form of regulation of DURC. This article discusses the major implications of the study, especially regarding the implementation of regulatory measures. In addition, preliminary recommendations are given for raising awareness on DURC in the life sciences among researchers. 1. Introduction In 2012, the publication of two research articles on the creation of airborne human transmissible bird flu viruses (Herfst et al. 2012; Imai et al. 2012) has revived an intense debate on the misuse potential of life science research (Nature 2015). During a year-long, self-imposed moratorium, various actors including the scientific community, ethicists, security experts, government officials, the public, and international organizations discussed whether gain-of-function (GoF) research on highly pathogenic avian influenza should be conducted and published (Casadevall and Shenk 2012). While the moratorium has been lifted, the debate is ongoing. Its focus, meanwhile, expanded from concerns about the potential misuse of research results by malevolent actors (biosecurity issues) to concerns about the risks of an accidental release of engineered pathogens (biosafety issues) (Imperiale and Casadevall 2015). As a result of these events, the US government has implemented a funding pause on certain types of GoF research in order to assess potential risks and benefits of such research (The White House Office of Science and Technology Policy 2014). The ban covers GoF studies that enhance the pathogenicity or transmissibility among mammals by respiratory droplets of influenza, Middle East Respiratory Syndrome (MERS) virus or severe acute respiratory syndrome (SARS) virus. While supported by some scientists (Lipsitch and Galvani 2014), the ban has provoked an outcry by others who fear scientific setbacks and potential negative consequences for public health and pandemic preparedness (Reardon 2014) . While the pause is still in effect and how to best address the threat is still a matter of debate, new GoF studies are being published on a regular basis (Sutton et al. 2014; Wei et al. 2014; Kong et al. 2016). In 2013, the US Department of Health and Human Services (HHS) established a framework on dual-use research of concern (DURC),1 which requires a case-by-case review of Highly Pathogenic Asian Avian Influenza A virus (HPAI) Hemagglutinin 5 Neuraminidase 1 (H5N1) GoF proposals submitted to US funding agencies. Other countries have so far left the regulation of the dual-use problem entirely in the hands of the scientific community. Scientific societies, funding organizations, journals and research institutions worldwide have released statements, established procedures, and published code of conducts (or ethics) in order to address DURC and GoF, including the European Academies Science Advisory Council (EASAC) (Fears and Ter Meulen 2015). Despite commendable initiatives such as the dual-use guidelines for EU funding, the absence of a general framework at EU level created practical problems: the Dutch government applied export control law for dual-use items (Council Regulation 428/2009) in 2012 in order to control the release of the H5N1 paper submitted for publication by the Dutch researcher Ron Fouchier (Enserink 2013). The law was originally designed to prevent the proliferation of nuclear, biological, and chemical weapons, thus banning the shipping of goods, pathogens, and technologies that could be used for nefarious purposes. It was the first time it was used to control the publication of a scientific paper. The use of export controls for this purpose has been rejected by experts in nonproliferation and by scientific experts alike as unfeasible (Enserink 2013; Palu 2014). In order to tackle the dual-use problem, researchers need to articulate and debate their views about what risks are acceptable and what research is truly needed. However, as European Society for Virology (ESV) and American Society for Microbiology (ASM) representatives recently pledged, there is also a need for increased stakeholder involvement beyond the scientific community, especially among ethicists, biosecurity experts, and public health representatives (Palu 2014; Imperiale and Casadevall 2015). There is a wide agreement that those scientists associated with controversial research cannot and should not be left alone with the decision whether to publish or conduct such research (Fears and Ter Meulen 2015; Imperiale and Casadevall 2015; Frank et al. 2016). This is especially important since previous studies with life science researchers—conducted before the publication of the H5N1 studies—have shown that most participating researchers were not even aware of the current debates and concerns about dual-use research (Dando and Rappert 2005; Kelle 2007; National Research Council 2009). Moreover, a qualitative study predominantly conducted at UK institutions revealed that most researchers did not regard bioterrorism or bioweapons as a substantial threat and did not believe in the contribution of life science developments to the problem of biological weapons and bioterrorism (Dando and Rappert 2005). A quantitative study led by the US National Research Council (National Research Council 2009) showed that most researchers did not regard their own research as dual-use even if it was categorized as ‘experiments of concern’ as described in the so called ‘Fink report’2. When asked about governance options, a majority of life scientists did not support regulations to control DURC (Dando and Rappert 2005; National Research Council 2009). However, a study investigating the impact of dual-use controls amongst UK life scientists working with agents listed in Schedule 5 of the 2001 Anti-Terrorism, Crime, and Security Act found that a majority of interviewees viewed regulation of DURC as worth considering, particularly through increased screening of personnel working with dangerous pathogens (McLeish and Nightingale 2005). The same study suggests that, at least in the UK, part of the scientific community was prepared to expend considerable effort on improving risk management with regards to DURC. In fact, in order to effectively implement regulatory measures, it is essential that researchers are aware of DURC issues and supportive of regulations. Thus, to get an insight into life scientists’ awareness, views, and perspectives on DURC in the aftermath of the H5N1 cases, we have interviewed international life scientists working within the fields of microbiology and virology/cell biology. In contrast to previous studies, all interviewees of our study were aware of the DURC debate. This article presents a summary of the main points raised and discussed during the interviews, including the types of regulation supported by the interviewees, the interviewees’ thoughts about the likelihood of misuse and feasibility of replicating sensitive research results, as well as their opinions about the publication and communication of sensitive research. Subsequently, we discuss the interviewees’ personal judgment about the extent to which individual scientists can be made accountable for actual misuse of research. Finally, we connect our results to the wider debate on DURC, discuss the major implications of our research results, and provide some preliminary recommendations for raising awareness on DURC in the life sciences through open debates, specific educational programs, and weighted regulatory interventions. 2. Methods We conducted and analyzed nineteen qualitative interviews with life scientists working in Switzerland. All scientists interviewed were actively working within the fields of microbiology or virology/cell biology. Purposive sampling was adopted in order to obtain a diverse selection of scientists from a selection of academic institutions (universities and other research institutions with microbiology/virology research groups) with diverse gender, age, nationality, and professional experience characteristics (PhD students, postdoctoral researchers, group leaders, and full professors). A total of twenty researchers were purposively selected from the homepages of the research institutions according to their research profile. Participants were contacted via e-mail outlining the research. The email contained the following information: (1) title of the study: ‘Attitudes, views and opinions of Swiss researchers in the life sciences working with pathogens towards biosafety and biosecurity issues’, (2) the foundation which financed the study, (3) the approximate length of the interviews, (4) anonymization of the interview content, as well as (5) the invitation to participate. During approximately one-hour long face-to-face interviews conducted at their institutions or at neutral meeting places, participants answered questions about biosecurity and biosafety issues associated with dual-use research in the life sciences. Besides the interviewers and the interviewee nobody else was present during the interview. The interviews were recorded in the time period between October 2012 and February 2013 using the Audacity software.3 In addition to sets of questions about their knowledge and attitudes about dual-use research, interviewees were given the details of two case studies: the H5N1 engineered bird flu virus and the mouse pox virus ( Jackson et al. 2001; Herfst et al. 2012; Imai et al. 2012), and invited to share their opinions about such case studies. Additional questions explored the issue of regulation, in particular, the problem of whether and in what form dual-use research should be regulated. The interview guideline was pilot-tested and was adapted during the first interviews. The interviews were conducted by the first author (S. E-G.). Interviews were transcribed verbatim in the original language spoken during the interviews (English, German, and Swiss German dialects) and were analyzed with the support of the analysis program Atlas.ti, Version 7.0. Participants were given the opportunity to review their interview transcripts. However, no participant made use of this option. A repetition of one or more interviews was not necessary. To improve comprehensibility, the language and grammar of the quotes used in the article were slightly adapted while preserving the original message. 3. Results Our analysis of the interviews with researchers in Switzerland identified four main recurrent themes related to the following questions: (1) Whether and how dual-use research may be regulated, (2) What is the probability and actual feasibility of misusing research results, (3) How research results should be communicated (to peers and public), (4) Whether researchers act responsibly or can be taught to do so. Conceptually, question 1 is linked to the issues of biosecurity and regulation; question 2 to the issues of misuse and replicability; question 3 is to the issues of publication, communication, and public opinion. Finally, question 4 regards the notion of responsibility and the role of ethics. Each of these core themes was further analyzed in detail. Although our interview study is limited in regards to sample size and generalizability, we believe that it provides valuable insights into the perspectives and views of the life science community. 3.1 Biosecurity At the conceptual level, the notion of biosecurity was confused with or conflated upon the notion of biosafety on more than one occasion. However, when the notion of biosecurity was explained or became intelligible to the interviewee, the relevance and significance of the debate over biosecurity and dual-use was immediately recognized by most researchers P1: ‘We have to discuss biosecurity; that is quite clear.’ While there was wide agreement about the necessity to discuss biosecurity issues, almost all researchers did not consider biosecurity4 aspects regarding their own work. The main reasons given by the interviewees for not having considered biosecurity issues were the low pathogenicity/virulence of the pathogen used in their research, its host spectrum (non-human pathogen), and the kind of research conducted (pathogen–host interactions, as well as characterization of genetic factors instead of targeted genetic engineering of the pathogen). One interviewee pointed out that the discussion was centered on the wrong issue. According to this person, biosafety issues are much more important and need to be discussed primarily when manipulating or creating human pathogens. This interviewee claimed that the threat of misuse is infinitely smaller compared to the potential of an accidental release of such a pathogen. 3.2 Regulation Regarding the issue of regulating research with possible dual-use potential, some interviewees questioned whether regulation per se can be considered a viable way of preventing risks. The practical impossibility of implementing and enforcing fair and functional regulations were given as main reason. For example, it was argued that it would be difficult to identify risky research and that such attempts could possibly result in an unfair blocking of someone’s research. Another issue raised by interviewees was the lack of means for sanctioning and thus stopping unwanted research. P10: ‘Are universities prepared to sack a professor as a consequence? What other options for sanctioning are there? I can cut their funding, however, that equals sacking. Do I want to lose the expert that may receive the Nobel Prize one day? … The idea [of regulation] is good; I just have doubts about its implementation ….’ Another common argument was that, in order to regulate efficiently, it would be necessary to implement similar regulations worldwide, which was considered difficult or even impossible. P10: ‘Science is dynamic, I mean, if one implements a strong regulation in one country prohibiting certain things, then one cannot prevent such research in other countries, which did not implement such regulation. I imagine it to be difficult to introduce a world-wide ban on certain research.’ Nevertheless, few interviewees argued in favor of no-regulation scenarios, for example, with risk-management being entirely in the hands of the individual researcher. The majority of interviewees recognized the value of regulation (see Figure 1). Their opinions, however, diverged with regard to the regulatory framework to adopt and the regulatory bodies entitled to enforce such regulation. Figure 1. View largeDownload slide Options for the regulation of DURC. The majority of interviewees (17 out of 19) indicated to be generally in favor of regulation. Of those more than half provided details on the preferred form of regulation. Figure 1. View largeDownload slide Options for the regulation of DURC. The majority of interviewees (17 out of 19) indicated to be generally in favor of regulation. Of those more than half provided details on the preferred form of regulation. The following regulatory frameworks were proposed: (1) case-by-case or (2) through general rules, implemented at different stages of the project, for example, at the (1) funding- and/or (2) publication-stage. P12: ‘I think that [regulation] also has to be put in place at the beginning of a research project. Because a grant proposal could turn into nothing, could be a complete failure, or could blow up into something like this. And I think, yes, that’s why it is the responsibility of the funding bodies. And I think, the journals …, I think it's good that they are there.’ While some interviewees argued in favor of a case-by-case framework with no general categorizations to allow for greater flexibility in research, others were in favor of general frameworks with e.g. general rules, codes of conduct, and definitions of sensitive research. Responses diverged with regard to the regulatory bodies entitled to enforce such regulations. Most researchers preferred self-regulation through institutions of the scientific community. The following options were advanced and discussed (1) Independent advisory body, (2) National funding agency, (3) Institute/University, (4) Government, and (5) Journals. One researcher, who argued for control on the level of publication, said P11: ‘There has to be some kind of agreement between publishers, there has to be some kind of general rule which applies to everybody. For example, … there has to be a clear definition of what is classified, … and there has to be some independent body that receives this kind of classified (information) from every journal. And they should decide (about who can access classified information). There has to be some independent body, independent of the journals, there has to be one superior level.’ Most importantly, regulation was seen in possible conflict with the principle of freedom of research. The activity of regulatory bodies, as recurrently addressed, should not limit or temper scientific freedom and scientific progress, but simply prevent significant risks of misuse. Science, many interviewees argued, cannot be feasibly stopped and should not be stopped, since the individual and collective freedom of scientists is a crucial determinant of scientific innovation and progress: P11: ‘The scientific system should control sensitive information better, but not at the cost of the benefits. It shouldn't stop research. What I mean is, I am somebody who supports genetic engineering and such things, but with tighter regulation. You cannot stop technology. You cannot stop science. I strongly believe that you just cannot stop science. So all you can do is control and be open about how you control things.’ An important theme in this context was the irrepressible character of the scientific endeavor, both in a descriptive and normative sense, making regulation substantially more difficult or even impossible as many interviewees argued. P7: ‘I think saying this should not be done or this should not be published is very difficult. Because it's something that is said before things are done, and you never know what will come out.’ In one occasion, the clash between the golden value of scientific freedom and the need for regulation generated a moral dilemma: P15: ‘I find it very difficult. The scientist in me says we should publish everything because we want to reveal things and the goal is to understand how exactly phenomena occur …. On the other hand, this is not without danger. At the end of the day, having some kind of committee entitled to decide yes or no, I would not know how – in a fair and independent manner – how there should be such a thing ….’ In this context, social responsibility linked to systematic regulatory models was identified as a possible solution to overcome the problem of unfair and biased regulation. In several occurrences, submitting research designs or results with dual-use potential to regulatory authorities on the initiative of the researcher him/herself was seen as a form of returning social responsibility to the individual scientist. For example, one interviewee argued P2: ‘I would have no problem if the granting agency would ask me after my proposal to explain the ethical issues of my own project, because this is a way to return the responsibility to me.’ While a certain degree of regulation was generally favored, potential pitfalls associated with excessive or ineffective regulations were frequently addressed. Potential dangers associated with regulatory interventions were linked to the problem of increasing bureaucracy and seen as an administrative obstacle to the generation of innovative scientific research: P2: ‘It's a bit scary, because you add one layer of control and that means administration.’ An overview of the interviewees’ opinions on the regulation of DURC is presented in Figure 1. 3.3 Misuse The potential for misuse was seen as a ubiquitous aspect of current bioscience research, based on the premise that virtually any piece of knowledge or technology can be, in principle, co-opted for nefarious purposes by malevolent agents. P3: ‘There will always be a risk. There will always be people who want to abuse something or who have malicious intentions and want to misuse research results. This ambivalence is not under our control that’s quite clear.’ Even more generally, the potential for misuse was not limited to research results but seen as an unavoidable feature of many kinds of information or objects, for example knifes, cars, and explosives. However, interviewees’ opinions diverged with regard to the identification of potential malevolent actors as well as to the prediction and assessment of risk. Three possible types of malevolent actors were identified on a gradual scale going from (1) individual agents to (2) organized groups, up to (3) governments and states. Many interviewees evaluated the possibility of misuse through individual actors or small groups as probabilistically low based on the recurrent assumption that bioterrorist projects, with the potential to cause serious harm to the population, require large-size and highly sophisticated infrastructures. These infrastructures, many interviewees observed, due to their volume and high financial costs, cannot be built and managed by individual actors or small groups, but only by large and well-funded organized groups, terrorist networks, or governments. As one interviewee argued: P1: ‘You need some kind of infrastructure, and I think that a small group can’t go that far. I could imagine, theoretically, if you think about the worst case scenario, that a state could do this somewhere secretly.’ One interviewee pointed out that strict biosafety measures need to be implemented to launch a bioterrorist attack against humans, which, in turn, poses an infrastructural problem in terms of a sterile, lab-like working environment and high-tech equipment. P7: ‘At first they need the infrastructure. They need a lab. And they are working with pathogens that can be transmitted to humans apparently. So they can't work in a lab like here (biosafety level 1-2), otherwise they could, they would get infected, and they could also die. So the infrastructure is not something that can be obtained easily, actually.’ In addition, many interviewees observed that there are much easier and more effective alternatives than bioterrorism; alternatives with the potential to cause selective or indiscriminate harm to the population. According to this stream of thought, potential microbiology-based bioterrorist practices would hardly be worth the effort in terms of financial cost, required equipment and expertise, as well as magnitude and predictability of the outcome as compared to other forms of terrorism. For example, an interviewee argued: P10: ‘I mean, if you have a cell culture and the respective molecular biology know-how, you can do that (produce the human transmissible bird flu virus), yes. But I mean there are so many things you can do. You can even make explosive substances by yourself, you just need to go on the internet and find all information about how to do it.’ 3.4 Replicability Our results show that the practical feasibility of bioterrorist attacks is not the only perceived factor at play in biosecurity risk assessment and management. Issues of replicability are critically relevant, too. Interviewees pointed out that, besides financial investments, equipment and infrastructures, as well as socio-cognitive factors such as expertise, knowledge transfer, and education are equally important. Interviewees diverged with regard to what level of education is required to effectively perform bioterrorist activities. While some argued that a master’s degree might be sufficient to provide malevolent actors with the necessary knowledge to pose biosecurity threats, the dominant position was that a PhD degree, hence a certain level of experience in laboratory research, would be required: P3: ‘A well-educated PhD student could do it (replicate the production of human transmissible H5N1 virus), one who works in virology and knows how to do mutations, and possibly works within a project involving viral mutations and recombinant viruses.’ The element of training was repeatedly underscored by interviewees, often in conjunction with the access to some tacit knowledge5 implicitly shared among microbiology/virology researchers and inherent to their routine laboratory work. As one argued P12: ‘If you're not trained it's impossible to do what we're doing. It's not difficult to get the knowledge. I'm not saying that we are geniuses. But you need training.’ 3.5 Publication The question of how sensitive research results should be communicated was strongly linked to the issue of scientific publication. A recurrent attitude among interviewees was to underscore the importance of scientific publications for science and the generation of knowledge. This idea was repeatedly expressed both at the descriptive and the normative level, in conjunction with the highly valued principle of freedom of science. A safety-motivated or security-motivated limitation of the publication of scientific results was dominantly seen as an obliteration of the freedom of science principle. One interviewee argued: P4: ‘Research is all about difficult questions. I would personally prefer to have free research and free publication.’ The conflict between the normative principle of freedom of research and the pragmatic problem of risk assessment was observed to result in a moral dilemma tackling the essence of the entire scientific enterprise. As one interviewee reported: P4: ‘My first reaction was: this is dangerous. However, if you look at the issue from a scientific standpoint, you need to live with this knowledge. Otherwise we can no longer do science. If you put restrictions on this, I don’t know where science goes.’ Results show that the same consequentialist and pragmatic approach was prevalent when facing the issue of censoring (key details of) the methods section when publishing highly sensitive research results, as recommended back in 2012 by the US National Science Advisory Board for Biosecurity (NSABB) (Fouchier et al. 2012) for the human transmissible bird flu virus H5N1. Most interviewees opposed this decision, underscoring its practical ineffectiveness: P1: ‘Censoring the methods section is a really, really weak measure. If you can’t figure out the methods on your own, you must be quite bad’. P11: ‘You don't really need the methods. The methods can be found online. So if you know the mutations, you know the methods. You just need the ingredients, which are not impossible to get, if you just spend some time and money.’ In a few occurrences, the importance of publishing research results unrestrictedly was not only advocated based on the normative principle of freedom of science, but also on pragmatic and consequentialist grounds. For example, few interviewees underscored how ‘knowledge is the only thing that gives us protection.’ As one interviewee argued P1: ‘Knowledge is our best defense and the lack of knowledge is the biggest danger; when it comes to biosecurity the greatest problem is ignorance.’ According to this person, few scientists in the world are familiar with the details of how to weaponize pathogens. Thus, it was argued that research with certain agents should be pursued to preserve the knowledge essential to distinguish sophisticated and effective from improvised and non-effective biological weapons. According to the interviewee such distinction between different forms of weaponization could mitigate or prevent unjustified panic in cases of non-effective attacks. 3.6 Communication and public opinion With regard to the issue of communicating research results and engaging with the public, interviewees highly underscored the importance of science dissemination and the public understanding of scientific research. Some researchers suggested that this could be obtained by having scientists actively involved in public communication P6: ‘They should have scientists as a public communication partner. Universities should do that (communicate scientific outcomes), not scientists. It's the duty of the university and they should do far more there. Explain, reduce the complexity, because this is a real job, it's a real task.’ The issue of public communication was frequently linked to the issue of social responsibility in science as well to the relationship between science and society. Interviewees often implicitly assumed that society is entitled a collective right-to-know, that is, the right to be collectively and adequately informed about the latest scientific updates. Based on this right, society should be entitled to play an active role in the decision-making about biosecurity P7: … ‘As scientists it is our responsibility to inform society. I do believe that society has to know what we are doing. For me it's not a problem that a commission would say “this should be done” or “this should not be done”. It is society, actually, that should decide. But if scientists don't know what is done in other labs, they cannot tell the population or society what is going on in certain labs. So for me that's why it's very important that there is communication and transparency. In the end society should know. For me it's the most important thing that scientists should do.’ However, many interviewees suggested that it will hardly be possible to eradicate sensationalism, which was suggested to have fuelled societal fears with regards to the H5N1 publications. According to those interviewees, scientists trying to advertise their research were partly responsible for the exaggerated communication of the H5N1 research results. 3.7 Responsibility With regard to the issue of responsibility, our results reveal conflicting attitudes. On the one hand, a dominant position among interviewees was to not consider individual scientists responsible for future misuse of their research results, in the light of the high number of uncontrollable variables affecting the reuse of information by third parties: P7: ‘The people that invented cars are not responsible for the deaths of people dying in car accidents.’ On the other hand, however, the individual responsibility of scientists was recognized and, in one occasion, even described as a self-evident implication of research. As one interviewee argued P2: ‘If one day something goes wrong, it would clearly be my responsibility.’ The complexity of such conflicting attitudes was further revealed by the interviewees’ opinions related to the issue of responsibility awareness. A dominant view among interviewees was to maintain that a few scientists do not have sufficient awareness of their social responsibility, and that they are sometimes driven by selfish motives leading them to make sensational claims with the purpose of maximizing their funding opportunities and diverting public attention onto their research: P6: ‘And some people, when writing a grant, of course they say it’s the most important topic on earth, because it's the most dangerous bug on earth.’ In response to this phenomenon, many interviewees have favored the development of strategies for raising or strengthening awareness among researchers about their social responsibility, hence their duty to engage in dialogue with the public and collaboratively address biosecurity issues: P2: ‘I think every scientist should ask these questions: Why am I doing this; Am I doing only good for mankind by studying this, or could it be bad in some respects?’ As one interviewee concisely argued P4: ‘There is a need for responsible communication.’ In general, the interviewees showed a prevalent interest in ethics and the recognition of the potential positive value of ethical training when facing biosecurity issues. Hence they favored the inclusion of ethics in the educational curriculum of scientists. Some interviewees argued that ethics training will not lead to a change in behavior since the way people act depends mainly on personality. No interviewee reported to have received training in dual-use issues during their own curricular or extra-curricular education. Finally, our results also show that training in ethics, in general, was frequently missing among researchers P7: ‘In my case it was definitely something that was missing. … I think it is important for scientists to really know that what they are doing can be used in one way or can be used in another way.’ An overview of the interviewees’ opinions on the importance of including education on ethics and biosecurity issues into life science curricula is presented in Figure 2. Figure 2. View largeDownload slide Opinions on including ethics education with biosecurity aspects into life science curricula. The majority of interviewees (14 out of 19) were in favor of including ethics education into life science curricula, especially courses that discuss biosecurity issues. Two interviewees did not share their opinion on this matter. Figure 2. View largeDownload slide Opinions on including ethics education with biosecurity aspects into life science curricula. The majority of interviewees (14 out of 19) were in favor of including ethics education into life science curricula, especially courses that discuss biosecurity issues. Two interviewees did not share their opinion on this matter. 4. Discussion Our interview study, although limited in regards to sample size and generalizability, provides valuable insights into the perspectives and views of the life science community. In contrast to studies conducted before the publication of the controversial H5N1 research (Dando and Rappert 2005; Kelle 2007; National Research Council 2009), our study results show that all scientists interviewed are well-aware of the DURC debate. Interviewees indicated to have followed the media coverage of the H5N1 controversy, in part due to their professional involvement with infectious agents. However, despite their awareness of the issue, the overwhelming majority of interviewees revealed a tendency to neglect biosecurity aspects with regard to their own work, showing how self-reflective attitudes might be less common and harder to internalize than general theoretical considerations. Thus, our results indicate that more effort is needed to facilitate reflection about dual-use issues regarding the scientists’ own work. When asked about their reasons for not having considered the biosecurity implications of their own research, most interviewees explained that they were not involved in GoF research. This is formally confirmed by the fact that none of their projects involved GoF or could be identified as DURC, as classified by the NSABB (NSABB 2007). There are, however, a variety of life-science experiments and technologies with dual-use implications, for example, the novel gene-editing technology CRISPR/cas96 (Gurwitz 2014; Clapper 2016), which do not fall under the current definitions. This was noted by one interviewee who recognized the dual-use potential of his/her work during the interviews.7 One reason why researchers might not reflect about biosecurity issues could lie in the narrow formal definition of DURC provided by the NSABB. In contrast to biosecurity issues, all interviewees had reflected about biosafety issues, which may in part be explained by the tight biosafety regulations in Switzerland (Schweizerische Eidgenossenschaft, Fassung vom 2015). When discussing biosafety and biosecurity issues, it became apparent that there was a major conceptual confusion between these two notions. We presume the reason for this confusion is twofold: first, many interviewees were German native speakers and in the German language the same noun (Biosicherheit) is used to refer both to biosafety and biosecurity. Therefore, the linguistic underpinning of some interviewees might have influenced their conceptual categories. Second, the conceptual confusion of some interviewees (including non-German speakers) might reflect a degree of conceptual ambiguity inherent in the biosafety–biosecurity debate, as already observed in media reports on this topic and even in the expert debate (Nordmann 2010; Imperiale and Casadevall 2015). However, this distinction became easily intelligible to the interviewee after the interviewer’s prompts. This shows that while some researchers may not be familiar with the relevant vocabulary or may even experience conceptual confusion, they still share a fair understanding of the problematics. These results are consistent with previous studies that recognized a similar discrepancy between the semantic and the conceptual level among other speech communities, such as Japanese researchers. The significance of the debate over dual-use in relation to both, biosecurity and biosafety, was immediately recognized by almost all interviewees. At the same time, all interviewees favored freedom of research and understood it as a core value of the scientific enterprise: research should and could not be stopped. Regulation was, however, accepted as a strategy to minimize unintended risks and collateral harms in DURC. Though hard to implement, regulatory strategies aimed at minimizing risks were welcomed by many interviewees, albeit the probability of occurrence of those risks was considered reportedly low. Interviewees based their judgment on financial and infrastructural limitations as well as on the easy accessibility of simpler alternatives for terroristic attacks. This seems to suggest that, by most interviewees, regulation per se was not perceived in direct conflict to free research, providing a more mitigated picture of the tension between freedom of research and national security identified in previous literature (Kennedy 2008; Minehata and Shinomiya 2010; Suk et al. 2011). A minority of interviewees, however, rejected any kind of regulation of dual-use issues and suggested to leave the responsibility to conduct or not conduct certain research in the hands of individual researchers or research groups. For all others, the real tension was rather seen between freedom of research, on the one hand, and suboptimal forms of regulation, on the other hand. From the perspective of interviewees these forms included regulatory interventions involving disproportionate administration, excessively pervasive, and strict oversight, or the lack of scientific expertise. Even when the clash between scientific freedom and the need for regulation generated a moral dilemma (P15), such dilemma seemed to be partly resolved by the weighted and thoughtful use of regulation. As responses diverged with regard to the regulatory bodies entitled to enforce such regulations, we infer that the main issue at stake in the context of DURC regulation is not whether DURC should be regulated but how it should be regulated. The fact that most researchers preferred self-regulation through institutions of the scientific community might indicate a preference for more narrow and small-scaled regulatory strategies. Large-scaled regulatory bodies might be seen as vehicles of disproportionate administrative work and tighter oversight, besides being out of the researchers’ control. Not surprisingly, the fear of increased bureaucracy, administrative obstacles and consequent delay in project development represent recurrent themes among interviewees. The high prevalence of feasibility concerns shows how the practical dimension of regulation can hardly be decoupled from its theoretical dimension. One main concern of interviewees was to find the right (scientific) expertise in order to be able to judge the biosafety and biosecurity threats associated with the particular research. Nevertheless, many interviewees argued in favor of establishing an independent advisory board at the national level. Besides scientific experts, our interviewees suggested to include ethicists, security experts as well as representatives from the public. Doubts about the effectiveness of implemented interventions seemed to be another major reason behind the general skepticism towards regulatory measures at the level of scientific publishing. For example, the suggestion of allowing publications while censoring the methods sections was largely seen as an ineffective strategy, since methods can be easily inferred from other sources such as textbooks (Engel-Glatter 2014). From the interviewees’ perspective, accordingly, the decision of the NSABB (Fouchier et al. 2012) to publish the papers about the human transmissible bird flu virus H5N1 but omit the methods was seen as a fruitless or even counterproductive strategy. Similar outcome-oriented evaluations were advocated when discussing the issue of risk assessment. It is important to note how the degree of risk perception and risk identification may vary among different stakeholders. For example, one interviewee provided an interesting example of how the societal and even professional perception of danger regarding a bioterrorist attack with a sophisticated bioweapon may vary: by recalling the panic created by the 2001 anthrax attacks in the USA, when postal letters containing anthrax spores were delivered to politicians and news media organizations. The interviewee underscored the difficulty of weaponizing anthrax. According to the interviewee, the production of infectious anthrax spores requires high expertise in microbiology as well as specialized large-sized and, thus, expensive equipment; hence it is hardly achievable outside the military setting. This description is in opposition to the assessment of other interviewees who underscored the alleged ease and feasibility of producing and weaponizing anthrax spores. Thus, the example illustrated how the societal or even professional perception of danger may be disproportionate to the actual threat. In order to prevent ‘misplaced’ societal fears associated with DURC, many interviewees saw communication and public understanding of science as a critical factor. Some interviewees argued that it is in the hands of society to discuss and decide what research should and should not be done. Engaging with the public by creating a dialog, thus reinforcing the trust in research institutions was understood as an essential measure to bridge the gap between science and society. This supposition is suggested, among others, by P6’s emphasis on the fact that public communication is ‘the duty of the university’ because ‘this is a real job, it’s a real task.’ In fact, some of the interviewees revealed to be actively involved in public communication projects. However, could improved and extended public communication indeed reduce societal fears of certain scientific advances? On the one hand, the positive link between public engagement and trust has been questioned by those involved in advocating, conducting, and evaluating public engagement practice (Petts 2008; Stilgoe et al. 2014). On the other hand, the lay press often sensationalizes scientific results in order to gain the readers’ interest, and in some cases, might lack the adequate scientific expertise.8 In addition, the ‘publish or perish’ culture in scientific research motivates some researchers to conduct and communicate critical research projects without careful reflection of the consequences. Many interviewees referred to this problem by suggesting, correctly, that Ron Fouchier, the corresponding author of one of the engineered H5N1 flu virus publications, had sensationalized his research results in order to gain attention for his research (Enserink 2011).9 The ‘publish and perish’ culture in science was also a major point with regard to the question whether individual scientists are responsible for the misuse of their research results. While many interviewees did not consider individual scientists responsible for future misuse of their research results, mostly due to the high number of uncontrollable variables affecting the reuse of information by third parties, some researchers recognized their responsibility and rated it as important. Interviewees’ conflicting attitudes toward responsibility indicate a tacit controversy among experts about the degree of responsibility and accountability of individual researchers. The controversial nature is further underscored by the fact that some interviewees, who did not consider themselves responsible for the future use of research results, also maintained that individual scientists usually do not have sufficient awareness of their social responsibility, and criticized that some scientists might be driven by selfish motives. The topic of responsibility was a major trigger of the interviewees’ interest in ethics and ethics training, which appeared frequently missing. While some interviewees argued that ethics training would unlikely lead to a change in behavior, many favored the inclusion of ethical training into the educational curriculum of scientists working in DURC in order to increase awareness of the issue. In summary, interviewees of our study addressed various key issues regarding the problem of DURC. First, while underscoring the importance of generating scientific knowledge, they also identified inherent risks in research, including misuse by third parties. In order to minimize unintended risks and collateral harms in DURC, interviewees suggested developing and improving biosafety approaches, such as vaccinations against newly created viruses. They also suggested implementing more rigorous evaluative criteria for demarcating and prioritizing truly beneficial research, although they acknowledged the difficulty of such demarcation. Generally, a majority of interviewees saw the need to regulate DURC, preferably through self-regulatory measures. Nevertheless, many were in favor of an independent advisory board on a national level. Interviewees also argued that researchers need to be aware of DURC issues and self-reflect about their research projects. Lastly, interviewees saw a need to improve the way research results are communicated to the public. These results are consistent with the findings and normative suggestions of previous studies on DURC. For example, Imperiale and Casadevall have summarized the main issues of the current debate on DURC and proposed a path forward (Imperiale and Casadevall 2015). Their suggestions include the acknowledgment of the inherent risks in research, the development of new biosafety approaches, and the creation of a national board to vet issues related to research with dangerous pathogens. The need to create a national advisory board has also been voiced by the German Ethics Council in a recent report on DURC, as well as by the ESV (Palu 2014). In line with suggestions made by our interviewees, the German Ethics Council additionally recommended to increase awareness of DURC, for example, through ethics teaching (German Ethics Council 2014). The inclusion of biosecurity sections in publications and funding applications, as suggested by our interviewees, was recommended by Gronvall as another possibility to increase awareness as well as self-reflection (Gronvall 2013; German Ethics Council 2014). Our results, thus, suggest that at least part of the life science community has reflected about DURC and sees a need for action. They also suggest that involving scientists in the debate is essential for identifying options to tackle the dual-use problem. Further participation of other groups, such as ethicists, security experts, and representatives from the society in the debate will ensure that viable options are chosen, and may increase the acceptance of regulatory measures by the scientific community. 5. Limitations This study has several limitations. While the use of a qualitative method allowed exploring a multifaceted topic in depth, due to the qualitative design, statistically representative conclusions cannot be drawn. Furthermore, the study sample may not have represented the full range of scientists’ views from the field of microbiology and virology on this topic, since it was limited in regards to sample size, as well as geographical and cultural variation. However, since the interviewees came from various countries (see methods section) and worked at international world-renowned institutions and in laboratories with outstanding international reputation, we believe that we have gained valuable insights into the views and perspectives of leading life scientists. In addition, selection biases might have affected the recruitment process. The study was announced under the title of ‘the attitudes, views and opinions of Swiss researchers in the life sciences, working with pathogens, towards biosafety and biosecurity issues’ and could have attracted scientists who have previously reflected about the topic of DURC. In addition, it could have stimulated interviewees to reflect on the topic before the actual interview. However, we are convinced that even despite these limitations the obtained findings already show a variety of well-differentiated attitudes which add significant knowledge about how scientists perceive DURC. Further research is required to compare the views, perspectives, and attitudes on DURC of researchers not only in the context of GoF but also among researchers working in other dual-use sensitive branches of the life sciences such as biochemistry, mycology, phytopathology, toxicology, neuroscience, synthetic biology, and genetic engineering (e.g. CRISPR/cas9).10 In fact, having these fields of research with disruptive potential in the upcoming decades, their biosecurity implications need to be urgently investigated and assessed. 6. Recommendations Our research results show that there is a need for raising awareness and opening a public debate among researchers on the dual-use implications of their research, even in the fields of virology and microbiology. In fact, although interviewees showed a basic familiarity with the topics of biosecurity and dual-use, yet their views appeared rarely self-reflective of their own work and far from being consolidated into a robust and coherent ethical framework. This absence of self-reflectiveness and thorough ethical analysis is problematic for a twofold reason. First, it fails to provide researchers with well-reasoned normative considerations and operative codes of conduct in their everyday research. Second, it fails to provide researchers with a comprehensive analysis of the dual-use problem not only in the context of their field of research, but also in the context of research conducted in other fields of science, using different (bio)technologies, or in the light of future research and technological applications. Findings from various areas of science have proven that dual-use concerns do not solely pertain to research with pathogens but also apply to other branches of the life sciences such as synthetic biology, biochemistry, cell biology, mycology, phytopathology, toxicology, neuroscience, etc. (Atlas and Dando 2006; McLeish and Nightingale 2007; Bennett et al. 2009; Suk et al. 2011; Tennison and Moreno 2012; Ienca and Haselager 2016). Raising awareness on these issues in various contexts and providing researchers with a comprehensive framework on the normative and practical implications of DURC and biosecurity would facilitate the development of highly generalizable and multidisciplinary perspectives on DURC. Such perspectives are particularly important not only for recognizing cross-disciplinary preventive strategies and safeguards, but also for anticipating future concerns associated to emerging technologies with multiple applications (e.g. CRISPR/cas9). In order to produce pervasive and long-lasting effects on the scientific community and society at large, this process of raising awareness on DURC and biosecurity should be included into the educational program of students at the graduate and undergraduate level as well as into the practical lab training of young researchers (together with the teaching of biosafety). For example, biosecurity-focused courses could be incorporated into the standard educational curricula of students and young researchers in the life sciences as part of their courses on bioethics, research ethics, scientific methodology, etc. This is consistent with the Article 23 of the Universal Declaration on Bioethics and Human Rights, where it is suggested that ‘States should endeavor to foster bioethics education and training at all levels’ in order ‘to achieve a better understanding of the ethical implications of scientific and technological developments, in particular for young people’ (UNESCO 2005).11 Bioethics and biosecurity education could be implemented in various ways. A promising example is the University of Bradford’s free available online ‘Educational Module Resource’ (EMR) to assist university-level lecturers to incorporate ‘biosecurity and dual-use issues into their life science courses at a higher education level’. Researchers at the Landau Network-Centro Volta (LNCV) have even advanced the idea of ‘compulsory biosecurity education’, not only with the purpose of preventing and combating ‘present and future threats’ but also for ‘helping relevant actors to fulfil their legal, regulatory and professional obligations and ethical principles’ (Minehata et al. 2013). In practice, the implementation of such educational programs often meets well-rooted obstacles inherent in the reform of academic curricula. These include the lack of space in existing curricula, the lack of time and resources available to institutions to develop new curricula, as well as limitations peculiar of biosecurity education such as the absence of expertise and available literature on biosecurity education, and the ‘general doubt and skepticism about the need for biosecurity education on the part of educators and scientists’ (Mancini and Revill 2008; Minehata and Shinomiya, 2010). However, there are promising examples of universities having successfully included biosecurity modules into life science education (Engel-Glatter et al. 2016; Maksymovych et al. 2015). The process of implementing biosecurity modules into life science curricula may be facilitated through the inclusion of biosecurity material into core life science textbooks such as ‘Molecular Biology of the Cell’ (Alberts et al. 2015). In addition, based on the views of interviewees, we believe that awareness could and should be raised through the weighted use of regulatory interventions aimed at maximizing biosecurity without thereby harming the freedom of researcher. The hypothesis of self-regulation by the scientific community and its institutions was highly welcomed by the researchers involved in our study. It should be part of the regulatory approach, not only as a pragmatic measure for reducing administrative work, but also as an instrument to raise social responsibility among researchers. For example, as suggested by one interviewee, a biosecurity section could be incorporated into the standard templates of research project submissions to funding agencies in a similar manner as other research ethics requirement such ‘conflict of interest’, ‘authorship declaration’, ‘re-use of data’, etc. It could include detailed explanations of why the work should be undertaken despite the potential risks, as well as detailed information on the safety precautions (Gronvall 2013). In particular, a protocol for the disclosure of DURC similar to current protocols for the disclosure of conflict of interest might a cheap, easy-to-implement, and effective solution. Such institutional regulatory interventions could occur at the level of funding agencies, research centers, and publishers and have already been implemented in the UK by the Biotechnology and Biological Sciences Research Council (BBSRC), the Medical Research Council (MRC), and the Wellcome Trust.12 However, there is a need to elaborate which kind of knowledge and expertise is essential to interpret and evaluate these sections. However, relying on self-regulation only will most likely not result in the implementation of measures effectively addressing the dual-use problem. Our interviews confirm that the dual-use topic creates a conflict between scientific freedom and promotion of security (Miller and Selgelid 2007; Selgelid 2009). The scientific community has little interest in delaying or even stopping potentially successful biomedical research in the light of biosecurity threats. The success of any other regulatory model will depend on the active participation and support of the scientific community. In 2014, Smith and Kamradt-Scott (2014) suggested tying funding to oversight and using a multidisciplinary approach for reviewing dual-use research. This model, despite concerns raised about its implementation, was welcomed by many of our interviewees. There are many obstacles impeding the successful implementation of such a regulatory model (Ehrlich 2014; Lev and Samimian-Darash 2014). However, with the combined effort of all stakeholders, including the scientific community, the security community, and government officials, it may be a viable option for regulating DURC (Jacobsen et al. 2014). Finally, we are aware that our study results are not generalizable; nevertheless we believe that our results might light a passable path forward. 7. Conclusion Since the recent re-ignition of the debate on DURC, issues of biosafety and biosecurity have been widely discussed in the literature on microbiology/virology. Yet, a high degree of uncertainty remains with regard to the problems of risk-assessment, prevention, and on the implementation of regulatory measures. To enrich the understanding of these problems, facilitate the debate on the topic of dual-use, and elicit effective preventive measures and appropriate regulatory interventions, we have investigated the views and attitudes on DURC in the life sciences of international life scientists in Switzerland. In contrast to previous studies, which reported a limited awareness towards DURC among researchers, our results indicate that scientists are generally aware of the problem, but often lack the self-reflective attitude necessary for identifying and assessing dual-use aspects related to their own work. The interviews also show that, although freedom of research is widely considered a non-negotiable value, most researchers are supportive of some form of regulation of dual-use research, including via external advisory boards. Linking empirical data with existing normative literature, we provided recommendations for raising awareness on DURC in the life sciences among researchers and the public, for improving risk-assessment among individual scientists, and for the selection of preventive strategies. These include creating a solid and generalizable theoretical framework on DURC, updating educational curricula in the life sciences as to include or expand bioethics and biosecurity training, and pondering a weighted use of regulatory interventions aimed at maximizing biosecurity without thereby harming the freedom of research. In implementing such strategies, a cooperative effort should be pursued in a manner that actively involves scientists, ethicists, security experts, and regulatory agencies on an equal footing. Funding This study was funded by the Forschungsfonds of the University of Basel and the Käthe-Zingg-Schwichtenberg Fonds of the Swiss Academy of Medical Sciences. Footnotes 1 Potential risks and benefits associated with Highly Pathogenic Asian Avian Influenza A virus (HPAI) Hemagglutinin 5 Neuraminidase 1 (H5N1) Gain-of-Function (GoF) proposals with strains that are transmissible among mammals by respiratory droplets are reviewed on a case-by-case basis by the funding agency and the US Department of Health and Human Services (HHS). According to the framework, such proposals are only acceptable if certain criteria are met. For more information please see <http://osp.od.nih.gov/sites/default/files/resources/NSABB_Framework_for_Risk_and_Benefit_Assessments_of_GOF_Research-APPROVED.pdf> accessed 29 Apr 2017. 2 In 2004, the US National Academies published the highly influential report ‘Biotechnology Research in an Age of Terrorism’ in an attempt to address concerns over life science research and to find a formal definition of dual-use research (National Research Council 2004). The committee was headed by Professor Gerald R. Fink, Professor of Genetics, Whitehead Institute, MIT, Boston, MA. 3 Audacity software is a free audio editor and recorder, more information available on the website <http://audacity.sourceforge.net/ about/> accessed 29 Apr 2017. 4 Biosecurity can be defined as: ‘… the sum of risk management practices in defense against biological threats’, which includes aversion of biological terrorism and other disease breakouts’ (Meyerson and Reaser 2002). 5 The term tacit knowledge was first coined by Polanyi who defined it as: ‘…things that we know but cannot tell.’ Polanyi (1962). In the context of the dual-use problem the role of tacit knowledge has been discussed by Engel-Glatter (2014). 6 CRISPR/cas9 stands for Clustered Regularly Interspaced Short Palindromic Repeats/CRISPR-associated protein 9, a genome editing technique which allows for targeted deletion and insertion of genes. 7 The project discovered strains of a certain pathogen that lacked the markers essential for detection with one of several possible detection technologies. The pathogen is listed on the select agents and toxins list <http://www.selectagents.gov/SelectAgentsandToxinsList.html> accessed 29 Apr 2017. During the interview, the interviewee explained that the intention of the research was not to provide a roadmap for terrorists, but to show the limitations of a particular detection technology. 8 See the blog of virology Professor Vincent Racaniello <http://www.virology.ws/2013/01/24/headline-writers-please-take-a-virology-course> accessed 29 Apr 2017. 9 The topic is explored by Professor Vincent Racaniello <http://www.virology.ws/2012/03/01/influenza-h5n1-is-not-lethal-in-ferrets-after-airborne-transmission> accessed 29 Apr 2017. 10 The CRISPR/cas9 gene-editing technology provoked intense debates about whether its application in some fields is ethical. These include gene editing of human embryos (Baltimore et al. 2015), gene editing of insects and their unintended release into the environment (Akbari et al. 2015), as well as the possibility that the technology could be applied by people with malevolent intentions to alter and release dangerous pathogens. 11 United Nations Educational, Scientific and Cultural Organization (2005) ‘Universal Declaration on Bioethics and Human Rights’ <http://portal.unesco.org/en/ev.php-URL_ID=31058&URL_DO=DO_TOPIC&URL_SECTION=201.html> accessed 29 Apr 2017. 12 The joint policy is available at < https://wellcome.ac.uk/funding/managing-grant/managing-risks-research-misuse> accessed 26 Apr 2017. Acknowledgements We thank Marianne Weber and Elisabeth Brunner for their help with the transcriptions. We also would like to thank all researchers who participated in the study for their time and their willingness to share their thoughts on the issue. Lastly, we would like to thank Prof. Bernice S. Elger for her support. References Akbari O. S., Bellen H. J., Bier E. et al.   ( 2015) ‘ BIOSAFETY. Safeguarding Gene Drive Experiments in the Laboratory’, Science , 349/ 6251: 927– 9. Google Scholar CrossRef Search ADS PubMed  Alberts B., Johnson A., Lewis J. et al.   ( 2015) Molecular Biology of the Cell, 6th edn. New York: Garland Science. Atlas R. M., Dando M. ( 2006) ‘ The Dual-Use Dilemma for the Life Sciences: Perspectives, Conundrums, and Global Solutions’, Biosecurity and Bioterrorism-Biodefense Strategy Practice and Science , 4/ 3: 276– 86. Google Scholar CrossRef Search ADS   Baltimore D., Berg P., Botchan M. ( 2015) ‘ Biotechnology. A Prudent Path Forward for Genomic Engineering and Germline Gene Modification’, Science , 348/ 6230: 36– 8. Google Scholar CrossRef Search ADS PubMed  Bennett G., Gilman N., Stavrianakis A. et al. ( 2009) ‘ From Synthetic Biology to Biohacking: Are We Prepared?’, Nature Biotechnology , 27/ 12: 1109– 11. Google Scholar CrossRef Search ADS PubMed  Casadevall A., Shenk T. ( 2012) ‘ Mammalian-Transmissible H5n1 Virus: Containment Level and Case Fatality Ratio’, MBio , 3/ 2: e00054– 12. Google Scholar CrossRef Search ADS PubMed  Clapper J. R. ( 2016), ‘Statement for the Record Worldwide Threat Assessment of the US Intelligence Community, Director of National Intelligence’, Washington DC < www.dni.gov/files/documents/SASC_Unclassified_2016_ATA_SFR_FINAL.pdf> accessed 03 May 2017. Dando M. R., Rappert B. ( 2005) ‘Codes of Conduct for the Life Sciences: Some Insights from UK Academia’, Briefing Paper No 16 (Second Series). Strengthening the Biological Weapons Convention, pp. 1– 27. Department of Peace Studies, University of Bradford <http://www.brad.ac.uk/acad/sbtwc/briefing/BP_16_2ndseries.pdf> accessed 29 Apr 2017. Ehrlich S. A. ( 2014) ‘ H5N1: A Cautionary Tale’, Frontiers in Public Health , 2: 117. Google Scholar CrossRef Search ADS PubMed  Engel-Glatter S. ( 2014) ‘ Dual-Use Research and the H5n1 Bird Flu: Is Restricting Publication the Solution to Biosecurity Issues?’, Science and Public Policy , 41/ 3: 370– 83. Google Scholar CrossRef Search ADS   Engel-Glatter S., Cabrera L. Y., Marzoukic Y. et al.   ( 2016) ‘ Teaching Bioethics to a Large Number of Biology and Pharma Students Lessons Learned’, Ethics & Behavior , 1– 21, doi: 10.1080/10508422.2016.1196361. Enserink M. ( 2011) ‘Scientists Brace for Media Storm Around Controversial Flu Studies’, ScienceInsider, pp. 1–2. American Association for the Advancement of Science , Highwire Press: Washington, DC <http://news.sciencemag.org/scienceinsider/2011/11/scientists-brace-for-media-storm.html> accessed 03 May 2017. Enserink M. ( 2013) ‘ Dual-use Research. Dutch H5N1 Ruling Raises New Questions’, Science , 342/ 6155: 178. Google Scholar CrossRef Search ADS PubMed  Fears R., Ter Meulen V. ( 2015) ‘ European Academies Advise on Gain-of-Function Studies in Influenza Virus Research’, Journal of Virology , 90/ 5: 2162– 4. Google Scholar CrossRef Search ADS PubMed  Fouchier R., Osterhaus A. B., Steinbruner J. et al.   ( 2012) ‘Preventing Pandemics: The Fight Over Flu’, Nature , 481/ 7381: 257– 9. Google Scholar CrossRef Search ADS PubMed  Frank G. M., Adalja A., Barbour A. et al.   ( 2016) ‘ Infectious Diseases Society of America and Gain-of-Function Experiments With Pathogens Having Pandemic Potential’, Journal of Infectious Diseases , 213/ 9: 1359– 61. Google Scholar CrossRef Search ADS PubMed  German Ethics Council ( 2014) Biosecurity—Freedom and Responsibility of Research. Opinion, Summary And Recommendations, pp. 1–10 . Berlin: German Ethics Council <http://www.ethikrat.org/publications/opinions/biosecurity> accessed 03 May 2017. Gronvall G. K. ( 2013) ‘Working Paper on H5N1: A Case Study for Dual-Use Research’, pp. 1–29 . New York: Council on Foreign Relations <http://www.cfr.org/public-health-threats-and-pandemics/h5n1-case-study-dual-use-research/p30711> accessed 03 May 2017. Gurwitz D. ( 2014) ‘ Gene Drives Raise Dual-Use Concerns’, Science , 345/ 6200: 1010. Google Scholar CrossRef Search ADS PubMed  Herfst S., Schrauwen E. J., Linster M. et al. ( 2012) ‘ Airborne Transmission of Influenza A/H5N1 Virus between Ferrets’, Science , 336/ 6088: 1534– 41. Google Scholar CrossRef Search ADS PubMed  Ienca M., Haselager P. ( 2016) ‘ Hacking the Brain: Brain–Computer Interfacing Technology and the Ethics of Neurosecurity’, Ethics and Information Technology , 18/ 2: 117– 29. Google Scholar CrossRef Search ADS   Imai M., Watanabe T., Hatta M. et al.   ( 2012) ‘ Experimental Adaptation of an Influenza H5 HA Confers Respiratory Droplet Transmission to a Reassortant H5 HA/H1N1 Virus in Ferrets’, Nature , 486/ 7403: 420– 8. Google Scholar PubMed  Imperiale M. J., Casadevall A. ( 2015) ‘ A New Synthesis for Dual Use Research of Concern’, PLoS Medicine , 12/ 4: e1001813. Google Scholar CrossRef Search ADS PubMed  Jackson R. J., Ramsay A. J., Christensen C. D. et al.   ( 2001) ‘ Expression of Mouse Interleukin-4 by a Recombinant Ectromelia Virus Suppresses Cytolytic Lymphocyte Responses and Overcomes Genetic Resistance to Mousepox’, Journal of Virology , 75/ 3: 1205– 10. Google Scholar CrossRef Search ADS PubMed  Jacobsen K. X., Mattison K., Heisz M. et al.   ( 2014) ‘ Biosecurity in Emerging Life Sciences Technologies, a Canadian Public Health Perspective’, Frontiers in Public Health , 2: 198. Google Scholar CrossRef Search ADS PubMed  Kelle A. ( 2007) ‘Synthetic biology & biosecurity awareness in Europe’, Bradford Science and Technology Report No. 9, pp. 1–31. Vienna: Synbiosafe <www.synbiosafe.eu/uploads/pdf/Synbiosafe-Biosecurity_awareness_in_Europe_Kelle.pdf> accessed 03 May 2017. Kennedy D. ( 2008) ‘ Science and Security, Again’, Science , 321/ 5892: 1019. Google Scholar CrossRef Search ADS PubMed  Kong W., Liu Q., Sun Y. et al.   ( 2016) ‘ Transmission and Pathogenicity of Novel Reassortants Derived from Eurasian Avian-Like and 2009 Pandemic H1N1 Influenza Viruses in Mice and Guinea Pigs’, Scientific Reports , 6: 27067. Google Scholar CrossRef Search ADS PubMed  Lev O., Samimian-Darash L. ( 2014) ‘ Biosecurity Policy in the US: A Critical Assessment’, Frontiers in Public Health , 2: 110. Google Scholar CrossRef Search ADS PubMed  Lipsitch M., Galvani A. P. ( 2014) ‘ Ethical Alternatives to Experiments with Novel Potential Pandemic Pathogens’, PLoS Medicine , 11/ 5: e1001646. Google Scholar CrossRef Search ADS PubMed  Maksymovych I. S., Gergalova G. L., Komisarenko S. V. ( 2015) ‘ Some International Projects on Increasing Knowledge in Biosafety and Biosecurity: Efforts in Ukraine’, Journal for Veterinary Medicine, Biotechnology and Biosafety , 1/ 1: 39– 43. Mancini G., Revill J. ( 2008) ‘Fostering the Biosecurity Norms: Biosecurity Education for the next Generation of Scientists’ . Dando M., Martellini M. (eds.), pp. 1- 28. Bradford Disarmament Research Center (BDRC), University of Bradford, UK; Landau Network-Centro Volta (LNCV), Como, Italy, <sro.sussex.ac.uk/39517/1/Fostering.pdf> accessed 03 May 2017. McLeish C., Nightingale P. ( 2005) ‘The Impact of Dual Use Controls on UK Science: Results from a Pilot Study’, Science and Technology Policy Research (SPRU) . University of Sussex <http://www.sussex.ac.uk/spru/documents/sewp132.pdf> accessed 03 May 2017. McLeish C., Nightingale P. ( 2007) ‘ Biosecurity, Bioterrorism and the Governance of Science: The Increasing Convergence of Science and Security Policy’, Research Policy , 36/ 10: 1635– 54. Google Scholar CrossRef Search ADS   Meyerson L. A., Reaser J. K. ( 2002) ‘ Biosecurity: Moving toward a Comprehensive Approach’, Bioscience , 52/ 7: 593– 600. Google Scholar CrossRef Search ADS   Miller S., Selgelid M. J. ( 2007) ‘ Ethical and Philosophical Consideration of the Dual-use Dilemma in the Biological Sciences’, Science and Engineering Ethics , 13/ 4: 523– 80. Google Scholar CrossRef Search ADS PubMed  Minehata M., Shinomiya N. ( 2010) ‘ Japan: Obstacles, lessons and future’, In: Rappert B (ed.) Education and ethics in the life sciences: Strengthening the prohibition of biological weapons , pp. 93– 114. Canberra: Australian National University E Press <http://epress.anu.edu.au/education_ethics.html> accessed 23 Aug 2017. Minehata M., Sture J., Shinomiya N., et al.   ( 2013) ‘ Implementing Biosecurity Education: Approaches, Resources and Programmes’, Science and Engineering Ethics , 19/ 4: 1473– 86. Google Scholar CrossRef Search ADS PubMed  National Research Council ( 2004) Biotechnology Research in an Age of Terrorism, pp. 1–164 . Washington, DC: The National Academies Press <https://www.nap.edu/catalog/10827/biotechnology-research-in-an-age-of-terrorism> accessed 03 May 2017. National Research Council ( 2009) A Survey of Attitudes and Actions on Dual Use Research in the Life Sciences: A Collaborative Effort of the National Research Council and the National Association for the Advancement of Science, pp. 1–188 . Washington DC: The National Academies Press <https://www.nap.edu/catalog/12460/a-survey-of-attitudes-and-actions-on-dual-use-research-in-the-life-sciences> accessed 03 May 2017. Nature ( 2015) Special on Mutant Flu <http://www.nature.com/news/specials/mutantflu/index.html> accessed 29 April 2017. Nordmann B. D. ( 2010) ‘ Issues in Biosecurity and Biosafety’, International Journal of Antimicrobial Agents , 36 (Suppl 1): S66– 9. Google Scholar CrossRef Search ADS PubMed  NSABB ( 2007) Proposed Framework for the Oversight of Dual Use Life Sciences Research: Strategies for Minimizing the Potential Misuse of Research Information, pp. 1–55. National Research Council <http://osp.od.nih.gov/office-biotechnology-activities/nsabb-reports-and-recommendations/proposed-framework-oversight-dual-use-life-sciences-research> accessed 03 May 2017. Palu G. ( 2014) ‘ Regulating Dual-Use Research in Europe’, Science , 343/ 6169: 368– 9. Google Scholar CrossRef Search ADS PubMed  Petts J. ( 2008) ‘ Public Engagement to Build Trust: False Hopes?’, Journal of Risk Research , 11/ 6: 821– 35. Google Scholar CrossRef Search ADS   Polanyi M. ( 1962) ‘ Tacit Knowing—Its Bearing on Some Problems of Philosophy’, Reviews of Modern Physics , 34/ 4: 601– 16. Google Scholar CrossRef Search ADS   Reardon S. ( 2014) ‘Viral research moratorium called too broad’. Nature News, 23rd October 2014  <http://www.nature.com/news/viral-research-moratorium-called-too-broad-1.16211> accessed 23 Aug 2017. Schweizerische Eidgenossenschaft (Fassung vom 01 June 2015) Verordnung über den Umgang mit Organismen in geschlossenen Systemen. Bern. <https://www.admin.ch/opc/de/classified-compilation/20100803/index.html> accessed 03 May 2017. Selgelid M. J. ( 2009) ‘ Governance of Dual-Use Research: An Ethical Dilemma’, Bulletin of the World Health Organization , 87/ 9: 720– 3. Google Scholar CrossRef Search ADS PubMed  Smith F. L.3rd, Kamradt-Scott A. ( 2014) ‘ Antipodal Biosecurity? Oversight of Dual Use Research in the United States and Australia’, Frontiers in Public Health , 2: 142. Google Scholar PubMed  Stilgoe J., Lock S. J., Wilsdon J. ( 2014) ‘ Why Should We Promote Public Engagement with Science?’, Public Understanding of Science , 23/ 1: 4– 15. Google Scholar CrossRef Search ADS PubMed  Suk J. E., Zmorzynska A., Hunger I. et al.   ( 2011) ‘ Dual-Use Research and Technological Diffusion: Reconsidering the Bioterrorism Threat Spectrum’, PLoS Pathogen , 7/ 1: e1001253. Google Scholar CrossRef Search ADS   Sutton T. C., Finch C., Shao H. et al.   ( 2014) ‘ Airborne Transmission of Highly Pathogenic H7N1 Influenza Virus in Ferrets’, Journal of Virology , 88/ 12: 6623– 35. Google Scholar CrossRef Search ADS PubMed  Tennison M. N., Moreno J. D. ( 2012) ‘ Neuroscience, Ethics, and National Security: The State of the Art’, PLoS Biology , 10/ 3: e1001289. Google Scholar CrossRef Search ADS PubMed  The White House Office of Science and Technology Policy ( 2014) ‘Doing diligence to assess the risks and benefits of life sciences gain-of-function research’ <https://www.whitehouse.gov/blog/2014/10/17/doing-diligence-assess-risks-and-benefits-life-sciences-gain-function-research> accessed 29 April 2017. Wei K., Sun H., Sun Z. et al.   ( 2014) ‘Influenza A Virus Acquires Enhanced Pathogenicity and Transmissibility after Serial Passages in Swine’, Journal of Virology , 88/ 20: 11981– 94. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Science and Public Policy Oxford University Press

Life scientists’ views and perspectives on the regulation of dual-use research of concern

Loading next page...
1
 
/lp/ou_press/life-scientists-views-and-perspectives-on-the-regulation-of-dual-use-q7YZjKeRTT

References (58)

Publisher
Oxford University Press
Copyright
© The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
ISSN
0302-3427
eISSN
1471-5430
DOI
10.1093/scipol/scx050
Publisher site
See Article on Publisher Site

Abstract

Abstract In the early 2000s, several publications initiated a debate about the potential misuse of academic life science research. The debate was refueled in 2012, when two studies describing the engineering of human transmissive H5N1 bird flu virus were published. To facilitate the debate on dual-use research of concern (DURC) and regulatory measures, we interviewed life science researchers working in Switzerland about their views on DURC. The results indicate that all scientists interviewed were aware of the debate, however, few had reflected about dual-use aspects with regard to their own work. Although all respondents believed in freedom of research, a majority was supportive of some form of regulation of DURC. This article discusses the major implications of the study, especially regarding the implementation of regulatory measures. In addition, preliminary recommendations are given for raising awareness on DURC in the life sciences among researchers. 1. Introduction In 2012, the publication of two research articles on the creation of airborne human transmissible bird flu viruses (Herfst et al. 2012; Imai et al. 2012) has revived an intense debate on the misuse potential of life science research (Nature 2015). During a year-long, self-imposed moratorium, various actors including the scientific community, ethicists, security experts, government officials, the public, and international organizations discussed whether gain-of-function (GoF) research on highly pathogenic avian influenza should be conducted and published (Casadevall and Shenk 2012). While the moratorium has been lifted, the debate is ongoing. Its focus, meanwhile, expanded from concerns about the potential misuse of research results by malevolent actors (biosecurity issues) to concerns about the risks of an accidental release of engineered pathogens (biosafety issues) (Imperiale and Casadevall 2015). As a result of these events, the US government has implemented a funding pause on certain types of GoF research in order to assess potential risks and benefits of such research (The White House Office of Science and Technology Policy 2014). The ban covers GoF studies that enhance the pathogenicity or transmissibility among mammals by respiratory droplets of influenza, Middle East Respiratory Syndrome (MERS) virus or severe acute respiratory syndrome (SARS) virus. While supported by some scientists (Lipsitch and Galvani 2014), the ban has provoked an outcry by others who fear scientific setbacks and potential negative consequences for public health and pandemic preparedness (Reardon 2014) . While the pause is still in effect and how to best address the threat is still a matter of debate, new GoF studies are being published on a regular basis (Sutton et al. 2014; Wei et al. 2014; Kong et al. 2016). In 2013, the US Department of Health and Human Services (HHS) established a framework on dual-use research of concern (DURC),1 which requires a case-by-case review of Highly Pathogenic Asian Avian Influenza A virus (HPAI) Hemagglutinin 5 Neuraminidase 1 (H5N1) GoF proposals submitted to US funding agencies. Other countries have so far left the regulation of the dual-use problem entirely in the hands of the scientific community. Scientific societies, funding organizations, journals and research institutions worldwide have released statements, established procedures, and published code of conducts (or ethics) in order to address DURC and GoF, including the European Academies Science Advisory Council (EASAC) (Fears and Ter Meulen 2015). Despite commendable initiatives such as the dual-use guidelines for EU funding, the absence of a general framework at EU level created practical problems: the Dutch government applied export control law for dual-use items (Council Regulation 428/2009) in 2012 in order to control the release of the H5N1 paper submitted for publication by the Dutch researcher Ron Fouchier (Enserink 2013). The law was originally designed to prevent the proliferation of nuclear, biological, and chemical weapons, thus banning the shipping of goods, pathogens, and technologies that could be used for nefarious purposes. It was the first time it was used to control the publication of a scientific paper. The use of export controls for this purpose has been rejected by experts in nonproliferation and by scientific experts alike as unfeasible (Enserink 2013; Palu 2014). In order to tackle the dual-use problem, researchers need to articulate and debate their views about what risks are acceptable and what research is truly needed. However, as European Society for Virology (ESV) and American Society for Microbiology (ASM) representatives recently pledged, there is also a need for increased stakeholder involvement beyond the scientific community, especially among ethicists, biosecurity experts, and public health representatives (Palu 2014; Imperiale and Casadevall 2015). There is a wide agreement that those scientists associated with controversial research cannot and should not be left alone with the decision whether to publish or conduct such research (Fears and Ter Meulen 2015; Imperiale and Casadevall 2015; Frank et al. 2016). This is especially important since previous studies with life science researchers—conducted before the publication of the H5N1 studies—have shown that most participating researchers were not even aware of the current debates and concerns about dual-use research (Dando and Rappert 2005; Kelle 2007; National Research Council 2009). Moreover, a qualitative study predominantly conducted at UK institutions revealed that most researchers did not regard bioterrorism or bioweapons as a substantial threat and did not believe in the contribution of life science developments to the problem of biological weapons and bioterrorism (Dando and Rappert 2005). A quantitative study led by the US National Research Council (National Research Council 2009) showed that most researchers did not regard their own research as dual-use even if it was categorized as ‘experiments of concern’ as described in the so called ‘Fink report’2. When asked about governance options, a majority of life scientists did not support regulations to control DURC (Dando and Rappert 2005; National Research Council 2009). However, a study investigating the impact of dual-use controls amongst UK life scientists working with agents listed in Schedule 5 of the 2001 Anti-Terrorism, Crime, and Security Act found that a majority of interviewees viewed regulation of DURC as worth considering, particularly through increased screening of personnel working with dangerous pathogens (McLeish and Nightingale 2005). The same study suggests that, at least in the UK, part of the scientific community was prepared to expend considerable effort on improving risk management with regards to DURC. In fact, in order to effectively implement regulatory measures, it is essential that researchers are aware of DURC issues and supportive of regulations. Thus, to get an insight into life scientists’ awareness, views, and perspectives on DURC in the aftermath of the H5N1 cases, we have interviewed international life scientists working within the fields of microbiology and virology/cell biology. In contrast to previous studies, all interviewees of our study were aware of the DURC debate. This article presents a summary of the main points raised and discussed during the interviews, including the types of regulation supported by the interviewees, the interviewees’ thoughts about the likelihood of misuse and feasibility of replicating sensitive research results, as well as their opinions about the publication and communication of sensitive research. Subsequently, we discuss the interviewees’ personal judgment about the extent to which individual scientists can be made accountable for actual misuse of research. Finally, we connect our results to the wider debate on DURC, discuss the major implications of our research results, and provide some preliminary recommendations for raising awareness on DURC in the life sciences through open debates, specific educational programs, and weighted regulatory interventions. 2. Methods We conducted and analyzed nineteen qualitative interviews with life scientists working in Switzerland. All scientists interviewed were actively working within the fields of microbiology or virology/cell biology. Purposive sampling was adopted in order to obtain a diverse selection of scientists from a selection of academic institutions (universities and other research institutions with microbiology/virology research groups) with diverse gender, age, nationality, and professional experience characteristics (PhD students, postdoctoral researchers, group leaders, and full professors). A total of twenty researchers were purposively selected from the homepages of the research institutions according to their research profile. Participants were contacted via e-mail outlining the research. The email contained the following information: (1) title of the study: ‘Attitudes, views and opinions of Swiss researchers in the life sciences working with pathogens towards biosafety and biosecurity issues’, (2) the foundation which financed the study, (3) the approximate length of the interviews, (4) anonymization of the interview content, as well as (5) the invitation to participate. During approximately one-hour long face-to-face interviews conducted at their institutions or at neutral meeting places, participants answered questions about biosecurity and biosafety issues associated with dual-use research in the life sciences. Besides the interviewers and the interviewee nobody else was present during the interview. The interviews were recorded in the time period between October 2012 and February 2013 using the Audacity software.3 In addition to sets of questions about their knowledge and attitudes about dual-use research, interviewees were given the details of two case studies: the H5N1 engineered bird flu virus and the mouse pox virus ( Jackson et al. 2001; Herfst et al. 2012; Imai et al. 2012), and invited to share their opinions about such case studies. Additional questions explored the issue of regulation, in particular, the problem of whether and in what form dual-use research should be regulated. The interview guideline was pilot-tested and was adapted during the first interviews. The interviews were conducted by the first author (S. E-G.). Interviews were transcribed verbatim in the original language spoken during the interviews (English, German, and Swiss German dialects) and were analyzed with the support of the analysis program Atlas.ti, Version 7.0. Participants were given the opportunity to review their interview transcripts. However, no participant made use of this option. A repetition of one or more interviews was not necessary. To improve comprehensibility, the language and grammar of the quotes used in the article were slightly adapted while preserving the original message. 3. Results Our analysis of the interviews with researchers in Switzerland identified four main recurrent themes related to the following questions: (1) Whether and how dual-use research may be regulated, (2) What is the probability and actual feasibility of misusing research results, (3) How research results should be communicated (to peers and public), (4) Whether researchers act responsibly or can be taught to do so. Conceptually, question 1 is linked to the issues of biosecurity and regulation; question 2 to the issues of misuse and replicability; question 3 is to the issues of publication, communication, and public opinion. Finally, question 4 regards the notion of responsibility and the role of ethics. Each of these core themes was further analyzed in detail. Although our interview study is limited in regards to sample size and generalizability, we believe that it provides valuable insights into the perspectives and views of the life science community. 3.1 Biosecurity At the conceptual level, the notion of biosecurity was confused with or conflated upon the notion of biosafety on more than one occasion. However, when the notion of biosecurity was explained or became intelligible to the interviewee, the relevance and significance of the debate over biosecurity and dual-use was immediately recognized by most researchers P1: ‘We have to discuss biosecurity; that is quite clear.’ While there was wide agreement about the necessity to discuss biosecurity issues, almost all researchers did not consider biosecurity4 aspects regarding their own work. The main reasons given by the interviewees for not having considered biosecurity issues were the low pathogenicity/virulence of the pathogen used in their research, its host spectrum (non-human pathogen), and the kind of research conducted (pathogen–host interactions, as well as characterization of genetic factors instead of targeted genetic engineering of the pathogen). One interviewee pointed out that the discussion was centered on the wrong issue. According to this person, biosafety issues are much more important and need to be discussed primarily when manipulating or creating human pathogens. This interviewee claimed that the threat of misuse is infinitely smaller compared to the potential of an accidental release of such a pathogen. 3.2 Regulation Regarding the issue of regulating research with possible dual-use potential, some interviewees questioned whether regulation per se can be considered a viable way of preventing risks. The practical impossibility of implementing and enforcing fair and functional regulations were given as main reason. For example, it was argued that it would be difficult to identify risky research and that such attempts could possibly result in an unfair blocking of someone’s research. Another issue raised by interviewees was the lack of means for sanctioning and thus stopping unwanted research. P10: ‘Are universities prepared to sack a professor as a consequence? What other options for sanctioning are there? I can cut their funding, however, that equals sacking. Do I want to lose the expert that may receive the Nobel Prize one day? … The idea [of regulation] is good; I just have doubts about its implementation ….’ Another common argument was that, in order to regulate efficiently, it would be necessary to implement similar regulations worldwide, which was considered difficult or even impossible. P10: ‘Science is dynamic, I mean, if one implements a strong regulation in one country prohibiting certain things, then one cannot prevent such research in other countries, which did not implement such regulation. I imagine it to be difficult to introduce a world-wide ban on certain research.’ Nevertheless, few interviewees argued in favor of no-regulation scenarios, for example, with risk-management being entirely in the hands of the individual researcher. The majority of interviewees recognized the value of regulation (see Figure 1). Their opinions, however, diverged with regard to the regulatory framework to adopt and the regulatory bodies entitled to enforce such regulation. Figure 1. View largeDownload slide Options for the regulation of DURC. The majority of interviewees (17 out of 19) indicated to be generally in favor of regulation. Of those more than half provided details on the preferred form of regulation. Figure 1. View largeDownload slide Options for the regulation of DURC. The majority of interviewees (17 out of 19) indicated to be generally in favor of regulation. Of those more than half provided details on the preferred form of regulation. The following regulatory frameworks were proposed: (1) case-by-case or (2) through general rules, implemented at different stages of the project, for example, at the (1) funding- and/or (2) publication-stage. P12: ‘I think that [regulation] also has to be put in place at the beginning of a research project. Because a grant proposal could turn into nothing, could be a complete failure, or could blow up into something like this. And I think, yes, that’s why it is the responsibility of the funding bodies. And I think, the journals …, I think it's good that they are there.’ While some interviewees argued in favor of a case-by-case framework with no general categorizations to allow for greater flexibility in research, others were in favor of general frameworks with e.g. general rules, codes of conduct, and definitions of sensitive research. Responses diverged with regard to the regulatory bodies entitled to enforce such regulations. Most researchers preferred self-regulation through institutions of the scientific community. The following options were advanced and discussed (1) Independent advisory body, (2) National funding agency, (3) Institute/University, (4) Government, and (5) Journals. One researcher, who argued for control on the level of publication, said P11: ‘There has to be some kind of agreement between publishers, there has to be some kind of general rule which applies to everybody. For example, … there has to be a clear definition of what is classified, … and there has to be some independent body that receives this kind of classified (information) from every journal. And they should decide (about who can access classified information). There has to be some independent body, independent of the journals, there has to be one superior level.’ Most importantly, regulation was seen in possible conflict with the principle of freedom of research. The activity of regulatory bodies, as recurrently addressed, should not limit or temper scientific freedom and scientific progress, but simply prevent significant risks of misuse. Science, many interviewees argued, cannot be feasibly stopped and should not be stopped, since the individual and collective freedom of scientists is a crucial determinant of scientific innovation and progress: P11: ‘The scientific system should control sensitive information better, but not at the cost of the benefits. It shouldn't stop research. What I mean is, I am somebody who supports genetic engineering and such things, but with tighter regulation. You cannot stop technology. You cannot stop science. I strongly believe that you just cannot stop science. So all you can do is control and be open about how you control things.’ An important theme in this context was the irrepressible character of the scientific endeavor, both in a descriptive and normative sense, making regulation substantially more difficult or even impossible as many interviewees argued. P7: ‘I think saying this should not be done or this should not be published is very difficult. Because it's something that is said before things are done, and you never know what will come out.’ In one occasion, the clash between the golden value of scientific freedom and the need for regulation generated a moral dilemma: P15: ‘I find it very difficult. The scientist in me says we should publish everything because we want to reveal things and the goal is to understand how exactly phenomena occur …. On the other hand, this is not without danger. At the end of the day, having some kind of committee entitled to decide yes or no, I would not know how – in a fair and independent manner – how there should be such a thing ….’ In this context, social responsibility linked to systematic regulatory models was identified as a possible solution to overcome the problem of unfair and biased regulation. In several occurrences, submitting research designs or results with dual-use potential to regulatory authorities on the initiative of the researcher him/herself was seen as a form of returning social responsibility to the individual scientist. For example, one interviewee argued P2: ‘I would have no problem if the granting agency would ask me after my proposal to explain the ethical issues of my own project, because this is a way to return the responsibility to me.’ While a certain degree of regulation was generally favored, potential pitfalls associated with excessive or ineffective regulations were frequently addressed. Potential dangers associated with regulatory interventions were linked to the problem of increasing bureaucracy and seen as an administrative obstacle to the generation of innovative scientific research: P2: ‘It's a bit scary, because you add one layer of control and that means administration.’ An overview of the interviewees’ opinions on the regulation of DURC is presented in Figure 1. 3.3 Misuse The potential for misuse was seen as a ubiquitous aspect of current bioscience research, based on the premise that virtually any piece of knowledge or technology can be, in principle, co-opted for nefarious purposes by malevolent agents. P3: ‘There will always be a risk. There will always be people who want to abuse something or who have malicious intentions and want to misuse research results. This ambivalence is not under our control that’s quite clear.’ Even more generally, the potential for misuse was not limited to research results but seen as an unavoidable feature of many kinds of information or objects, for example knifes, cars, and explosives. However, interviewees’ opinions diverged with regard to the identification of potential malevolent actors as well as to the prediction and assessment of risk. Three possible types of malevolent actors were identified on a gradual scale going from (1) individual agents to (2) organized groups, up to (3) governments and states. Many interviewees evaluated the possibility of misuse through individual actors or small groups as probabilistically low based on the recurrent assumption that bioterrorist projects, with the potential to cause serious harm to the population, require large-size and highly sophisticated infrastructures. These infrastructures, many interviewees observed, due to their volume and high financial costs, cannot be built and managed by individual actors or small groups, but only by large and well-funded organized groups, terrorist networks, or governments. As one interviewee argued: P1: ‘You need some kind of infrastructure, and I think that a small group can’t go that far. I could imagine, theoretically, if you think about the worst case scenario, that a state could do this somewhere secretly.’ One interviewee pointed out that strict biosafety measures need to be implemented to launch a bioterrorist attack against humans, which, in turn, poses an infrastructural problem in terms of a sterile, lab-like working environment and high-tech equipment. P7: ‘At first they need the infrastructure. They need a lab. And they are working with pathogens that can be transmitted to humans apparently. So they can't work in a lab like here (biosafety level 1-2), otherwise they could, they would get infected, and they could also die. So the infrastructure is not something that can be obtained easily, actually.’ In addition, many interviewees observed that there are much easier and more effective alternatives than bioterrorism; alternatives with the potential to cause selective or indiscriminate harm to the population. According to this stream of thought, potential microbiology-based bioterrorist practices would hardly be worth the effort in terms of financial cost, required equipment and expertise, as well as magnitude and predictability of the outcome as compared to other forms of terrorism. For example, an interviewee argued: P10: ‘I mean, if you have a cell culture and the respective molecular biology know-how, you can do that (produce the human transmissible bird flu virus), yes. But I mean there are so many things you can do. You can even make explosive substances by yourself, you just need to go on the internet and find all information about how to do it.’ 3.4 Replicability Our results show that the practical feasibility of bioterrorist attacks is not the only perceived factor at play in biosecurity risk assessment and management. Issues of replicability are critically relevant, too. Interviewees pointed out that, besides financial investments, equipment and infrastructures, as well as socio-cognitive factors such as expertise, knowledge transfer, and education are equally important. Interviewees diverged with regard to what level of education is required to effectively perform bioterrorist activities. While some argued that a master’s degree might be sufficient to provide malevolent actors with the necessary knowledge to pose biosecurity threats, the dominant position was that a PhD degree, hence a certain level of experience in laboratory research, would be required: P3: ‘A well-educated PhD student could do it (replicate the production of human transmissible H5N1 virus), one who works in virology and knows how to do mutations, and possibly works within a project involving viral mutations and recombinant viruses.’ The element of training was repeatedly underscored by interviewees, often in conjunction with the access to some tacit knowledge5 implicitly shared among microbiology/virology researchers and inherent to their routine laboratory work. As one argued P12: ‘If you're not trained it's impossible to do what we're doing. It's not difficult to get the knowledge. I'm not saying that we are geniuses. But you need training.’ 3.5 Publication The question of how sensitive research results should be communicated was strongly linked to the issue of scientific publication. A recurrent attitude among interviewees was to underscore the importance of scientific publications for science and the generation of knowledge. This idea was repeatedly expressed both at the descriptive and the normative level, in conjunction with the highly valued principle of freedom of science. A safety-motivated or security-motivated limitation of the publication of scientific results was dominantly seen as an obliteration of the freedom of science principle. One interviewee argued: P4: ‘Research is all about difficult questions. I would personally prefer to have free research and free publication.’ The conflict between the normative principle of freedom of research and the pragmatic problem of risk assessment was observed to result in a moral dilemma tackling the essence of the entire scientific enterprise. As one interviewee reported: P4: ‘My first reaction was: this is dangerous. However, if you look at the issue from a scientific standpoint, you need to live with this knowledge. Otherwise we can no longer do science. If you put restrictions on this, I don’t know where science goes.’ Results show that the same consequentialist and pragmatic approach was prevalent when facing the issue of censoring (key details of) the methods section when publishing highly sensitive research results, as recommended back in 2012 by the US National Science Advisory Board for Biosecurity (NSABB) (Fouchier et al. 2012) for the human transmissible bird flu virus H5N1. Most interviewees opposed this decision, underscoring its practical ineffectiveness: P1: ‘Censoring the methods section is a really, really weak measure. If you can’t figure out the methods on your own, you must be quite bad’. P11: ‘You don't really need the methods. The methods can be found online. So if you know the mutations, you know the methods. You just need the ingredients, which are not impossible to get, if you just spend some time and money.’ In a few occurrences, the importance of publishing research results unrestrictedly was not only advocated based on the normative principle of freedom of science, but also on pragmatic and consequentialist grounds. For example, few interviewees underscored how ‘knowledge is the only thing that gives us protection.’ As one interviewee argued P1: ‘Knowledge is our best defense and the lack of knowledge is the biggest danger; when it comes to biosecurity the greatest problem is ignorance.’ According to this person, few scientists in the world are familiar with the details of how to weaponize pathogens. Thus, it was argued that research with certain agents should be pursued to preserve the knowledge essential to distinguish sophisticated and effective from improvised and non-effective biological weapons. According to the interviewee such distinction between different forms of weaponization could mitigate or prevent unjustified panic in cases of non-effective attacks. 3.6 Communication and public opinion With regard to the issue of communicating research results and engaging with the public, interviewees highly underscored the importance of science dissemination and the public understanding of scientific research. Some researchers suggested that this could be obtained by having scientists actively involved in public communication P6: ‘They should have scientists as a public communication partner. Universities should do that (communicate scientific outcomes), not scientists. It's the duty of the university and they should do far more there. Explain, reduce the complexity, because this is a real job, it's a real task.’ The issue of public communication was frequently linked to the issue of social responsibility in science as well to the relationship between science and society. Interviewees often implicitly assumed that society is entitled a collective right-to-know, that is, the right to be collectively and adequately informed about the latest scientific updates. Based on this right, society should be entitled to play an active role in the decision-making about biosecurity P7: … ‘As scientists it is our responsibility to inform society. I do believe that society has to know what we are doing. For me it's not a problem that a commission would say “this should be done” or “this should not be done”. It is society, actually, that should decide. But if scientists don't know what is done in other labs, they cannot tell the population or society what is going on in certain labs. So for me that's why it's very important that there is communication and transparency. In the end society should know. For me it's the most important thing that scientists should do.’ However, many interviewees suggested that it will hardly be possible to eradicate sensationalism, which was suggested to have fuelled societal fears with regards to the H5N1 publications. According to those interviewees, scientists trying to advertise their research were partly responsible for the exaggerated communication of the H5N1 research results. 3.7 Responsibility With regard to the issue of responsibility, our results reveal conflicting attitudes. On the one hand, a dominant position among interviewees was to not consider individual scientists responsible for future misuse of their research results, in the light of the high number of uncontrollable variables affecting the reuse of information by third parties: P7: ‘The people that invented cars are not responsible for the deaths of people dying in car accidents.’ On the other hand, however, the individual responsibility of scientists was recognized and, in one occasion, even described as a self-evident implication of research. As one interviewee argued P2: ‘If one day something goes wrong, it would clearly be my responsibility.’ The complexity of such conflicting attitudes was further revealed by the interviewees’ opinions related to the issue of responsibility awareness. A dominant view among interviewees was to maintain that a few scientists do not have sufficient awareness of their social responsibility, and that they are sometimes driven by selfish motives leading them to make sensational claims with the purpose of maximizing their funding opportunities and diverting public attention onto their research: P6: ‘And some people, when writing a grant, of course they say it’s the most important topic on earth, because it's the most dangerous bug on earth.’ In response to this phenomenon, many interviewees have favored the development of strategies for raising or strengthening awareness among researchers about their social responsibility, hence their duty to engage in dialogue with the public and collaboratively address biosecurity issues: P2: ‘I think every scientist should ask these questions: Why am I doing this; Am I doing only good for mankind by studying this, or could it be bad in some respects?’ As one interviewee concisely argued P4: ‘There is a need for responsible communication.’ In general, the interviewees showed a prevalent interest in ethics and the recognition of the potential positive value of ethical training when facing biosecurity issues. Hence they favored the inclusion of ethics in the educational curriculum of scientists. Some interviewees argued that ethics training will not lead to a change in behavior since the way people act depends mainly on personality. No interviewee reported to have received training in dual-use issues during their own curricular or extra-curricular education. Finally, our results also show that training in ethics, in general, was frequently missing among researchers P7: ‘In my case it was definitely something that was missing. … I think it is important for scientists to really know that what they are doing can be used in one way or can be used in another way.’ An overview of the interviewees’ opinions on the importance of including education on ethics and biosecurity issues into life science curricula is presented in Figure 2. Figure 2. View largeDownload slide Opinions on including ethics education with biosecurity aspects into life science curricula. The majority of interviewees (14 out of 19) were in favor of including ethics education into life science curricula, especially courses that discuss biosecurity issues. Two interviewees did not share their opinion on this matter. Figure 2. View largeDownload slide Opinions on including ethics education with biosecurity aspects into life science curricula. The majority of interviewees (14 out of 19) were in favor of including ethics education into life science curricula, especially courses that discuss biosecurity issues. Two interviewees did not share their opinion on this matter. 4. Discussion Our interview study, although limited in regards to sample size and generalizability, provides valuable insights into the perspectives and views of the life science community. In contrast to studies conducted before the publication of the controversial H5N1 research (Dando and Rappert 2005; Kelle 2007; National Research Council 2009), our study results show that all scientists interviewed are well-aware of the DURC debate. Interviewees indicated to have followed the media coverage of the H5N1 controversy, in part due to their professional involvement with infectious agents. However, despite their awareness of the issue, the overwhelming majority of interviewees revealed a tendency to neglect biosecurity aspects with regard to their own work, showing how self-reflective attitudes might be less common and harder to internalize than general theoretical considerations. Thus, our results indicate that more effort is needed to facilitate reflection about dual-use issues regarding the scientists’ own work. When asked about their reasons for not having considered the biosecurity implications of their own research, most interviewees explained that they were not involved in GoF research. This is formally confirmed by the fact that none of their projects involved GoF or could be identified as DURC, as classified by the NSABB (NSABB 2007). There are, however, a variety of life-science experiments and technologies with dual-use implications, for example, the novel gene-editing technology CRISPR/cas96 (Gurwitz 2014; Clapper 2016), which do not fall under the current definitions. This was noted by one interviewee who recognized the dual-use potential of his/her work during the interviews.7 One reason why researchers might not reflect about biosecurity issues could lie in the narrow formal definition of DURC provided by the NSABB. In contrast to biosecurity issues, all interviewees had reflected about biosafety issues, which may in part be explained by the tight biosafety regulations in Switzerland (Schweizerische Eidgenossenschaft, Fassung vom 2015). When discussing biosafety and biosecurity issues, it became apparent that there was a major conceptual confusion between these two notions. We presume the reason for this confusion is twofold: first, many interviewees were German native speakers and in the German language the same noun (Biosicherheit) is used to refer both to biosafety and biosecurity. Therefore, the linguistic underpinning of some interviewees might have influenced their conceptual categories. Second, the conceptual confusion of some interviewees (including non-German speakers) might reflect a degree of conceptual ambiguity inherent in the biosafety–biosecurity debate, as already observed in media reports on this topic and even in the expert debate (Nordmann 2010; Imperiale and Casadevall 2015). However, this distinction became easily intelligible to the interviewee after the interviewer’s prompts. This shows that while some researchers may not be familiar with the relevant vocabulary or may even experience conceptual confusion, they still share a fair understanding of the problematics. These results are consistent with previous studies that recognized a similar discrepancy between the semantic and the conceptual level among other speech communities, such as Japanese researchers. The significance of the debate over dual-use in relation to both, biosecurity and biosafety, was immediately recognized by almost all interviewees. At the same time, all interviewees favored freedom of research and understood it as a core value of the scientific enterprise: research should and could not be stopped. Regulation was, however, accepted as a strategy to minimize unintended risks and collateral harms in DURC. Though hard to implement, regulatory strategies aimed at minimizing risks were welcomed by many interviewees, albeit the probability of occurrence of those risks was considered reportedly low. Interviewees based their judgment on financial and infrastructural limitations as well as on the easy accessibility of simpler alternatives for terroristic attacks. This seems to suggest that, by most interviewees, regulation per se was not perceived in direct conflict to free research, providing a more mitigated picture of the tension between freedom of research and national security identified in previous literature (Kennedy 2008; Minehata and Shinomiya 2010; Suk et al. 2011). A minority of interviewees, however, rejected any kind of regulation of dual-use issues and suggested to leave the responsibility to conduct or not conduct certain research in the hands of individual researchers or research groups. For all others, the real tension was rather seen between freedom of research, on the one hand, and suboptimal forms of regulation, on the other hand. From the perspective of interviewees these forms included regulatory interventions involving disproportionate administration, excessively pervasive, and strict oversight, or the lack of scientific expertise. Even when the clash between scientific freedom and the need for regulation generated a moral dilemma (P15), such dilemma seemed to be partly resolved by the weighted and thoughtful use of regulation. As responses diverged with regard to the regulatory bodies entitled to enforce such regulations, we infer that the main issue at stake in the context of DURC regulation is not whether DURC should be regulated but how it should be regulated. The fact that most researchers preferred self-regulation through institutions of the scientific community might indicate a preference for more narrow and small-scaled regulatory strategies. Large-scaled regulatory bodies might be seen as vehicles of disproportionate administrative work and tighter oversight, besides being out of the researchers’ control. Not surprisingly, the fear of increased bureaucracy, administrative obstacles and consequent delay in project development represent recurrent themes among interviewees. The high prevalence of feasibility concerns shows how the practical dimension of regulation can hardly be decoupled from its theoretical dimension. One main concern of interviewees was to find the right (scientific) expertise in order to be able to judge the biosafety and biosecurity threats associated with the particular research. Nevertheless, many interviewees argued in favor of establishing an independent advisory board at the national level. Besides scientific experts, our interviewees suggested to include ethicists, security experts as well as representatives from the public. Doubts about the effectiveness of implemented interventions seemed to be another major reason behind the general skepticism towards regulatory measures at the level of scientific publishing. For example, the suggestion of allowing publications while censoring the methods sections was largely seen as an ineffective strategy, since methods can be easily inferred from other sources such as textbooks (Engel-Glatter 2014). From the interviewees’ perspective, accordingly, the decision of the NSABB (Fouchier et al. 2012) to publish the papers about the human transmissible bird flu virus H5N1 but omit the methods was seen as a fruitless or even counterproductive strategy. Similar outcome-oriented evaluations were advocated when discussing the issue of risk assessment. It is important to note how the degree of risk perception and risk identification may vary among different stakeholders. For example, one interviewee provided an interesting example of how the societal and even professional perception of danger regarding a bioterrorist attack with a sophisticated bioweapon may vary: by recalling the panic created by the 2001 anthrax attacks in the USA, when postal letters containing anthrax spores were delivered to politicians and news media organizations. The interviewee underscored the difficulty of weaponizing anthrax. According to the interviewee, the production of infectious anthrax spores requires high expertise in microbiology as well as specialized large-sized and, thus, expensive equipment; hence it is hardly achievable outside the military setting. This description is in opposition to the assessment of other interviewees who underscored the alleged ease and feasibility of producing and weaponizing anthrax spores. Thus, the example illustrated how the societal or even professional perception of danger may be disproportionate to the actual threat. In order to prevent ‘misplaced’ societal fears associated with DURC, many interviewees saw communication and public understanding of science as a critical factor. Some interviewees argued that it is in the hands of society to discuss and decide what research should and should not be done. Engaging with the public by creating a dialog, thus reinforcing the trust in research institutions was understood as an essential measure to bridge the gap between science and society. This supposition is suggested, among others, by P6’s emphasis on the fact that public communication is ‘the duty of the university’ because ‘this is a real job, it’s a real task.’ In fact, some of the interviewees revealed to be actively involved in public communication projects. However, could improved and extended public communication indeed reduce societal fears of certain scientific advances? On the one hand, the positive link between public engagement and trust has been questioned by those involved in advocating, conducting, and evaluating public engagement practice (Petts 2008; Stilgoe et al. 2014). On the other hand, the lay press often sensationalizes scientific results in order to gain the readers’ interest, and in some cases, might lack the adequate scientific expertise.8 In addition, the ‘publish or perish’ culture in scientific research motivates some researchers to conduct and communicate critical research projects without careful reflection of the consequences. Many interviewees referred to this problem by suggesting, correctly, that Ron Fouchier, the corresponding author of one of the engineered H5N1 flu virus publications, had sensationalized his research results in order to gain attention for his research (Enserink 2011).9 The ‘publish and perish’ culture in science was also a major point with regard to the question whether individual scientists are responsible for the misuse of their research results. While many interviewees did not consider individual scientists responsible for future misuse of their research results, mostly due to the high number of uncontrollable variables affecting the reuse of information by third parties, some researchers recognized their responsibility and rated it as important. Interviewees’ conflicting attitudes toward responsibility indicate a tacit controversy among experts about the degree of responsibility and accountability of individual researchers. The controversial nature is further underscored by the fact that some interviewees, who did not consider themselves responsible for the future use of research results, also maintained that individual scientists usually do not have sufficient awareness of their social responsibility, and criticized that some scientists might be driven by selfish motives. The topic of responsibility was a major trigger of the interviewees’ interest in ethics and ethics training, which appeared frequently missing. While some interviewees argued that ethics training would unlikely lead to a change in behavior, many favored the inclusion of ethical training into the educational curriculum of scientists working in DURC in order to increase awareness of the issue. In summary, interviewees of our study addressed various key issues regarding the problem of DURC. First, while underscoring the importance of generating scientific knowledge, they also identified inherent risks in research, including misuse by third parties. In order to minimize unintended risks and collateral harms in DURC, interviewees suggested developing and improving biosafety approaches, such as vaccinations against newly created viruses. They also suggested implementing more rigorous evaluative criteria for demarcating and prioritizing truly beneficial research, although they acknowledged the difficulty of such demarcation. Generally, a majority of interviewees saw the need to regulate DURC, preferably through self-regulatory measures. Nevertheless, many were in favor of an independent advisory board on a national level. Interviewees also argued that researchers need to be aware of DURC issues and self-reflect about their research projects. Lastly, interviewees saw a need to improve the way research results are communicated to the public. These results are consistent with the findings and normative suggestions of previous studies on DURC. For example, Imperiale and Casadevall have summarized the main issues of the current debate on DURC and proposed a path forward (Imperiale and Casadevall 2015). Their suggestions include the acknowledgment of the inherent risks in research, the development of new biosafety approaches, and the creation of a national board to vet issues related to research with dangerous pathogens. The need to create a national advisory board has also been voiced by the German Ethics Council in a recent report on DURC, as well as by the ESV (Palu 2014). In line with suggestions made by our interviewees, the German Ethics Council additionally recommended to increase awareness of DURC, for example, through ethics teaching (German Ethics Council 2014). The inclusion of biosecurity sections in publications and funding applications, as suggested by our interviewees, was recommended by Gronvall as another possibility to increase awareness as well as self-reflection (Gronvall 2013; German Ethics Council 2014). Our results, thus, suggest that at least part of the life science community has reflected about DURC and sees a need for action. They also suggest that involving scientists in the debate is essential for identifying options to tackle the dual-use problem. Further participation of other groups, such as ethicists, security experts, and representatives from the society in the debate will ensure that viable options are chosen, and may increase the acceptance of regulatory measures by the scientific community. 5. Limitations This study has several limitations. While the use of a qualitative method allowed exploring a multifaceted topic in depth, due to the qualitative design, statistically representative conclusions cannot be drawn. Furthermore, the study sample may not have represented the full range of scientists’ views from the field of microbiology and virology on this topic, since it was limited in regards to sample size, as well as geographical and cultural variation. However, since the interviewees came from various countries (see methods section) and worked at international world-renowned institutions and in laboratories with outstanding international reputation, we believe that we have gained valuable insights into the views and perspectives of leading life scientists. In addition, selection biases might have affected the recruitment process. The study was announced under the title of ‘the attitudes, views and opinions of Swiss researchers in the life sciences, working with pathogens, towards biosafety and biosecurity issues’ and could have attracted scientists who have previously reflected about the topic of DURC. In addition, it could have stimulated interviewees to reflect on the topic before the actual interview. However, we are convinced that even despite these limitations the obtained findings already show a variety of well-differentiated attitudes which add significant knowledge about how scientists perceive DURC. Further research is required to compare the views, perspectives, and attitudes on DURC of researchers not only in the context of GoF but also among researchers working in other dual-use sensitive branches of the life sciences such as biochemistry, mycology, phytopathology, toxicology, neuroscience, synthetic biology, and genetic engineering (e.g. CRISPR/cas9).10 In fact, having these fields of research with disruptive potential in the upcoming decades, their biosecurity implications need to be urgently investigated and assessed. 6. Recommendations Our research results show that there is a need for raising awareness and opening a public debate among researchers on the dual-use implications of their research, even in the fields of virology and microbiology. In fact, although interviewees showed a basic familiarity with the topics of biosecurity and dual-use, yet their views appeared rarely self-reflective of their own work and far from being consolidated into a robust and coherent ethical framework. This absence of self-reflectiveness and thorough ethical analysis is problematic for a twofold reason. First, it fails to provide researchers with well-reasoned normative considerations and operative codes of conduct in their everyday research. Second, it fails to provide researchers with a comprehensive analysis of the dual-use problem not only in the context of their field of research, but also in the context of research conducted in other fields of science, using different (bio)technologies, or in the light of future research and technological applications. Findings from various areas of science have proven that dual-use concerns do not solely pertain to research with pathogens but also apply to other branches of the life sciences such as synthetic biology, biochemistry, cell biology, mycology, phytopathology, toxicology, neuroscience, etc. (Atlas and Dando 2006; McLeish and Nightingale 2007; Bennett et al. 2009; Suk et al. 2011; Tennison and Moreno 2012; Ienca and Haselager 2016). Raising awareness on these issues in various contexts and providing researchers with a comprehensive framework on the normative and practical implications of DURC and biosecurity would facilitate the development of highly generalizable and multidisciplinary perspectives on DURC. Such perspectives are particularly important not only for recognizing cross-disciplinary preventive strategies and safeguards, but also for anticipating future concerns associated to emerging technologies with multiple applications (e.g. CRISPR/cas9). In order to produce pervasive and long-lasting effects on the scientific community and society at large, this process of raising awareness on DURC and biosecurity should be included into the educational program of students at the graduate and undergraduate level as well as into the practical lab training of young researchers (together with the teaching of biosafety). For example, biosecurity-focused courses could be incorporated into the standard educational curricula of students and young researchers in the life sciences as part of their courses on bioethics, research ethics, scientific methodology, etc. This is consistent with the Article 23 of the Universal Declaration on Bioethics and Human Rights, where it is suggested that ‘States should endeavor to foster bioethics education and training at all levels’ in order ‘to achieve a better understanding of the ethical implications of scientific and technological developments, in particular for young people’ (UNESCO 2005).11 Bioethics and biosecurity education could be implemented in various ways. A promising example is the University of Bradford’s free available online ‘Educational Module Resource’ (EMR) to assist university-level lecturers to incorporate ‘biosecurity and dual-use issues into their life science courses at a higher education level’. Researchers at the Landau Network-Centro Volta (LNCV) have even advanced the idea of ‘compulsory biosecurity education’, not only with the purpose of preventing and combating ‘present and future threats’ but also for ‘helping relevant actors to fulfil their legal, regulatory and professional obligations and ethical principles’ (Minehata et al. 2013). In practice, the implementation of such educational programs often meets well-rooted obstacles inherent in the reform of academic curricula. These include the lack of space in existing curricula, the lack of time and resources available to institutions to develop new curricula, as well as limitations peculiar of biosecurity education such as the absence of expertise and available literature on biosecurity education, and the ‘general doubt and skepticism about the need for biosecurity education on the part of educators and scientists’ (Mancini and Revill 2008; Minehata and Shinomiya, 2010). However, there are promising examples of universities having successfully included biosecurity modules into life science education (Engel-Glatter et al. 2016; Maksymovych et al. 2015). The process of implementing biosecurity modules into life science curricula may be facilitated through the inclusion of biosecurity material into core life science textbooks such as ‘Molecular Biology of the Cell’ (Alberts et al. 2015). In addition, based on the views of interviewees, we believe that awareness could and should be raised through the weighted use of regulatory interventions aimed at maximizing biosecurity without thereby harming the freedom of researcher. The hypothesis of self-regulation by the scientific community and its institutions was highly welcomed by the researchers involved in our study. It should be part of the regulatory approach, not only as a pragmatic measure for reducing administrative work, but also as an instrument to raise social responsibility among researchers. For example, as suggested by one interviewee, a biosecurity section could be incorporated into the standard templates of research project submissions to funding agencies in a similar manner as other research ethics requirement such ‘conflict of interest’, ‘authorship declaration’, ‘re-use of data’, etc. It could include detailed explanations of why the work should be undertaken despite the potential risks, as well as detailed information on the safety precautions (Gronvall 2013). In particular, a protocol for the disclosure of DURC similar to current protocols for the disclosure of conflict of interest might a cheap, easy-to-implement, and effective solution. Such institutional regulatory interventions could occur at the level of funding agencies, research centers, and publishers and have already been implemented in the UK by the Biotechnology and Biological Sciences Research Council (BBSRC), the Medical Research Council (MRC), and the Wellcome Trust.12 However, there is a need to elaborate which kind of knowledge and expertise is essential to interpret and evaluate these sections. However, relying on self-regulation only will most likely not result in the implementation of measures effectively addressing the dual-use problem. Our interviews confirm that the dual-use topic creates a conflict between scientific freedom and promotion of security (Miller and Selgelid 2007; Selgelid 2009). The scientific community has little interest in delaying or even stopping potentially successful biomedical research in the light of biosecurity threats. The success of any other regulatory model will depend on the active participation and support of the scientific community. In 2014, Smith and Kamradt-Scott (2014) suggested tying funding to oversight and using a multidisciplinary approach for reviewing dual-use research. This model, despite concerns raised about its implementation, was welcomed by many of our interviewees. There are many obstacles impeding the successful implementation of such a regulatory model (Ehrlich 2014; Lev and Samimian-Darash 2014). However, with the combined effort of all stakeholders, including the scientific community, the security community, and government officials, it may be a viable option for regulating DURC (Jacobsen et al. 2014). Finally, we are aware that our study results are not generalizable; nevertheless we believe that our results might light a passable path forward. 7. Conclusion Since the recent re-ignition of the debate on DURC, issues of biosafety and biosecurity have been widely discussed in the literature on microbiology/virology. Yet, a high degree of uncertainty remains with regard to the problems of risk-assessment, prevention, and on the implementation of regulatory measures. To enrich the understanding of these problems, facilitate the debate on the topic of dual-use, and elicit effective preventive measures and appropriate regulatory interventions, we have investigated the views and attitudes on DURC in the life sciences of international life scientists in Switzerland. In contrast to previous studies, which reported a limited awareness towards DURC among researchers, our results indicate that scientists are generally aware of the problem, but often lack the self-reflective attitude necessary for identifying and assessing dual-use aspects related to their own work. The interviews also show that, although freedom of research is widely considered a non-negotiable value, most researchers are supportive of some form of regulation of dual-use research, including via external advisory boards. Linking empirical data with existing normative literature, we provided recommendations for raising awareness on DURC in the life sciences among researchers and the public, for improving risk-assessment among individual scientists, and for the selection of preventive strategies. These include creating a solid and generalizable theoretical framework on DURC, updating educational curricula in the life sciences as to include or expand bioethics and biosecurity training, and pondering a weighted use of regulatory interventions aimed at maximizing biosecurity without thereby harming the freedom of research. In implementing such strategies, a cooperative effort should be pursued in a manner that actively involves scientists, ethicists, security experts, and regulatory agencies on an equal footing. Funding This study was funded by the Forschungsfonds of the University of Basel and the Käthe-Zingg-Schwichtenberg Fonds of the Swiss Academy of Medical Sciences. Footnotes 1 Potential risks and benefits associated with Highly Pathogenic Asian Avian Influenza A virus (HPAI) Hemagglutinin 5 Neuraminidase 1 (H5N1) Gain-of-Function (GoF) proposals with strains that are transmissible among mammals by respiratory droplets are reviewed on a case-by-case basis by the funding agency and the US Department of Health and Human Services (HHS). According to the framework, such proposals are only acceptable if certain criteria are met. For more information please see <http://osp.od.nih.gov/sites/default/files/resources/NSABB_Framework_for_Risk_and_Benefit_Assessments_of_GOF_Research-APPROVED.pdf> accessed 29 Apr 2017. 2 In 2004, the US National Academies published the highly influential report ‘Biotechnology Research in an Age of Terrorism’ in an attempt to address concerns over life science research and to find a formal definition of dual-use research (National Research Council 2004). The committee was headed by Professor Gerald R. Fink, Professor of Genetics, Whitehead Institute, MIT, Boston, MA. 3 Audacity software is a free audio editor and recorder, more information available on the website <http://audacity.sourceforge.net/ about/> accessed 29 Apr 2017. 4 Biosecurity can be defined as: ‘… the sum of risk management practices in defense against biological threats’, which includes aversion of biological terrorism and other disease breakouts’ (Meyerson and Reaser 2002). 5 The term tacit knowledge was first coined by Polanyi who defined it as: ‘…things that we know but cannot tell.’ Polanyi (1962). In the context of the dual-use problem the role of tacit knowledge has been discussed by Engel-Glatter (2014). 6 CRISPR/cas9 stands for Clustered Regularly Interspaced Short Palindromic Repeats/CRISPR-associated protein 9, a genome editing technique which allows for targeted deletion and insertion of genes. 7 The project discovered strains of a certain pathogen that lacked the markers essential for detection with one of several possible detection technologies. The pathogen is listed on the select agents and toxins list <http://www.selectagents.gov/SelectAgentsandToxinsList.html> accessed 29 Apr 2017. During the interview, the interviewee explained that the intention of the research was not to provide a roadmap for terrorists, but to show the limitations of a particular detection technology. 8 See the blog of virology Professor Vincent Racaniello <http://www.virology.ws/2013/01/24/headline-writers-please-take-a-virology-course> accessed 29 Apr 2017. 9 The topic is explored by Professor Vincent Racaniello <http://www.virology.ws/2012/03/01/influenza-h5n1-is-not-lethal-in-ferrets-after-airborne-transmission> accessed 29 Apr 2017. 10 The CRISPR/cas9 gene-editing technology provoked intense debates about whether its application in some fields is ethical. These include gene editing of human embryos (Baltimore et al. 2015), gene editing of insects and their unintended release into the environment (Akbari et al. 2015), as well as the possibility that the technology could be applied by people with malevolent intentions to alter and release dangerous pathogens. 11 United Nations Educational, Scientific and Cultural Organization (2005) ‘Universal Declaration on Bioethics and Human Rights’ <http://portal.unesco.org/en/ev.php-URL_ID=31058&URL_DO=DO_TOPIC&URL_SECTION=201.html> accessed 29 Apr 2017. 12 The joint policy is available at < https://wellcome.ac.uk/funding/managing-grant/managing-risks-research-misuse> accessed 26 Apr 2017. Acknowledgements We thank Marianne Weber and Elisabeth Brunner for their help with the transcriptions. We also would like to thank all researchers who participated in the study for their time and their willingness to share their thoughts on the issue. Lastly, we would like to thank Prof. Bernice S. Elger for her support. References Akbari O. S., Bellen H. J., Bier E. et al.   ( 2015) ‘ BIOSAFETY. Safeguarding Gene Drive Experiments in the Laboratory’, Science , 349/ 6251: 927– 9. Google Scholar CrossRef Search ADS PubMed  Alberts B., Johnson A., Lewis J. et al.   ( 2015) Molecular Biology of the Cell, 6th edn. New York: Garland Science. Atlas R. M., Dando M. ( 2006) ‘ The Dual-Use Dilemma for the Life Sciences: Perspectives, Conundrums, and Global Solutions’, Biosecurity and Bioterrorism-Biodefense Strategy Practice and Science , 4/ 3: 276– 86. Google Scholar CrossRef Search ADS   Baltimore D., Berg P., Botchan M. ( 2015) ‘ Biotechnology. A Prudent Path Forward for Genomic Engineering and Germline Gene Modification’, Science , 348/ 6230: 36– 8. Google Scholar CrossRef Search ADS PubMed  Bennett G., Gilman N., Stavrianakis A. et al. ( 2009) ‘ From Synthetic Biology to Biohacking: Are We Prepared?’, Nature Biotechnology , 27/ 12: 1109– 11. Google Scholar CrossRef Search ADS PubMed  Casadevall A., Shenk T. ( 2012) ‘ Mammalian-Transmissible H5n1 Virus: Containment Level and Case Fatality Ratio’, MBio , 3/ 2: e00054– 12. Google Scholar CrossRef Search ADS PubMed  Clapper J. R. ( 2016), ‘Statement for the Record Worldwide Threat Assessment of the US Intelligence Community, Director of National Intelligence’, Washington DC < www.dni.gov/files/documents/SASC_Unclassified_2016_ATA_SFR_FINAL.pdf> accessed 03 May 2017. Dando M. R., Rappert B. ( 2005) ‘Codes of Conduct for the Life Sciences: Some Insights from UK Academia’, Briefing Paper No 16 (Second Series). Strengthening the Biological Weapons Convention, pp. 1– 27. Department of Peace Studies, University of Bradford <http://www.brad.ac.uk/acad/sbtwc/briefing/BP_16_2ndseries.pdf> accessed 29 Apr 2017. Ehrlich S. A. ( 2014) ‘ H5N1: A Cautionary Tale’, Frontiers in Public Health , 2: 117. Google Scholar CrossRef Search ADS PubMed  Engel-Glatter S. ( 2014) ‘ Dual-Use Research and the H5n1 Bird Flu: Is Restricting Publication the Solution to Biosecurity Issues?’, Science and Public Policy , 41/ 3: 370– 83. Google Scholar CrossRef Search ADS   Engel-Glatter S., Cabrera L. Y., Marzoukic Y. et al.   ( 2016) ‘ Teaching Bioethics to a Large Number of Biology and Pharma Students Lessons Learned’, Ethics & Behavior , 1– 21, doi: 10.1080/10508422.2016.1196361. Enserink M. ( 2011) ‘Scientists Brace for Media Storm Around Controversial Flu Studies’, ScienceInsider, pp. 1–2. American Association for the Advancement of Science , Highwire Press: Washington, DC <http://news.sciencemag.org/scienceinsider/2011/11/scientists-brace-for-media-storm.html> accessed 03 May 2017. Enserink M. ( 2013) ‘ Dual-use Research. Dutch H5N1 Ruling Raises New Questions’, Science , 342/ 6155: 178. Google Scholar CrossRef Search ADS PubMed  Fears R., Ter Meulen V. ( 2015) ‘ European Academies Advise on Gain-of-Function Studies in Influenza Virus Research’, Journal of Virology , 90/ 5: 2162– 4. Google Scholar CrossRef Search ADS PubMed  Fouchier R., Osterhaus A. B., Steinbruner J. et al.   ( 2012) ‘Preventing Pandemics: The Fight Over Flu’, Nature , 481/ 7381: 257– 9. Google Scholar CrossRef Search ADS PubMed  Frank G. M., Adalja A., Barbour A. et al.   ( 2016) ‘ Infectious Diseases Society of America and Gain-of-Function Experiments With Pathogens Having Pandemic Potential’, Journal of Infectious Diseases , 213/ 9: 1359– 61. Google Scholar CrossRef Search ADS PubMed  German Ethics Council ( 2014) Biosecurity—Freedom and Responsibility of Research. Opinion, Summary And Recommendations, pp. 1–10 . Berlin: German Ethics Council <http://www.ethikrat.org/publications/opinions/biosecurity> accessed 03 May 2017. Gronvall G. K. ( 2013) ‘Working Paper on H5N1: A Case Study for Dual-Use Research’, pp. 1–29 . New York: Council on Foreign Relations <http://www.cfr.org/public-health-threats-and-pandemics/h5n1-case-study-dual-use-research/p30711> accessed 03 May 2017. Gurwitz D. ( 2014) ‘ Gene Drives Raise Dual-Use Concerns’, Science , 345/ 6200: 1010. Google Scholar CrossRef Search ADS PubMed  Herfst S., Schrauwen E. J., Linster M. et al. ( 2012) ‘ Airborne Transmission of Influenza A/H5N1 Virus between Ferrets’, Science , 336/ 6088: 1534– 41. Google Scholar CrossRef Search ADS PubMed  Ienca M., Haselager P. ( 2016) ‘ Hacking the Brain: Brain–Computer Interfacing Technology and the Ethics of Neurosecurity’, Ethics and Information Technology , 18/ 2: 117– 29. Google Scholar CrossRef Search ADS   Imai M., Watanabe T., Hatta M. et al.   ( 2012) ‘ Experimental Adaptation of an Influenza H5 HA Confers Respiratory Droplet Transmission to a Reassortant H5 HA/H1N1 Virus in Ferrets’, Nature , 486/ 7403: 420– 8. Google Scholar PubMed  Imperiale M. J., Casadevall A. ( 2015) ‘ A New Synthesis for Dual Use Research of Concern’, PLoS Medicine , 12/ 4: e1001813. Google Scholar CrossRef Search ADS PubMed  Jackson R. J., Ramsay A. J., Christensen C. D. et al.   ( 2001) ‘ Expression of Mouse Interleukin-4 by a Recombinant Ectromelia Virus Suppresses Cytolytic Lymphocyte Responses and Overcomes Genetic Resistance to Mousepox’, Journal of Virology , 75/ 3: 1205– 10. Google Scholar CrossRef Search ADS PubMed  Jacobsen K. X., Mattison K., Heisz M. et al.   ( 2014) ‘ Biosecurity in Emerging Life Sciences Technologies, a Canadian Public Health Perspective’, Frontiers in Public Health , 2: 198. Google Scholar CrossRef Search ADS PubMed  Kelle A. ( 2007) ‘Synthetic biology & biosecurity awareness in Europe’, Bradford Science and Technology Report No. 9, pp. 1–31. Vienna: Synbiosafe <www.synbiosafe.eu/uploads/pdf/Synbiosafe-Biosecurity_awareness_in_Europe_Kelle.pdf> accessed 03 May 2017. Kennedy D. ( 2008) ‘ Science and Security, Again’, Science , 321/ 5892: 1019. Google Scholar CrossRef Search ADS PubMed  Kong W., Liu Q., Sun Y. et al.   ( 2016) ‘ Transmission and Pathogenicity of Novel Reassortants Derived from Eurasian Avian-Like and 2009 Pandemic H1N1 Influenza Viruses in Mice and Guinea Pigs’, Scientific Reports , 6: 27067. Google Scholar CrossRef Search ADS PubMed  Lev O., Samimian-Darash L. ( 2014) ‘ Biosecurity Policy in the US: A Critical Assessment’, Frontiers in Public Health , 2: 110. Google Scholar CrossRef Search ADS PubMed  Lipsitch M., Galvani A. P. ( 2014) ‘ Ethical Alternatives to Experiments with Novel Potential Pandemic Pathogens’, PLoS Medicine , 11/ 5: e1001646. Google Scholar CrossRef Search ADS PubMed  Maksymovych I. S., Gergalova G. L., Komisarenko S. V. ( 2015) ‘ Some International Projects on Increasing Knowledge in Biosafety and Biosecurity: Efforts in Ukraine’, Journal for Veterinary Medicine, Biotechnology and Biosafety , 1/ 1: 39– 43. Mancini G., Revill J. ( 2008) ‘Fostering the Biosecurity Norms: Biosecurity Education for the next Generation of Scientists’ . Dando M., Martellini M. (eds.), pp. 1- 28. Bradford Disarmament Research Center (BDRC), University of Bradford, UK; Landau Network-Centro Volta (LNCV), Como, Italy, <sro.sussex.ac.uk/39517/1/Fostering.pdf> accessed 03 May 2017. McLeish C., Nightingale P. ( 2005) ‘The Impact of Dual Use Controls on UK Science: Results from a Pilot Study’, Science and Technology Policy Research (SPRU) . University of Sussex <http://www.sussex.ac.uk/spru/documents/sewp132.pdf> accessed 03 May 2017. McLeish C., Nightingale P. ( 2007) ‘ Biosecurity, Bioterrorism and the Governance of Science: The Increasing Convergence of Science and Security Policy’, Research Policy , 36/ 10: 1635– 54. Google Scholar CrossRef Search ADS   Meyerson L. A., Reaser J. K. ( 2002) ‘ Biosecurity: Moving toward a Comprehensive Approach’, Bioscience , 52/ 7: 593– 600. Google Scholar CrossRef Search ADS   Miller S., Selgelid M. J. ( 2007) ‘ Ethical and Philosophical Consideration of the Dual-use Dilemma in the Biological Sciences’, Science and Engineering Ethics , 13/ 4: 523– 80. Google Scholar CrossRef Search ADS PubMed  Minehata M., Shinomiya N. ( 2010) ‘ Japan: Obstacles, lessons and future’, In: Rappert B (ed.) Education and ethics in the life sciences: Strengthening the prohibition of biological weapons , pp. 93– 114. Canberra: Australian National University E Press <http://epress.anu.edu.au/education_ethics.html> accessed 23 Aug 2017. Minehata M., Sture J., Shinomiya N., et al.   ( 2013) ‘ Implementing Biosecurity Education: Approaches, Resources and Programmes’, Science and Engineering Ethics , 19/ 4: 1473– 86. Google Scholar CrossRef Search ADS PubMed  National Research Council ( 2004) Biotechnology Research in an Age of Terrorism, pp. 1–164 . Washington, DC: The National Academies Press <https://www.nap.edu/catalog/10827/biotechnology-research-in-an-age-of-terrorism> accessed 03 May 2017. National Research Council ( 2009) A Survey of Attitudes and Actions on Dual Use Research in the Life Sciences: A Collaborative Effort of the National Research Council and the National Association for the Advancement of Science, pp. 1–188 . Washington DC: The National Academies Press <https://www.nap.edu/catalog/12460/a-survey-of-attitudes-and-actions-on-dual-use-research-in-the-life-sciences> accessed 03 May 2017. Nature ( 2015) Special on Mutant Flu <http://www.nature.com/news/specials/mutantflu/index.html> accessed 29 April 2017. Nordmann B. D. ( 2010) ‘ Issues in Biosecurity and Biosafety’, International Journal of Antimicrobial Agents , 36 (Suppl 1): S66– 9. Google Scholar CrossRef Search ADS PubMed  NSABB ( 2007) Proposed Framework for the Oversight of Dual Use Life Sciences Research: Strategies for Minimizing the Potential Misuse of Research Information, pp. 1–55. National Research Council <http://osp.od.nih.gov/office-biotechnology-activities/nsabb-reports-and-recommendations/proposed-framework-oversight-dual-use-life-sciences-research> accessed 03 May 2017. Palu G. ( 2014) ‘ Regulating Dual-Use Research in Europe’, Science , 343/ 6169: 368– 9. Google Scholar CrossRef Search ADS PubMed  Petts J. ( 2008) ‘ Public Engagement to Build Trust: False Hopes?’, Journal of Risk Research , 11/ 6: 821– 35. Google Scholar CrossRef Search ADS   Polanyi M. ( 1962) ‘ Tacit Knowing—Its Bearing on Some Problems of Philosophy’, Reviews of Modern Physics , 34/ 4: 601– 16. Google Scholar CrossRef Search ADS   Reardon S. ( 2014) ‘Viral research moratorium called too broad’. Nature News, 23rd October 2014  <http://www.nature.com/news/viral-research-moratorium-called-too-broad-1.16211> accessed 23 Aug 2017. Schweizerische Eidgenossenschaft (Fassung vom 01 June 2015) Verordnung über den Umgang mit Organismen in geschlossenen Systemen. Bern. <https://www.admin.ch/opc/de/classified-compilation/20100803/index.html> accessed 03 May 2017. Selgelid M. J. ( 2009) ‘ Governance of Dual-Use Research: An Ethical Dilemma’, Bulletin of the World Health Organization , 87/ 9: 720– 3. Google Scholar CrossRef Search ADS PubMed  Smith F. L.3rd, Kamradt-Scott A. ( 2014) ‘ Antipodal Biosecurity? Oversight of Dual Use Research in the United States and Australia’, Frontiers in Public Health , 2: 142. Google Scholar PubMed  Stilgoe J., Lock S. J., Wilsdon J. ( 2014) ‘ Why Should We Promote Public Engagement with Science?’, Public Understanding of Science , 23/ 1: 4– 15. Google Scholar CrossRef Search ADS PubMed  Suk J. E., Zmorzynska A., Hunger I. et al.   ( 2011) ‘ Dual-Use Research and Technological Diffusion: Reconsidering the Bioterrorism Threat Spectrum’, PLoS Pathogen , 7/ 1: e1001253. Google Scholar CrossRef Search ADS   Sutton T. C., Finch C., Shao H. et al.   ( 2014) ‘ Airborne Transmission of Highly Pathogenic H7N1 Influenza Virus in Ferrets’, Journal of Virology , 88/ 12: 6623– 35. Google Scholar CrossRef Search ADS PubMed  Tennison M. N., Moreno J. D. ( 2012) ‘ Neuroscience, Ethics, and National Security: The State of the Art’, PLoS Biology , 10/ 3: e1001289. Google Scholar CrossRef Search ADS PubMed  The White House Office of Science and Technology Policy ( 2014) ‘Doing diligence to assess the risks and benefits of life sciences gain-of-function research’ <https://www.whitehouse.gov/blog/2014/10/17/doing-diligence-assess-risks-and-benefits-life-sciences-gain-function-research> accessed 29 April 2017. Wei K., Sun H., Sun Z. et al.   ( 2014) ‘Influenza A Virus Acquires Enhanced Pathogenicity and Transmissibility after Serial Passages in Swine’, Journal of Virology , 88/ 20: 11981– 94. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

Journal

Science and Public PolicyOxford University Press

Published: Feb 1, 2018

There are no references for this article.