Evolution and behavioural responses to human‐induced rapid environmental change

Introduction

Almost all organisms on earth live in environments that have been altered, often drastically, by humans. Five major types of human-induced environmental change have been identified: habitat loss/fragmentation, the spread of exotic species, harvesting by humans, pollutants of various sorts and, of course, climate change (Rohr et al. 2006; Lockwood et al. 2007; IPCC 2007; Salo et al. 2007; Fabry et al. 2008). It can be hard to characterize precisely the complex, multi-dimensional nature of these environmental changes (e.g. in terms of their spatial scale, rapidity or evolutionary novelty), and there are clear differences between these five types of change. What they share, however, is that all are important forms of human-induced rapid environmental change (HIREC) that often put organisms into evolutionarily novel conditions, typically involving more rapid change than organisms have experienced in their evolutionary past (Palumbi 2001).

HIREC often alters species interactions and can cause species declines, including extinctions and range shifts (Parmesan et al. 1999; Thomas and Lennon 1999; Walther et al. 2002; Thomas et al. 2004; Jackson and Sax 2010). These changes are, in turn, driving evolutionary changes, including adaptive evolutionary responses to HIREC, speciation and hybridization (Hendry et al. 2011; Lankau et al. 2011). Some have projected that, largely because of HIREC, a large proportion of the earth's species will go extinct in the next 50–100 years (Tilman 1994; Stork 2010). At the same time, however, many other species (e.g. invasive and urbanized pests) are thriving, even doing better than ever, in the face of these same environmental changes. In many cases, species within the same genus, which seem identical until studied in detail, fall on opposite ends of this spectrum – some species are declining, threatened or endangered while congeners are invasive pests (e.g. Rehage and Sih 2004; Rehage et al. 2005; D'Amore et al. 2009). A key issue is thus to understand, from an evolutionary and mechanistic perspective, why some species cope so badly and others so well with HIREC.

An integrative, evolutionary perspective is presented in Fig. 1. In brief, the ability of species to cope with HIREC depends on the species' traits (see Hendry et al. 2011 and Lankau et al. 2011). This idea is an extension of one of the most basic tenets of modern biology – that traits influence the fitness of individuals and the success of species. If a species' evolutionary history results in traits (both fixed and plastic) that are suitable for coping immediately with HIREC, then the species should be able to persist in the short term. If evolutionary history has also produced traits or conditions that facilitate a rapid evolutionary response (e.g. short generation times, suitable genetic variation in key traits; see Hendry et al. 2011), then the species can track HIREC by evolving new adaptations. Indeed, if a species has a long history of surviving rapid environmental changes and fluctuations, then human-induced changes may not be particularly rapid or difficult to deal with by comparison. Thus, both the species' evolutionary past (via its effect on current traits and genetic variation) and future evolution should influence its ability to persist in the long term.
Given the number of large biotic and abiotic fluctuations and changes the biosphere has experienced since the emergence of life, and given that all living organisms are descended from ancestors that weathered these changes, we can expect some resilience to change. The important questions are quantitative: how rapidly are humans altering the environment, and which species are likely to cope?

Figure 1. Past environments provide the evolutionary history that shapes the sensory and cognitive processes controlling behaviour, as well as other traits and genetic variation. The fit of behaviour and other traits to novel environments (which might match or mismatch past environments) influences individual fitness, which governs population performance. Variation in fitness and genetic variation drives evolution that feeds back to determine future sensory and cognitive processes, behaviour, other traits and genetic variation. These, in turn, loop back to influence future fitness and population performance.

Of the traits that influence success in response to HIREC, behavioural plasticity plays a particularly important role. A meta-analysis of more than 3000 rates of recent phenotypic change suggested that most of the phenotypic changes observed in response to HIREC involve phenotypic plasticity rather than immediate genetic evolution (Hendry et al. 2008). Furthermore, behaviour appears to be important in explaining variation in species' abilities to cope with HIREC: maladaptive behaviours, such as consumption of novel toxic prey or failure to avoid novel predators, are implicated in species declines (Schlaepfer et al. 2002, 2005, 2010; Buchholz 2007; Sih et al. 2010), while more appropriate behavioural responses facilitate species invasions (Holway and Suarez 1999; Sih et al. 2010). Behaviour also interacts with evolution in two important ways. First, following the framework in Fig. 1, key behaviours have presumably been shaped by past evolution. Second, suitable behavioural plasticity can allow and shape future evolution of both the behaviour itself and other adaptive responses (e.g. life history or morphological responses) to HIREC (Baldwin 1896; Wcislo 1989; West-Eberhard 2003; Crispo 2007; Ghalambor et al. 2007). In essence, behaviour can facilitate the move from one adaptive peak (for past environments) to another (for new environments), or the ability of a species to track a shifting adaptive landscape (Price et al. 2003; West-Eberhard 2003).

We next review examples of behavioural responses to some main categories of HIREC. We do not attempt a comprehensive review of the relevant literature. Instead, our goals in the next section are to: (i) introduce readers to some main issues and patterns; (ii) organize the diverse set of issues associated with HIREC into four main categories; and (iii) highlight the fact that for each type of HIREC, some organisms are not coping well while others are thriving. After reviewing behavioural responses to HIREC, we present a conceptual overview of how we might explain the variation in response.

Behavioural responses to HIREC

Behavioural responses to HIREC fall into several main categories: (i) avoiding or coping with novel enemies (e.g. predators, parasites, diseases, including humans); (ii) adopting and utilizing novel resources or habitats; (iii) avoiding or coping with novel abiotic stressors (e.g. pollutants); and (iv) adjusting to changing spatiotemporal conditions (e.g. habitat fragmentation, climate change).
Below, we briefly review examples of each, with an emphasis on the variation in response (see Tuomainen and Candolin, in press, for a more comprehensive review of this literature). Some individuals or species exhibit maladaptive responses that can have serious detrimental consequences, while others show interestingly 'adaptive' responses despite the fact that HIREC is putting them into evolutionarily novel conditions.

Risks and resources

The spread of exotic species has exposed many animals to new enemies – new predators, parasites or diseases (Lockwood et al. 2007). As with many other changes discussed here, humans did not invent this form of change, but globalization means that species can invade more rapidly and from more distant sites than before (Hulme 2009). In some cases, the key new species is humans themselves. Animals sometimes show strikingly maladaptive responses to these novel enemies and thus suffer heavy mortality (Cox and Lima 2006; Sih et al. 2010). Notable examples include the lack of behavioural responses of island species (some of which are now extinct) to humans (Fritts and Rodda 1998; Cox and Lima 2006), of marsupial prey to foxes in Australia (Kinnear et al. 2002), of birds and other vertebrate prey to brown tree snakes (Wiles et al. 2003) and of various aquatic prey to introduced predatory fish (Knapp et al. 2001). In contrast, other prey have shown either innate recognition of and suitable responses to novel predators (e.g. Peluci et al. 2008; Epp and Gabor 2008; Rehage et al. 2009) or rapid adaptive learning about novel predators, including humans (Knight et al. 1987). Similarly, some animals do not appear to respond appropriately to exotic diseases (e.g. they do not avoid diseased conspecifics and do not reduce aggregation that can elevate transmission rates; Han et al. 2008), while other animals do change their behaviour in ways that reduce disease transmission (Behringer et al. 2006).

The aforementioned examples involve inappropriately weak responses to dangerous enemies. The flip side also applies – animals can over-respond to novel organisms that are not actually particularly dangerous. For example, many species apparently treat humans as potential predators (Frid and Dill 2002; Beale and Monaghan 2004) even when humans pose little risk. Numerous species probably over-avoid habitats used by humans, including areas used for eco-tourism and human recreation (Brown and Stevens 1997; Gander and Ingold 1997; Dyck and Baydack 2004; Gilroy and Sutherland 2007; Andersen and Aars 2008). Still other species fail to exploit suitable habitat even after humans extirpate their predators (Blumstein 2006).

A related phenomenon is the adoption of novel resources provided directly or indirectly by humans. For example, crops or ornamental plants represent a massive influx of often high-quality food for herbivores. Notably, while some herbivores shift to utilize these novel plants and thus in some cases become pests, many other potential herbivores (e.g. herbivorous insects found in the same areas, often feeding on related plants) do not use the new hosts (Samways and Lockwood 1998; Bossart 2003). Similarly, some gulls use garbage dumps, and some birds and squirrels avidly visit bird feeders (McKinney 2002), while others do not. The adoption of new hosts by some parasites, but not others, can depend on differences in behavioural plasticity (e.g. Bush 2009).
Of course, some novel, apparent resources are actually toxic and should be avoided. For example, some predators are suffering negative impacts as a result of consuming exotic, toxic cane toads in Australia (Hagman et al. 2009). These negative impacts can reflect not just an inappropriate tendency to consume exotic, toxic prey, but also inappropriate behavioural means of handling toxic prey. Adders, for example, are well adapted to handling toxic frogs by waiting for their toxins to degrade; however, this tactic is ineffective against cane toads because their toxins degrade slowly (Hagman et al. 2009).

Habitat loss and fragmentation

Habitat loss (habitat degradation, fragmentation), including urbanization/suburbanization, deforestation and conversion of wildlands to agriculture, is likely the most important cause of species declines and extinctions (Pimm and Raven 2000). Besides reducing habitat availability per se, habitat loss and fragmentation also cascade through species interactions, leading to increased competition (MacDonald et al. 2004) or increased predation (Schlaepfer et al. 2002) in the remaining habitat. Sources of fragmentation (e.g. roads – Laurance et al. 2008; Shepard et al. 2008) also impose barriers to adaptive movement through the landscape, which can reduce fitness. It is worth noting, however, that except when the habitat change is catastrophic (e.g. replacing a forest with a shopping mall), many forms of habitat change can actually provide new habitat for some taxa. Adoption of this new habitat (e.g. moving into urban/suburban areas) can be critical for long-term species persistence. For example, use of human-created stormwater ponds or artificial wetlands can be crucial for the persistence of amphibians (Brand and Snodgrass 2010). For species willing to move in with us, urban/suburban habitat often offers reduced predation risk and high food availability (Gilroy and Sutherland 2007). As noted previously, crop fields represent a bonanza of high-quality food. Given that habitat change can often be either good or bad for any given species depending on how it responds to the novel habitat, a key issue is to understand the variation in response. Why have some species become urbanized, while others have not (Blair 2001; Sol et al. 2002; Hamer and McDonnell 2008)?

Pollutants

We use the term 'pollutant' in a broad sense that includes chemical contaminants, but also changes in other abiotic conditions such as noise or light levels. Each of these has been shown to have adverse effects on organisms, often mediated by behaviour. While much of the work on chemical contaminants has focused on lethal effects or substantial developmental disruptions associated with relatively high chemical concentrations, much lower concentrations of metals, pesticides and other endocrine disruptors can alter a diverse array of behaviours, including predator–prey behaviours, mating and social behaviours, communication and learning (Clotfelter et al. 2004; Leduc et al. 2004; Zala and Penn 2004; Rohr et al. 2004; Fisher et al. 2006). Indeed, just as predators have major impacts on communities both through predation per se and through nonconsumptive effects (e.g. because of costly shifts in prey behaviour; Preisser et al. 2005), chemical contaminants have impacts both through direct lethal effects and via changes in behaviour (Rohr et al. 2006). Species (and genotypes) differ in their ability to cope physiologically (e.g. differences in LC50s or LD50s) and behaviourally (e.g. Andres et al. 2000; McComb et al. 2008).

Human activities often change environmental light levels (i.e. light pollution). On land, artificial light can cause sea turtle hatchlings to fatally move inland rather than to the ocean (Tuxbury and Salmon 2005) and can result in reduced activity (and thus reduced foraging success) at night (Kotler 1984; Baker and Richardson 2006) or shifts in the timing of activity (Miller 2006). In the water, human activities often cause increased turbidity, which can reduce feeding rates (Stuart-Smith et al. 2004; Ljunggren and Sandstrom 2007) and disrupt mating patterns (Seehausen et al. 1997; Candolin 2009). Importantly, this anthropogenic change can have either negative or positive impacts; for example, increased turbidity can provide enhanced safety for prey (Vogel and Beauchamp 1999; Ferrari et al. 2010a). Finally, humans make noise, which can have detrimental effects on animals via, for example, disruptions in acoustic communication. Notably, some (but not all) animals change their calls or calling patterns in an apparent attempt to maintain effective communication despite the disruptive background noise (Miller et al. 2000; Slabbekoorn and Peet 2003; Foote et al. 2004; Sun and Narins 2005).

Climate change and shifts in timing

Among the many ways that global warming can impact organisms is the need to shift the seasonal timing of major life history events. With global warming, spring arrives earlier than before. Accordingly, many animals have shifted to begin breeding earlier, often by arriving at breeding grounds earlier in the spring (e.g. Forister and Shapiro 2003; Berteaux et al. 2004; Both et al. 2004; Pulido 2007; Lyon et al. 2008). Notably, for many taxa, these changes in life history timing are so rapid that they almost certainly represent primarily behavioural plasticity rather than evolutionary change (Grieco et al. 2002; Gienapp et al. 2008). However, as with the other responses to HIREC discussed previously, not all animals are responding with suitable shifts. For example, studies have documented variation both between and within species in the propensity or rate of shift in seasonal timing of reproduction (Lyon et al. 2008; Reed et al. 2009). Animals that have not made an appropriate shift can suffer substantial population declines (Both et al. 2006; Ludwig et al. 2006).

Another notable effect of climate change is the effect of warming on sex ratio in taxa that exhibit temperature-dependent sex determination. In many reptiles, global warming could result in highly biased sex ratios that could negatively influence species persistence (e.g. Mitchell et al. 2008). A few studies have found that animals have responded to this risk by shifting their nest site choice (e.g. by preferring shaded nest sites in warmer regions; Doody et al. 2006). However, it is unclear whether other species are changing maternal behaviour rapidly enough to keep up with climate change (Morjan 2003).

The aforementioned examples illustrate the striking variation in the way species respond to human-altered environments. The ability to display appropriate behaviours is crucial in determining immediate fitness and perhaps short-term persistence in the face of HIREC. The key issue, then, is to better understand why some animals behave appropriately under novel conditions, while others do not.
Thus, in the following sections, we (i) suggest a mechanistic, sensory-ecological approach for studying and understanding variation in behavioural responses to HIREC, emphasizing evolutionary history and especially the match or mismatch between past and new environments; (ii) present a theoretical framework (detection theory) for making detailed predictions about how organisms should respond to novel stimuli and for evaluating those predictions experimentally; (iii) discuss how learning can affect responses to HIREC when organisms' initial behaviours are inappropriate, and note the role of evolutionary history in this aspect of environmental change; and (iv) note some complexities that might constrain the ability of species (or individuals) to exhibit suitable responses to HIREC. Finally, we discuss how behavioural responses to HIREC might affect longer-term evolution and how both behaviour and evolution might influence long-term species persistence.

Explaining variation in behavioural responses to HIREC

A conventional wisdom is that behavioural flexibility per se helps species cope with HIREC. For example, the literature on traits associated with invasive species (which are thriving with HIREC) suggests that successful invaders might often be generalists with high phenotypic plasticity and behavioural flexibility (Lodge 1993; Richards et al. 2006; Ghalambor et al. 2007). Indeed, experimental studies comparing closely related taxa that are invasive versus restricted in range have found evidence that invasive species are more flexible in their response to novel foods, predators or competitors (e.g. Rehage and Sih 2004; Rehage et al. 2005). In addition, broad comparative analyses show that invasiveness in birds and mammals appears to be related to behavioural flexibility that is, in turn, associated with larger brains (Sol 2005; Sol et al. 2008). Even more broadly, species or individuals that respond well to HIREC might either have a 'personality type' or behavioural syndrome (flexible, exploratory, bold or aggressive behavioural tendencies) that makes them better suited to coping with novel conditions (Sih et al. 2004; Cote et al., in press) or might exhibit greater within-species variation in behavioural tendencies that allows the species to cope well with environmental variation (Fogarty et al., in press). Conversely, species with low flexibility (e.g. specialists) often appear to be highly vulnerable to HIREC (Colles et al. 2009).

Another common idea is that an organism's evolutionary history plays an important role in explaining its response to novel conditions. At one level, differences between species in behavioural flexibility (or in plasticity in general) presumably depend on the species' evolutionary history. Species that have evolved with high spatiotemporal variability should be more likely to be plastic (Mayr 1974; Walther et al. 2002; Gabriel et al. 2005). With regard to particular traits and particular aspects of the organism's evolutionary history, a key might be the match (or mismatch) between the new environment and the organism's traits that were shaped by past selection (Ghalambor et al. 2007; Sih et al. 2010). For example, Blair (2001) suggested that (i) 'urban avoiders' that are sensitive to human disturbance might be species from habitats that are least like urban areas (e.g. old forests), (ii) 'urban adapters' are species that previously used forest edges along with associated open areas and have thus evolved the flexibility to use a diverse range of habitats, and (iii) 'urban exploiters' tend to be generalist omnivores that are 'pre-adapted' to live in human structures (e.g. rats, mice, pigeons). Along parallel lines, Fahrig (2007) suggested that understanding patterns of patch quality in a species' evolutionary past could help us predict its ability to cope with habitat fragmentation. Species that evolved with low-risk matrix habitats (areas between main habitats) move readily between patches and are thus susceptible to high mortality while moving through matrix habitat that is now much riskier owing to habitat degradation. Conversely, species that evolved with high-risk matrix habitats avoid moving between habitat patches and are thus susceptible to low immigration/colonization success in newly fragmented habitats.

A key challenge for using evolutionary history to explain behavioural responses to HIREC, however, is the fundamental difficulty of predicting how organisms should respond to novel conditions that they have not seen in their evolutionary past. One simple option for models of plasticity in 'extreme environments' is to assume linear extrapolation of the reaction norm (Chevin and Lande 2009). Alternatively, Ghalambor et al. (2007) noted that, while a reaction norm could be held taut like a string by selection across the range of current and relevant past environments, there is little more than pleiotropy holding it in place outside this range. Because of the lack of selection, we have little a priori basis for predicting what phenotype will be expressed in novel conditions. While this view provides the insight that many systems might harbour hidden, 'pre-adaptive' genetic variation that can facilitate future adaptive evolution in response to HIREC, it makes few concrete predictions about why some species exhibit immediately appropriate behavioural responses to HIREC, while others do not. For this reason, we suggest that it will be valuable to apply a more mechanistic approach incorporating the sensory and cognitive ecology of behavioural responses.

The sensory/cognitive ecology of behavioural responses to HIREC

We organize our discussion using the common approach in behavioural ecology of analysing behavioural processes as the outcome of three sequential stages: encounter (stage 1); detect, recognize and evaluate (stage 2); and respond (stage 3). Differences in overall response to HIREC can be understood by looking at variation (among individuals, populations or species) in exposure and response to HIREC at each of these three stages. While some organisms do not suffer negative impacts from HIREC because, by chance or choice, they simply do not encounter it, we focus on organisms that are, in fact, exposed to HIREC. What factors explain why some understand (detect, recognize and suitably evaluate) novel conditions, while others do not? For those that appear to recognize a novel stressor, why do some respond appropriately while others do not? Why do some prey recognize and respond appropriately to exotic predators while other prey do not? Why do some consumers recognize and successfully utilize new resources while others do not? Finally, how and why have some organisms, but not others, successfully evaluated climate change and shifted their phenologies accordingly?
The behavioural approach that we espouse draws from the literature on evolutionary traps (Schlaepfer et al. 2002, 2005, 2010; Robertson and Hutto 2006; Gilroy and Sutherland 2007; Part et al. 2007). The basic idea is that organisms have evolved cue–response relationships that are adaptive in their natural environments. Following Cosmides and Tooby (1987), Schlaepfer et al. (2005, 2010) refer to these cue–response, decision-making rules as Darwinian algorithms. Animals use these cue–response algorithms to evaluate habitat quality, food quality, danger or the appropriate time to begin breeding, and respond accordingly. A problem arises, however, if under novel conditions the previously adaptive cue–response relationship now results in a misevaluation of the environment or an inappropriate response. An ecological trap is the particular case of an evolutionary trap that involves habitat use – where organisms choose poor-quality habitats (sinks) over better available habitats because of errors in the evaluation of habitat quality.

Many examples of traps involve maladaptive habitat use (see Schlaepfer et al. 2002). Grassland birds nest in pastures that are mowed before chicks can fledge. Vultures are electrocuted when they perch on electric lines. Insects of various sorts attempt to oviposit on concrete or glass buildings whose visual properties resemble those of water (Kriska et al. 2008). Sea turtle hatchlings move away from their suitable ocean habitat when they follow human lights at night (Tuxbury and Salmon 2005). Other examples involve attraction to inappropriate foods. Sea turtles die when they consume plastic refuse. Humans crave fatty foods. Yet other examples involve attraction to inappropriate mates. A classic, widely seen photograph shows a male beetle with its genitalia extended, apparently trapped into attempting to mate with a brown beer bottle. Using a similar principle, biological control programs use sex pheromones as an evolutionary trap to kill pest insects. Other organisms, however, respond appropriately to novel cues. Our goal is to develop a conceptual framework for thinking about how evolutionary history produces cue–response relationships that explain relative ability to cope with HIREC. We split our analysis into the two relevant stages stated above (assuming that organisms have indeed encountered HIREC). Stage 2 involves understanding the cues that organisms use to evaluate environmental conditions, and stage 3 focuses on how organisms then respond to those cues.

Stage 2: Detecting, recognizing and evaluating novel environmental conditions

Our sensory/cognitive framework for explaining variation in how organisms evaluate a novel situation includes the following key points. Animals are more likely to respond to a 'novel' cue if it is similar to a cue that their ancestors responded to in the evolutionary past. We refer to this as the 'cue similarity' hypothesis (see Fig. 2A). Of course, 'similarity' is relative: whether a particular cue elicits a response depends, among other things, on whether organisms use broad, generalized criteria (Fig. 2C) as opposed to more specialized, precise criteria (Fig. 2B) for evaluating environmental conditions. While the notion of understanding how animals 'think about' their environments may seem suspect to many ecologists, it is a main topic of study for animal behaviourists and is often crucial for properly understanding behaviour (Shettleworth 2001; DeWaal 2008).
Figure 2. Influence of cue similarity, and of the use of general versus specialized cues, on recognition of novel cues. Shown are two-dimensional cue spaces. E = cues produced by a stimulus from a species' evolutionary past; N = cues produced by a novel stimulus. The circle or oval around each E is the cue space that elicits a response. (A) The new cue is similar to cues from the species' past, and the focal species uses specific cues to elicit a response. The species recognizes the novel stimulus. (B) The new cue is not similar to cues from the past, and the species uses specific cues. The species does not recognize the novel stimulus. (C) New and past cues are dissimilar, but because the species uses general cues, it recognizes the new stimulus. (D) Prey recognition of a predator depends on how they use multiple cues. Prey could be alarmed by either cue A or cue B (above a threshold level for either) or might require both A and B to be alarmed. Adapted from Sih et al. (2010).

To flesh out an example of the cue similarity hypothesis, consider prey responses to novel predators. Invasive, exotic predators often have major negative impacts on naïve prey, apparently because these prey often exhibit weak antipredator responses (Cox and Lima 2006; Sih et al. 2010). Other naïve prey, however, respond appropriately enough to survive encounters with predators that they have never seen before – neither within their lifetime nor in their known evolutionary past. To explain this variation, a first, and perhaps obvious, point is that prey response to novel predators should depend on whether the novel predators present cues that are similar to cues from predators that the prey have experienced in their evolutionary past. Although the ecologist may know, for example, that a particular predatory fish is novel, if prey already coexist with a similar predatory fish, the prey will likely show adaptive antipredator responses (e.g. Ferrari et al. 2007). In contrast, when exotic predatory fish invade (or are released into) habitats that lack any predatory fish, prey often show little or no adaptive response. Several studies indeed show that the magnitude of prey response to cues from novel predators is proportional to the similarity of the novel predators to native predators (Griffin et al. 2001; Ferrari et al. 2007; Stankowich and Coss 2007).

While cue similarity might often be related to taxonomic similarity, this need not always be true. Predators that are taxonomically related might put out quite different cues. For example, some prey avoid large, actively searching predators (Dill 1974; Sih 1986). These prey might not respond adaptively to a closely related, but exotic, smaller ambush predator. Conversely, predators that are distantly related might release similar cues. Prey that avoid large, active native predators should also respond well to even distantly related exotic predators that have a similar appearance and predation style.

The 'cue similarity' hypothesis clearly invokes an important role for evolutionary history and the match versus mismatch between past environments and the novel situations associated with HIREC. If prey have evolved with similar predators, they are likely to recognize a taxonomically exotic predator. Indeed, if prey use generalized cues to assess risk and have evolved with any predators, they appear to be pre-adapted to recognize a broad range of exotic predators (Blumstein 2006).
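One way to make the logic of Fig. 2 concrete is a toy model (ours, not drawn from the studies cited; the cue coordinates and radii below are invented for illustration): represent each stimulus as a point in a two-dimensional cue space, and let an animal respond whenever a novel cue falls within some generalization radius of a cue from its evolutionary past. Specialists correspond to a small radius (Fig. 2B), generalists to a large one (Fig. 2C).

```python
# Toy sketch of Fig. 2: whether a novel cue is recognized depends on
# (i) its distance from cues in the species' evolutionary past and
# (ii) how broadly the species generalizes around those past cues.
# All coordinates and radii are hypothetical.
import math

def recognizes(past_cues, novel_cue, radius):
    """True if the novel cue falls within the generalization radius
    of any cue from the evolutionary past (Euclidean distance)."""
    return any(math.dist(past, novel_cue) <= radius for past in past_cues)

past_cues = [(1.0, 1.0), (2.0, 1.5)]   # cues of (hypothetical) native predators
novel_cue = (3.0, 2.0)                 # cue of a (hypothetical) exotic predator

print(recognizes(past_cues, novel_cue, radius=0.5))   # specialist: False (Fig. 2B)
print(recognizes(past_cues, novel_cue, radius=1.5))   # generalist: True  (Fig. 2C)
# Fig. 2D's multiple-cue case would combine per-cue tests with any()
# (alarmed by cue A or cue B) versus all() (requires both A and B).
```

Cue similarity and generalization breadth thus jointly determine whether a novel stimulus is recognized at all.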
Cox and Lima (2006) took this a step further to suggest that a large-scale pattern – the fact that freshwater prey tend to be more susceptible than terrestrial prey to negative impacts from exotic predators – may be explained by general, habitat-driven differences in their evolutionary history with predators. Many freshwater habitats are ephemeral and/or fragmented (e.g. ephemeral or isolated ponds). Prey in these habitats typically lack an evolutionary history with vertebrate predators and thus often show little effective response to the introduction of exotic predatory fish. In contrast, until recently, terrestrial prey have lived in less ephemeral, less fragmented habitats that have generally contained important predators. As a result, these prey more often recognize and respond well to exotic predators.

Beyond cue similarity per se, prey responses to particular novel cues depend also on whether prey use broad, general cues as opposed to narrow, precise, specific cues to gauge risk. An example of a general cue is a chemical cue released by damaged conspecific (or heterospecific) prey (Chivers and Smith 1998; Ferrari et al. 2010b). Prey that use these general 'alarm cues' will respond to any 'sloppy' predator, including exotic ones. However, prey that use general cues to identify risk might also inappropriately exhibit antipredator responses to nonpredatory sources of damage. Somewhat more specific, but still quite broad, is the response of many aquatic prey to fish chemical cues (e.g. Binckley and Resetarits 2003; Sih et al. 2003). Although prey using these cues should respond to exotic predatory fish, they might also respond unnecessarily to nonpredatory fish (Langerhans and DeWitt 2002). Other general cues include avoidance of any large moving animal (Dill 1974; Sih 1986; Wisenden and Harter 2001) or avoidance of particular habitats even without direct predator cues (Verdolin 2006). In contrast to such general cues, many prey respond to either a more specific cue (e.g. Kotler et al. 1991; Jedrzejewski et al. 1993; Thorson et al. 1998) or a mixture of multiple cues (e.g. simultaneous detection of chemical cues from specific predators and damaged prey – Sih 1986; Chivers et al. 2002; Schoeppner and Relyea 2005; Brodin et al. 2006 – or both chemical and visual cues from a specific predator – Amo et al. 2004). Prey that rely on more specific cues, or that require both cue A and cue B to elicit an antipredator response (see Fig. 2D), should be more vulnerable to being 'trapped' into not responding to an exotic predator. Notably, studies have found within-species variation in how prey respond to the same risk-related cues (Sih et al. 2003; Brodin et al. 2006). Quantifying cue similarity between exotic and native predators, and the variation in the type and precision of cues used by native prey, should help explain variation in immediate responses to exotic predators. To emphasize: at this stage, we are only looking at factors that might explain whether organisms respond to novel situations at all. We consider variation in response suitability in a subsequent section.

Evolutionary history can also help us understand why some organisms use general versus specific cues. Again, we use prey responses to exotic predators as a format for explaining some general ideas (Sih 1992; Sih et al. 2010). Prey that use more general cues are not only more likely to respond to a novel predator but also more likely to respond unnecessarily to nondangerous stimuli. In contrast, prey that rely on more specific cues are less likely to waste time and effort on unnecessary responses, but run the risk of not responding to an actually dangerous novel predator. The balance between these competing selection pressures depends in part on their relative benefits and costs (see the one-line expression below). If, in their evolutionary history, prey have had effective means of escaping attack by their native predators, then the cost of using specific cues, and thus ignoring potential danger until the last second, has been relatively low. The benefit of using specific cues should be particularly large if the costs of over-responding to risk are high (e.g. if food is scarce and only found outside of refuges). Thus, under these conditions, we expect prey to evolve the use of specific cues for evaluating risk. In contrast, prey that have difficulty escaping predators should favour more general cues because they cannot afford the mistake of under-responding to predators. More generally, in an uncertain environment with imprecise cues, asymmetries in the costs and benefits of under-responding versus over-responding should help explain the use of general versus specific cues.

Our basic sensory/cognitive framework for stage 2 can also be applied to other issues concerning responses to HIREC. For example, all around the world, humans have provided large amounts of novel resources for herbivores in the form of crops or ornamental plants. Only a small proportion of all the herbivores that encounter these novel plants shift to utilize them. Those that have made the shift sometimes become economically important pests. Although many studies have focused on the ecology of crop pests, surprisingly few have examined why some herbivores use a particular crop while others, sometimes closely related herbivores, do not (but see Samways and Lockwood 1998). A better understanding of this issue could be useful for pre-emptive pest management. Our framework suggests that a first step is to look at the cues herbivores use to detect and recognize a plant as a suitable host. A large literature shows that herbivore diet choice often depends on a complex blend of chemical (and in some cases, visual or textural) attractants and deterrents (Dethier 1980; Futuyma and Moreno 1988). Interestingly, if plants lack the appropriate attractants or mix of attractants, herbivores often ignore plants that they could thrive on (Dethier 1980; Bruce et al. 2005). Our framework predicts that herbivores should be more likely to shift to use novel plants if: (i) the novel plants share similar attractants/deterrents with hosts already used by a given herbivore; or (ii) the herbivore is a generalist in the sense of having broad, catholic criteria for plant acceptability, as opposed to requiring a specific, narrow set of cues (see Fig. 2). These points may seem obvious; however, the framework still has value in focusing attention on behavioural mechanisms and provides a good starting point for further refinement in particular systems. More broadly, the basic framework of focusing on both cue similarity and generalized/specialized use of cues can help us understand variation in adoption of any novel potential resource – good or bad. For example, it can help us understand which parasites shift to use novel hosts, which predators consume exotic prey (including toxic ones like cane toads) and which consumers are susceptible to being 'trapped' into eating inappropriate foods like plastic garbage.
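The under- versus over-responding asymmetry described above can be written down in one line. As a minimal formalization (ours, with deliberately simple assumptions): suppose a stimulus is dangerous with probability p, responding to a harmless stimulus costs c_FA (a false alarm), and ignoring a dangerous one costs C_miss. Responding is then favoured whenever the expected cost of ignoring exceeds that of responding:

```latex
p\,C_{\mathrm{miss}} > (1-p)\,c_{\mathrm{FA}}
\quad\Longleftrightarrow\quad
p > p^{*} = \frac{c_{\mathrm{FA}}}{c_{\mathrm{FA}} + C_{\mathrm{miss}}}.
```

Prey with effective escape abilities have a modest C_miss and hence a high threshold p*; only precise, specific cues push the estimated probability of danger that high, so specific cues are favoured. Prey that cannot escape have a very large C_miss and a p* near zero, favouring broad, general cues. This is the same cost asymmetry that the 'bias' parameter of detection theory, introduced in the next section, captures quantitatively.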
The framework also suggests issues to study to explain why some organisms use human-altered habitats while others do not (Gilroy and Sutherland 2007). Finally, a cue-based approach can help explain variation in response to climate change. For example, while some birds have shifted their seasonal timing of breeding, others have not (Lyon et al. 2008). A simple cue-based idea is that organisms that rely primarily on photoperiod cues to set their seasonal timing will not show a plastic response to changing temperatures, while species that use temperature per se, or a combination of temperature and photoperiod cues, should exhibit a more rapid plastic response. Evolutionary history, including both adaptation and the possibility of nonadaptive phylogenetic inertia, can help us understand variation among species in their cue–response systems relative to seasonal timing of reproduction (Hahn and MacDougall-Shackleton 2008).

A quantitative framework for evaluating responses to novel cues

Above, we outlined general concepts regarding the importance of various features of cues and how those features translate from the environments in organisms' evolutionary histories to those affected by HIREC. We next outline a framework that empiricists and theoreticians can use to quantify these concepts: signal detection theory, originally proposed by Tanner and Swets (1954). Because this theory has been extended well beyond the interpretation of signals per se (e.g. when psychologists study memory), following Macmillan and Creelman (2005), we simply refer to the body of theory as 'detection theory'. This theory has helped ecologists clarify a number of issues, from trade-offs involved in foraging (Rechten et al. 1983) and antipredator behaviour (Ings and Chittka 2008) to the maintenance of phenotypic plasticity (Getty 1996). It has also helped address a broad range of questions that play a major role in understanding responses to HIREC. Does pollution change response rates by overwhelming the animal's sensory systems or by changing the perceived costs and benefits of behaviours (Bushnell 1997)? How are cues integrated (Massaro and Friedman 1990)? How do changes in cue intensity affect detection/recognition versus response (Terman 1970)? What components of the sensory and decision-making process change after learning (Friedman et al. 1968)? What roles do inattention and fatigue play (Benjamin et al. 2009)? How do organisms seem to weigh the relative costs of Type 1 and Type 2 errors (Getty and Krebs 1985)? Thus, by helping us model and evaluate animals' decisions, detection theory can also help explain and predict variation in responses to HIREC.

Detection theory provides two sets of methods of interest to ecologists studying responses to HIREC: a family of statistical modelling techniques that enable inferences about animals' decision-making processes from experimental data, and a way of determining optimal behaviour and estimated fitness under information constraints. Ecologists often use this second form of detection theory on its own (Getty 1996; Rodríguez-Gironés and Lotem 1999; Trimmer et al. 2008), but it is most powerful when combined with experiments that show how organisms actually behave (e.g. Getty and Krebs 1985). In detection theory models, the organism's response is ultimately determined by where perceived cue intensity falls in relation to one or more thresholds.
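To make the threshold idea concrete, here is a minimal sketch of the standard equal-variance Gaussian detection model (our illustration; the parameter values are arbitrary and not taken from any of the studies cited). Perceived intensity is drawn from a 'noise' distribution N(0, 1) when no stimulus is present and from a 'signal' distribution N(d′, 1) when it is, where d′ is discriminability; the organism responds whenever perceived intensity exceeds a threshold. Sweeping the threshold traces out the ROC curve described next.

```python
# Equal-variance Gaussian detection model (illustrative sketch).
# Perceived intensity ~ N(0, 1) without the stimulus ("noise") and
# ~ N(d', 1) with it ("signal"); respond if intensity > threshold.
from scipy.stats import norm

def response_rates(d_prime, threshold):
    """Return (hit rate, false-alarm rate) for one threshold setting."""
    hit_rate = 1 - norm.cdf(threshold, loc=d_prime)
    false_alarm_rate = 1 - norm.cdf(threshold, loc=0)
    return hit_rate, false_alarm_rate

# Each threshold gives one point on the ROC curve for d' = 2.
for threshold in (-1.0, 0.0, 0.6, 1.0, 2.0):
    h, f = response_rates(d_prime=2.0, threshold=threshold)
    print(f"threshold {threshold:+.1f}: hits {h:.2f}, false alarms {f:.2f}")
```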
After repeatedly exposing organisms to different cues (or to different combinations of cues) under different conditions and observing their overall response rates, the researcher plots a receiver operating characteristic (ROC) curve through the observed response rates (Fig. 3). The shape of this curve allows us to infer two parameters – discriminability and bias – that describe decision-making mechanisms (Fig. 4).

Figure 3. Three receiver operating characteristic (ROC) curves with discriminabilities of 0, 0.5 and 2. When discrimination is impossible (discriminability = 0), stimuli cannot affect behaviour, and the rate of successful detections equals the background response rate (1:1 line). As discriminability improves, these two rates can diverge and the ROC curve bows up and to the left. Organisms' response probabilities are also influenced by their response bias – the level of confidence required to induce a response – which depends on the slope. The 'X' marks the organism considered in Figs 4 and 5, with discriminability = 2 and bias = −0.4.

Figure 4. The inferred distributions of perceived intensities from stimuli (right curve) and nonstimuli (left curve) for the organism marked in Fig. 3. Discriminability is the relative distance between the curves and corresponds to low overlap, while bias is the strength of evidence required to provoke a response, corresponding to the relative height of the curves. The hatched areas under each curve correspond to the organism's response rate for the corresponding scenario (i.e. its x and y coordinates in ROC space).

Discriminability tells us how well the organism is able to distinguish between two environmental states. High discriminability manifests in an ROC curve by pulling the curve towards the upper left-hand corner, where the organism is able to respond appropriately at much better than random rates. When discriminability is zero, the organism's response probabilities are equivalent regardless of context. Discriminability is thus not about how often the organism responds to the cue per se, but rather how well it can identify the cue and use it to set its response rates. For a given level of discriminability, it remains up to the organism whether it responds to the strongest 1% of stimuli it observes or the strongest 90%. This is where the second parameter, bias, comes in. Bias tells us how much information is required from a cue to induce a response – where the organism falls along the ROC curve determined by its sensory system (Fig. 3). An organism's optimal bias can be calculated as the point where the marginal benefit from increasing sensitivity, in terms of increased detections (reduced Type 2 error), exactly balances the marginal cost from increased false alarms (increased Type 1 error). The organism's actual bias is the log of the slope of the ROC curve at a particular point, which means that ROC analysis allows experimentalists to evaluate the perceived costs of Type 1 and Type 2 errors for their organisms in novel environments (Wickens 2001).

With information about these two parameters, ecologists can make predictions about how organisms will respond to novel stimuli. For instance, organisms without an evolutionary history of dealing with predators are unlikely to discriminate well between predatory species introduced by humans and nonthreatening native species.
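Under the same equal-variance Gaussian assumption, both parameters can be recovered from a single pair of observed hit and false-alarm rates (a sketch only; fitting a full ROC curve uses several operating points, e.g. via confidence ratings; Macmillan and Creelman 2005):

```python
# Infer discriminability (d') and bias from observed response rates,
# assuming the equal-variance Gaussian model sketched above.
from scipy.stats import norm

def infer_parameters(hit_rate, false_alarm_rate):
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(false_alarm_rate)
    d_prime = z_hit - z_fa                 # discriminability
    criterion = -0.5 * (z_hit + z_fa)      # bias as Gaussian criterion c
    log_beta = d_prime * criterion         # log slope of the ROC at this point
    return d_prime, criterion, log_beta

# E.g. an animal that flees on 92% of predator trials and 27% of
# predator-free trials (made-up rates):
d, c, lb = infer_parameters(0.92, 0.27)
print(f"d' = {d:.2f}, criterion = {c:.2f}, log(beta) = {lb:.2f}")
# -> d' ≈ 2.0 and criterion ≈ -0.4, close to the organism marked in
#    Fig. 3, if its 'bias' is read as the Gaussian criterion c.
```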
Detection theory allows us to quantify such predictions and to distinguish them from related hypotheses about bias (e.g. that predatory exotic species are perceived as different but are not perceived as especially dangerous). Well-defended species that historically paid a heavy foraging cost for hiding may have a stronger bias against avoiding the predators they are able to detect than lightly defended species that paid a lower cost while in refuge. More generally, we expect that novel cues will show a wider range of discriminabilities than the cues with which the organism has coevolved, with some novel cues acting as supernormal stimuli and dominating animals' decisions (high discriminability) and others (e.g. cues that were absent or unimportant during evolutionary history) being ignored entirely (little to no discriminability).

Detection theory can also sharpen our intuition for how organisms will modify their behaviour in response to these changes in information quality. For example, consider two possible responses to an exotic predator that is more difficult to distinguish from nonpredators than its native counterpart. If prey become aware of their inability to detect the novel predator effectively and maintain their level of bias, then under these new circumstances they will begin to flee habitats they would previously have considered safe, striking a better balance between missed detections and false alarms. Alternatively, if prey maintain their threshold, they avoid increased false alarms at the cost of an increased predation rate. Both of these responses are depicted in Fig. 5. Maintaining a constant threshold as discriminability declines pushes a species' position on the ROC curve straight down. Alternatively, maintaining a constant bias pushes the species along a curved path towards the upper right or lower left corner, where it either always responds or never responds (negative or positive bias), regardless of the true state of the environment. These different strategies have fitness consequences: if exotic predators present the same danger as native ones (i.e. merit the same response bias), adjustments will allow prey to respond approximately optimally to these novel threats. Otherwise, prey may pay a foraging cost for nothing.

Figure 5. Here, the ability of the organism from Figs 3 and 4 to discriminate stimuli from nonstimuli decreases from 2 to 0.5. If the organism maintains the threshold intensity required to induce a response, its background response rate remains unchanged as it moves down through ROC space. If the organism instead takes its poorer discriminability into account and maintains a constant bias, it must adjust its response rates by following the curved arrow as described in the text.

These two scenarios represent only a small sample of the possible outcomes that could be illuminated by detection theory. Cues vary along many axes (Rowe 1999; Hebets and Papaj 2004), and detection theory provides techniques for assessing different components' effects on discriminability (Wiley 2006). More generally, ecologists could study how discriminability and bias change in different situations (e.g. different cue intensities, different relationships between multiple cues, different levels of background noise or available food). This could allow ecologists interested in HIREC to answer the questions above, and more. For instance, not only could we estimate an organism's perceived costs of Type 1 and Type 2 errors in possible encounters with a predator, we could also see how those factors change with cue intensity, when cues are masked by pollutants or when other sources of resources or stress are varied.
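As one worked example of these two strategies (our sketch of the Fig. 5 scenario, using the rates inferred above and otherwise invented numbers), suppose discriminability falls from 2.0 to 0.5 when an exotic predator replaces a native one:

```python
# Fig. 5 scenario, sketched: discriminability drops from 2.0 to 0.5.
# Strategy 1 (constant threshold): the false-alarm rate is unchanged,
#   but the hit rate collapses (straight down in ROC space).
# Strategy 2 (constant bias): keep criterion c, so the threshold
#   (= d'/2 + c) shifts; more hits are bought with more false alarms.
from scipy.stats import norm

def rates(d_prime, threshold):
    return 1 - norm.cdf(threshold - d_prime), 1 - norm.cdf(threshold)

old_d, new_d, criterion = 2.0, 0.5, -0.4
old_threshold = old_d / 2 + criterion           # = 0.6

h1, f1 = rates(new_d, old_threshold)            # constant threshold
h2, f2 = rates(new_d, new_d / 2 + criterion)    # constant bias
print(f"constant threshold: hits {h1:.2f}, false alarms {f1:.2f}")
print(f"constant bias:      hits {h2:.2f}, false alarms {f2:.2f}")
```

With these assumed numbers, holding the threshold keeps false alarms at about 0.27 but lets the hit rate fall from 0.92 to about 0.46, whereas holding bias recovers hits of about 0.74 at the price of false alarms rising to about 0.56, i.e. the foraging cost discussed above.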
We could also evaluate whether the changes are adaptive given the information available, or whether the organism could do better. Finally, models can address how detection might evolve across generations (e.g. Oaten et al. 1975), which could help ecologists make longer-term predictions about the effects of HIREC.

Stage 3: Effectiveness of postevaluation responses

After detecting, recognizing and evaluating a novel situation, organisms still face the challenge of exhibiting an appropriate response. For instance, recognizing that a non-native predator is dangerous is a necessary, but not sufficient, step to ensure prey survival. To survive, prey must also respond appropriately to the non-native predator. Some studies have documented inappropriate prey escape responses to novel predators. For example, native water voles in Europe have an innate fear of introduced American mink and respond by hiding in burrows. However, this response is ineffective against female mink, which are small enough to get inside the burrows, so water voles still suffer heavy predation (MacDonald and Harrington 2003). As with prey evaluation of predators, we predict that the similarity of novel predators to predators that prey have experienced in the past should be critical; at this stage, however, the important issue is the predator's foraging/attack mode and thus the prey's appropriate escape response (Sih et al. 2010). Predators that use an attack mode that is new to naïve prey should be most dangerous. For example, while going under rocks and burrowing into the substrate can be an effective response for snails against predatory fish, crayfish readily forage under rocks and in the substrate. When crayfish invade areas that have had fish but not crayfish, some snails respond to chemical cues from crayfish, but because they respond inappropriately (by going under rocks and burrowing into the substrate), they still suffer high predation rates (J. Stapley, B. C. Ajie and A. Sih, unpublished data).

As with detection/recognition, a second key issue is whether prey use generalized or specialized antipredator responses (Lima 1992; Matsuda et al. 1994; Sih et al. 1998, 2010). Examples of specialized prey responses include microhabitat shifts or escape behaviours that are highly effective against some predators but unfortunately increase susceptibility to another species (Kotler et al. 1992; Warkentin 1995; Sih et al. 1998; Relyea 2003). For example, mayflies that flee bottom-foraging stonefly predators by entering the water column experience an increased chance of fish predation (Soluk and Collins 1988). Although prey might have evolved to adaptively balance the conflicting demands of responding to multiple native predators, it would not be surprising if prey often exhibit inappropriate specialized responses to an exotic predator. A generalized response might then be favoured, even if it is less effective than a given specialized defence, if it is at least somewhat effective against most predators. An example of a generalized antipredator response is reduced prey activity (along with hiding in refuge), which might generally reduce predator encounter rates. Prey that rely on more generalized antipredator behaviours may be more likely to respond effectively to a novel predator.

Parallel issues arise with other forms of HIREC. For example, naïve herbivores may recognize that novel plants are a potential resource, but this is only the first step in adopting the new host. Novel plants may lack deterrents and present chemical attractants that induce naïve adult herbivores to oviposit on them, but whether the herbivore successfully adopts the new plant depends on whether the new host also has the correct attractants (and lacks deterrents) that induce larvae to feed, whether larvae have the correct physiology and biochemistry to thrive on the new plant, and whether the new plant also provides safety and shelter from enemies and abiotic stressors. In essence, successful use of new hosts requires both the recognition that the new plants are potential hosts and a positive 'preference–performance correlation' (Bossart 2003). Herbivores often exhibit a positive preference–performance correlation with plants from their evolutionary past (Sih and Christensen 2001), but we would not be surprised to see mismatches – evolutionary traps – with novel hosts. As a generality, generalist herbivores might be likely to inappropriately use novel plants that they cannot handle, while most herbivores, particularly specialists, might often ignore plants on which they can, in fact, thrive.

Learning

Up to this point, our discussion has focused on variation in immediate behavioural responses to novel situations. Even if animals do not respond well immediately to a novel situation, as long as they survive the initial exposure, they have the opportunity to learn and thus improve their ability to cope with HIREC. Virtually all animal species can learn, that is, change their patterns of response to external cues through experience. Many studies have shown that learning allows individuals to identify new food sources (Galef 1988) and new predators (Brown and Chivers 2005), to differentiate suitable from nonsuitable habitats or mates (Dugatkin and Godin 1992) and even to adjust their phenology (Grieco et al. 2002). Hence, the ability of species to adjust their behaviour under new environmental conditions will greatly affect their success.

Learning related to dangerous and potentially lethal situations (learned predator recognition and conditioned taste aversion) is widespread and usually highly efficient (one-time learning) (Garcia et al. 1966; Griffin 2004; Ferrari et al. 2010b). Such learning may allow enhanced recognition of many potential predators and noxious food items to avoid in the future. The downside, however, is that these learned responses are often generalized to similar predators or food items, potentially resulting in time wasted avoiding nondangerous stimuli or in lost opportunities to use valuable resources. Another point to consider with this type of learning is that, while it allows for the recognition of novel stimuli, it does not necessarily provide any education on how to respond to them (Sih et al. 2010). For example, most of the literature on antipredator responses of prey to novel predators has focused on the ability of prey to learn to recognize novel predators, but very little is known about the ability of these individuals to successfully avoid or evade predators. Learning to recognize an exotic predator is good, but not enough unless it is also paired with learning an effective way to avoid, escape or survive a predatory encounter. Whether this disconnect represents a research bias or an actual lack of connection between learning (information input) and behavioural repertoire (behavioural output) is unknown.
Learning through trial and error (associative learning, operant conditioning, peak shifts) and problem solving, often used in a foraging context, is costly in time and energy, but necessary for the discovery of new, locally adaptive behaviours (Boyd and Richerson 1996). However, not all learning mechanisms allow for ongoing improvement within an individual lifetime. Imprinting, for example, often involves learned preferences that are acquired early in life, with no further adjustment later in life. In that case, trial-and-error adjustments can take place over multiple generations, but not during an individual's lifetime. This type of learning seems unlikely to allow species to adapt well to rapidly changing, novel environmental conditions. For example, imprinting often forms the basis of crucial behaviours such as the ability to distinguish conspecifics from heterospecifics. While not all species rely on imprinting for mate/conspecific recognition, some certainly do (e.g. many birds). The level of sophistication of the cues used for conspecific recognition should reflect the amount of selection for reproductive isolating mechanisms experienced by the species. Species that have evolved in a low-biodiversity environment may use general conspecific cues, which should be effective as long as no other species possessing similar characteristics are encountered in the habitat. However, recent human-induced range shifts or invasions have allowed new species to co-occur, and this has sometimes resulted in hybridization and biodiversity loss, as was the case between introduced mallards and the native Hawaiian duck (Rhymer and Simberloff 1996).

Learning to ignore novel stimuli – habituating to novel yet nonthreatening cues – can also play a major role in determining which species can adapt to HIREC. Human disturbance associated, for example, with urbanization or eco-tourism is a well-known source of stress that can lead to decreased fitness through reduced foraging, nest abandonment or decreased parental care. While some species have learned to ignore humans, others do not seem to habituate to increased human disturbance, which leads to dramatic decreases in fitness (Kerley et al. 2002; Thomas et al. 2003; Yasue 2005).

While individual learning allows individuals to improve their responses to novel environmental conditions, the population as a whole may benefit more from individual discovery if a new behaviour is transmitted horizontally to other conspecifics and vertically to the next generation. Cultural transmission (e.g. social learning) allows for the spread of a new behaviour or strategy at lower cost, assuming the learned behaviour is adaptive and the learner is in fact properly copying the tutor (Galef 1988, 2003). If environmental conditions change rapidly, game theory (Boyd and Richerson 1996) predicts that the most successful populations will be those that can 'inherit acquired information' through a mix of both individual learning – maintaining a source of new, locally adaptive behaviours – and cultural transmission. Species with overlapping generations have the ability to transfer acquired information vertically, while species with discrete generations do not. Thus, more social species with overlapping generations and parental care [particularly with parent–offspring teaching (Caro and Hauser 1992)] might respond better to HIREC than less social species with discrete generations.
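The game-theoretic logic can be illustrated with a toy model (ours; the Boyd and Richerson models are richer, and all parameter values here are invented). Individual learners always acquire the currently correct behaviour but pay a learning cost; social learners cheaply copy a random member of the previous generation, so their information lags behind a changing environment, and the value of copying falls as copiers become common:

```python
# Toy model of individual versus social learning in a changing
# environment (illustrative only; parameter values are invented).
# q = fraction of the population behaving correctly. Each generation,
# the environment changes with probability u; individual learners are
# always correct (payoff b - c); social learners copy a random member
# of the previous generation (payoff b if that behaviour still fits).

def equilibrium_payoffs(s, u, b=1.0, c=0.3):
    """s: fraction of social learners; u: environmental change rate."""
    q = 1.0
    for _ in range(1000):                       # iterate to steady state
        q = (1 - s) + s * (1 - u) * q
    return b - c, b * (1 - u) * q               # (individual, social)

for u in (0.05, 0.5):
    for s in (0.2, 0.8):
        ind, soc = equilibrium_payoffs(s, u)
        print(f"change rate {u:.2f}, social fraction {s:.1f}: "
              f"individual {ind:.2f}, social {soc:.2f}")
```

With these assumed values, social learning outperforms individual learning when the environment is stable (u = 0.05), but its payoff drops both as change accelerates and as social learners become common, which is why a mix of the two strategies, weighted towards individual learning under rapid change, is expected.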
The effect of evolutionary history on learning

Evolutionary history has shaped learning not only in the sense that some animals have evolved to be generally better at learning than others but also in the sense that animals have evolved to learn more readily in some situations than others and have evolved to learn some specific tasks or associations more easily than others. Habitat heterogeneity appears to play an important role in selecting for such learning abilities. In highly variable environments, new conditions may call for new locally adaptive behaviours, and species having the best ability to find those locally adaptive solutions will be the ones most likely to survive and thrive in these altered environments. HIREC can be seen as a new source of heterogeneity and challenge for species. Thorndike (1935) pointed out that trial‐and‐error learning occurred fastest when animals were motivated, prepared to learn and paying attention to the relevant cues, and identified the importance of biologically prepared learning. This reflects the notion that all types of learning fit onto a preparedness continuum, ranging from prepared learning (associations that animals are predisposed to acquire) to contraprepared learning (associations that are difficult to acquire). This bias in the learning abilities of different species is a direct result of their evolutionary history ( Seligman and Hager 1972 ) and explains the inability of some species to learn to respond to cues presented in an evolutionarily novel context. For example, many migratory species have the ability to spatially shift their habitat preference if local environmental conditions are not optimal, but will rarely shift their timing of migration. Because of this predisposition to respond to spatial, and not temporal, cues, responses to climate change may be observed as a spatial shift (e.g. breeding grounds shifting poleward) rather than a temporal shift (e.g. breeding in the same place but 2 weeks earlier). Intuitively, we might think that HIREC should favour the evolution of increased learning; however, the evolutionary forces behind selection for plastic learning are complex. For example, Grieco et al. (2002) showed that blue tits learned the seasonal timing of food peaks and adjusted their breeding accordingly. However, if they were provided with earlier or later food peaks, they laid their eggs earlier or later, respectively, the following year. This adjustment is favoured only if the conditions from one year hold for the following year (e.g. a warm year is followed by another warm year). If this is not the case (if a warm year is followed by a cold year), animals may learn and shift their phenotype inappropriately. Visser (2008) argues that learning of this sort has evolved as a response to spatial variation, where there is strong year‐to‐year consistency. Two general points about evolutionary history and learning are that: (i) animals that evolved to learn and adjust their behaviour in response to predictable environments will likely exhibit inappropriate adjustments when exposed to environments where cues do not predict future conditions well; and (ii) animals that evolved in inconsistent, unpredictable conditions will likely not learn and adjust using environmental cues, even if they are now in environments where learning should be favoured. In addition, even if conditions are predictable enough to favour learning, if collecting information is too costly (e.g. if sampling is dangerous; Sih 1992 ), then learning might not be favoured.

Multiple stressors, multiple traits and multiple responses

Above, we examined aspects of HIREC one at a time. However, in reality, organisms usually face the substantially more complex challenge of coping simultaneously with multiple stressors associated with multiple aspects of HIREC. Species declines are often caused by the combined negative impacts of these multiple stressors. Amphibian declines, for example, have been associated with habitat loss, a barrage of novel enemies (diseases, predators and parasites), contaminants and climate change ( Blaustein and Kiesecker 2002 ; Blaustein and Bancroft 2007 ). Even if tadpoles exhibit adaptive responses to one novel stressor, they may decline because of poor performance relative to another (e.g. Rogell et al. 2009 ). Worse yet, the various novel anthropogenic stressors can have negative synergistic effects with each other and with natural stressors. For example, while many tadpoles have adaptations to cope with invertebrate predators and can cope physiologically with low concentrations of pesticides, when they are exposed to both, they show particularly poor survival ( Relyea and Mills 2001 ; Rohr et al. 2006 ). Along similar lines, an adaptive response to one stressor can expose organisms to another stressor that then causes declines. In response to exotic goats that cause habitat degradation, a species of lark has shifted to feeding in human habitats where it suffers increased exposure to disease ( Carrete et al. 2009 ). That is, in many cases, animals face conflicting demands from multiple stressors. To explain why some organisms cope better than others with multiple stressors, we thus need a better understanding of multiple traits and responses to the different stressors and how these multiple responses interact. Ghalambor et al. (2007) called this the mosaic nature of plasticity and evolution. For example, for salmon, an adaptive response to climate change requires plasticity in timing of migration, spawning, egg‐juvenile growth rates, thermal tolerance and disease resistance, with a possibility of conflicting selection pressures in different life history stages ( Crozier et al. 2008 ). Given that evolution might have shaped an adaptive, integrated, multi‐trait response to multiple natural stressors, when do we expect organisms to be able to co‐opt their evolved integrated phenotype to cope well with a novel mix of old and novel stressors? In particular, while animals might have evolved to do a good job of using a mix of responses to balance conflicting demands in their natural habitats ( Sih et al. 1998 ; Relyea 2001 ), their new challenge is to be able to recognize and evaluate cues from multiple stressors to produce an integrated, multi‐trait response that balances these multiple, often conflicting, demands. To date, there has been little work on this more complex response to HIREC.

Behaviour, future evolution and effects on species persistence

Up to this point, we have focused on immediate, short‐term behavioural responses to HIREC and their role in allowing species to get through the initial crunch. What about future evolution (of behaviour and other traits) and long‐term species persistence? With regard to the general issue of the evolution of plastic responses to environmental change, Ghalambor et al. (2007) outlined several possibilities.
If organisms immediately exhibit optimal behavioural responses to HIREC, then there should be stabilizing selection on behaviour and no need for further adaptive evolution. Alternatively, many animals appear to show adequate behavioural responses to HIREC, but with room for further directional selection and evolution. Models of this situation (e.g. Price et al. 2003 ) indicate that intermediate levels of plasticity are often best for avoiding long‐term extinction. This form of imperfect ‘pre‐adaptive’ plasticity has also been shown to be important empirically, as when Yeh and Price (2004) demonstrated that plastic changes in reproductive effort contribute substantially to maintaining positive population growth rates. Yet, other animals exhibit essentially maladaptive responses to HIREC (e.g. preferences for toxic habitats or food) that place species in danger of extinction. In these situations, populations presumably experience strong selection for improving their behavioural response to HIREC, but are also at risk of going extinct before they evolve adaptive responses. What do empirical data say about contemporary evolution of improved behavioural responses to HIREC? While there is a reasonably extensive literature on contemporary evolution, much of it in response to HIREC ( Strauss et al. 2006 ; Hendry et al. 2011 ; Lankau et al. 2011 ), relatively few studies have focused on the evolution of behavioural responses to HIREC. With regard to the more general issue of evolution of plasticity in response to environmental change, Crispo et al. (2010) recently reviewed 20 studies and found that different taxa have evolved either increases or decreases in plasticity in response to HIREC, indicating that the links between environmental change and evolutionary response are context‐dependent. Our focus, however, is not just on evolution of the degree of plasticity, but in particular on the pattern of plasticity – including both immediate behavioural responses to HIREC and longer‐term evolutionary changes in behaviour. In the context of artificial selection and domestication, it is well known that human‐induced changes in selection regimes can drive rapid behavioural evolution – e.g. rapid evolution of increased tameness, boldness or aggressiveness ( Price 1984 ; Conrad et al. in press ). More studies, however, are needed to look at how selection caused by other, inadvertent aspects of HIREC shapes behavioural evolution. In particular, despite some published theory (e.g. Price et al. 2003 ) and empirical examples ( Schlaepfer et al. 2005 ; Visser 2008 ), more work is needed on the role of plasticity and plasticity evolution in shaping the overall response to HIREC that might facilitate long‐term species persistence in the face of HIREC. Several management strategies have been suggested for enhancing species persistence while they evolve in response to HIREC ( Schlaepfer et al. 2005 ). One possibility is to mitigate the negative impact of HIREC with spatial or temporal refugia that allow organisms to more effectively hide from or avoid novel environmental stressors. This could allow partially effective responses to evolve and also extend the time to extinction. Alternatively, managers might implement separate actions to increase population growth that could help stave off extinction and facilitate evolution (see Lankau et al. 2011 ).
Finally, genetically or behaviourally savvy individuals from other populations could be introduced to provide useful phenotypic variants. There will rarely be a single prescription that works across contexts: species that have low genetic variation (a common problem after a population bottleneck associated with HIREC) might rely more heavily on plasticity to survive HIREC ( Dybdahl and Kane 2005 ; Strauss et al. 2006 ), while other species may be able to wait for evolution to run its course. Beyond the evolution of behavioural responses to HIREC per se , behaviour can also shape the overall evolutionary response to HIREC – evolution of both behaviour and other traits. One well‐recognized possibility is that ‘adaptive’ behaviour can compensate for other suboptimal traits. Even if animals are not well defended morphologically against predators, they may be able to compensate by hiding effectively, resulting in little or no selection favouring morphological evolutionary responses to predation ( DeWitt et al. 1999 ). Adaptive compensatory behaviour can thus slow the evolution of other traits in response to HIREC. Alternatively, behavioural compensation (and other forms of plasticity) can enhance the evolution of other traits via the Baldwin effect ( Baldwin 1896 ; Wcislo 1989 ; Crispo 2007 ), where adaptive behaviour compensates for other nonadaptive or maladaptive traits enough to allow these other traits to evolve. In theory, the process of genetic assimilation can then convert nonheritable plasticity into heritable variation that allows further evolution ( Price et al. 2003 ). Behavioural plasticity can not only allow species to persist better in their current (changed) habitat but can also facilitate colonization of new habitats, that is, it can enhance gene flow ( Crispo 2008 ), which then has further evolutionary effects. The colonization of new areas or niches can then result in either new opportunities for speciation ( West‐Eberhard 2003 ) or increased hybridization and breakdown of existing species barriers ( Taylor et al. 2006 ). Given that initial behaviour can produce such diverse and important evolutionary outcomes, there is clearly a need for a better, ideally predictive understanding of variation in behavioural responses to HIREC. The processes we have described here are often quite complex, and exploring them in individual systems can require multiple detailed studies. One good example is Visser’s (2008) discussion of the evolution of plastic responses to climate change, which relied on insights and data from multiple long‐term research programs studying different bird species and their environments, including temperature, diet, pedigree and other data. The pay‐off for such studies can be great, however; Visser was able to assess the relative importance of plasticity, selection, maternal effects and immigration in the systems he studies and to put together a cohesive picture of their likely responses to future climate change. This level of detail takes work, but it can improve both the confidence and the sophistication of our predictions. Evolution and species persistence can be tightly linked. Strong selection, in the sense of high mortality, can drive rapid evolution, but, as noted previously, it can also rapidly drive species towards extinction. In this scenario, common in the modern world, species will persist only if adaptive evolution is fast enough to save the species from extinction. Existing models examine factors that affect the likelihood that this will occur (e.g.
Gomulkiewicz and Holt 1995 ). Recent models of these joint ecological and evolutionary dynamics have finally included plasticity ( Chevin and Lande 2009 ; Chevin et al. 2010 ). Future models of these dynamics incorporating more realistic behaviours and behavioural evolution should prove insightful, while further discussions with decision‐makers and applied ecologists (e.g. Schlaepfer et al. 2010 ) will contribute to our ability to influence these processes on the ground.

Concluding remarks

Our overall goal is to enhance our understanding of how evolutionary history has shaped animal sensory/cognitive systems to better predict which species will have problems coping with specific aspects of HIREC and, conversely, which have the potential to become pests. For any given system, studies should: (i) focus on key limiting aspects of HIREC; (ii) identify key behaviours that explain the animal’s ability to cope with those novel, limiting factors ( Sih and Gleeson 1995 ); (iii) analyse the sensory/cognitive ecology underlying the key behaviours; and (iv) define how evolutionary history might explain variation among and within species in both the key behaviours and their underlying sensory/cognitive ecology. Further development of a framework like detection theory is needed to give us quantitative tools for relating cues and cue use to responses to novel conditions. More comparative studies are needed to test ideas on how evolutionary history might shape sensory/cognitive ecology, learning and the resulting responses to HIREC. With regard to future evolution in response to HIREC, more data are needed on genetic variation within and between populations and selection on key behavioural responses to HIREC. Ideally, this would include information on genetic correlations with and selection on other traits in a multi‐stressor context. The goal would be to understand how the overall suite of traits was shaped by past evolution, how that suite explains both initial ability to cope with multiple aspects of HIREC and the ongoing evolution of an integrated response to ongoing change. Finally, more models and empirical studies are needed to better understand both how behavioural plasticity influences evolution and how evolution might help species escape extinction. Clearly, all of these are difficult intellectual challenges for evolutionary behavioural ecologists; however, they represent exciting, major contributions that the field can offer for applied evolution and ecology in the modern world.

Given the number of large biotic and abiotic fluctuations and changes the biosphere has experienced since the emergence of life, and given that all living organisms are descended from ancestors that weathered these changes, we can expect some resilience to change. The important questions are quantitative: how rapidly are humans altering the environment, and which species are likely to cope?

Figure 1: Past environments provide the evolutionary history that shapes sensory and cognitive processes controlling behaviour, as well as other traits and genetic variation. The fit of behaviour and other traits to novel environments (which might match or mismatch past environments) influences individual fitness, which governs population performance. Variation in fitness and genetic variation drive evolution that feeds back to determine future sensory and cognitive processes, behaviour, other traits and genetic variation. These, in turn, loop back to influence future fitness and population performance.

Of the traits that influence success in response to HIREC, behavioural plasticity plays a particularly important role. A meta‐analysis of more than 3000 rates of recent phenotypic change suggested that most of the phenotypic changes observed in response to HIREC involve phenotypic plasticity rather than immediate genetic evolution ( Hendry et al. 2008 ). Furthermore, behaviour appears to be important in explaining variation in species’ abilities to cope well with HIREC, with maladaptive behaviours, such as consumption of novel toxic prey or failure to avoid novel predators, implicated in species declines ( Schlaepfer et al. 2002, 2005, 2010 ; Buchholz 2007 ; Sih et al. 2010 ) and more appropriate behavioural responses facilitating species invasions ( Holway and Suarez 1999 ; Sih et al. 2010 ). Behaviour also interacts with evolution in two important ways. First, following the framework in Fig. 1 , key behaviours have presumably been shaped by past evolution. Second, suitable behavioural plasticity can allow and shape future evolution of both the behaviour itself and other adaptive responses (e.g. life history or morphological responses) to HIREC ( Baldwin 1896 ; Wcislo 1989 ; West‐Eberhard 2003 ; Crispo 2007 ; Ghalambor et al. 2007 ). In essence, behaviour can facilitate the move from one adaptive peak (for past environments) to another (for new environments) or the ability of a species to track a shifting adaptive landscape ( Price et al. 2003 ; West‐Eberhard 2003 ). We next review examples of behavioural responses to some main categories of HIREC. We do not attempt to provide a comprehensive review of relevant literature. Instead, our goals in the next section are to: (i) introduce readers to some main issues and patterns; (ii) organize the diverse set of issues associated with HIREC into four main categories; and (iii) highlight the fact that for each type of HIREC, some organisms are not coping well, while others are thriving. After reviewing behavioural responses to HIREC, we then present a conceptual overview on how we might explain the variation in response to HIREC.

Behavioural responses to HIREC

Behavioural responses to HIREC fall into several main categories: (i) avoiding or coping with novel enemies (e.g. predators, parasites, diseases; including humans); (ii) adopting and utilizing novel resources or habitats; (iii) avoiding or coping with novel abiotic stressors (e.g. pollutants); and (iv) adjusting to changing spatiotemporal conditions (e.g. habitat fragmentation, climate change).
Below, we briefly review some examples of each with an emphasis on the variation in response (see Tuomainen and Candolin in press for a more comprehensive review of this literature). Some individuals or species exhibit maladaptive responses that can have serious detrimental consequences, while others show interestingly ‘adaptive’ responses despite the fact that HIREC is putting them into evolutionarily novel conditions.

Risks and resources

The spread of exotic species has exposed many animals to new enemies – new predators, parasites or diseases ( Lockwood et al. 2007 ). As with many other changes discussed here, humans did not invent this form of change, but globalization means that species can invade more rapidly and from more distant sites than before ( Hulme 2009 ). In some cases, the key new species is humans themselves. Animals sometimes show strikingly maladaptive responses to these novel enemies and thus suffer heavy mortality ( Cox and Lima 2006 ; Sih et al. 2010 ). Notable examples include the lack of behavioural responses of island species (some of which are now extinct) to humans ( Fritts and Rodda 1998 ; Cox and Lima 2006 ), of marsupial prey to foxes in Australia ( Kinnear et al. 2002 ), of birds and other vertebrate prey to brown tree snakes ( Wiles et al. 2003 ) and of various aquatic prey to introduced predatory fish ( Knapp et al. 2001 ). In contrast, other prey have shown either innate recognition with suitable responses to novel predators (e.g. Peluci et al. 2008 ; Epp and Gabor 2008 ; Rehage et al. 2009 ) or rapid adaptive learning about novel predators including humans ( Knight et al. 1987 ). Similarly, some animals do not appear to respond appropriately to exotic diseases (e.g. do not avoid diseased conspecifics and do not reduce aggregation that can elevate transmission rates, Han et al. 2008 ), while other animals do change their behaviour in ways that reduce disease transmission ( Behringer et al. 2006 ). The aforementioned examples involve inappropriately weak responses to dangerous enemies. The flip side also applies – animals can over‐respond to novel organisms that are not actually particularly dangerous. For example, many species apparently treat humans as potential predators ( Frid and Dill 2002 ; Beale and Monaghan 2004 ) even when humans pose little risk. Examples include numerous species probably over‐avoiding habitats used by humans, including areas used for eco‐tourism and human recreation ( Brown and Stevens 1997 ; Gander and Ingold 1997 ; Dyck and Baydack 2004 ; Gilroy and Sutherland 2007 ; Andersen and Aaars 2008 ). Still other species fail to exploit suitable habitat after humans extirpate their predators ( Blumstein 2006 ). A related phenomenon is the adoption of novel resources provided directly or indirectly by humans. For example, crops or ornamental plants represent a massive influx of often high‐quality food for herbivores. Notably, while some herbivores shift to utilize these novel plants and thus in some cases become pests, many other potential herbivores (e.g. herbivorous insects found in the same areas, often feeding on related plants) do not use the new hosts ( Samways and Lockwood 1998 ; Bossart 2003 ). Similarly, some gulls use garbage dumps, and some birds and squirrels avidly visit bird feeders ( McKinney 2002 ), while others do not. The adoption of new hosts by some parasites, but not others, can depend on differences in behavioural plasticity (e.g. Bush 2009 ).
Of course, some novel, apparent resources are actually toxic and should be avoided. For example, some predators are suffering negative impacts as a result of consuming exotic, toxic cane toads in Australia ( Hagman et al. 2009 ). These negative impacts can reflect not just an inappropriate tendency to consume exotic, toxic prey, but also inappropriate behavioural means of handling toxic prey. Adders, for example, are well adapted to handling toxic frogs by waiting for their toxins to degrade; however, this tactic is ineffective against cane toads because their toxins degrade slowly ( Hagman et al. 2009 ).

Habitat loss and fragmentation

Habitat loss (habitat degradation, fragmentation), including urbanization/suburbanization, deforestation and conversion of wildlands to agriculture, is likely the most important cause of species declines and extinctions ( Pimm and Raven 2000 ). Besides reducing habitat availability per se , habitat loss and fragmentation also cascade through species interactions, leading to increased competition ( MacDonald et al. 2004 ) or increased predation ( Schlaepfer et al. 2002 ) in the remaining habitat. Sources of fragmentation (e.g. roads – Laurance et al. 2008 ; Shepard et al. 2008 ) also impose barriers to adaptive movement through the landscape, which can reduce fitness. It is worth noting, however, that except when the habitat change is catastrophic (e.g. replacing a forest with a shopping mall), many forms of habitat change can actually provide new habitat for some taxa. Adoption of this new habitat (e.g. moving into urban/suburban areas) can be critical for long‐term species persistence. For example, use of human‐created stormwater ponds or artificial wetlands can be crucial for persistence of amphibians ( Brand and Snodgrass 2010 ). For species willing to move in with us, urban/suburban habitat often offers reduced predation risk and high food availability ( Gilroy and Sutherland 2007 ). As noted previously, crop fields represent a bonanza of high‐quality food. Given that habitat change can often be either good or bad for any given species depending on how it responds to the novel habitat, a key issue is to understand the variation in response. Why have some species become urbanized, while others have not ( Blair 2001 ; Sol et al. 2002 ; Hamer and McDonnell 2008 )?

Pollutants

We use the term ‘pollutant’ in a broad sense that includes chemical contaminants, but also changes in other abiotic conditions such as noise or light levels. Each of these has been shown to have adverse effects on organisms, often mediated by behaviour. While much of the work on chemical contaminants has focused on lethal effects or substantial developmental disruptions associated with relatively high chemical concentrations, much lower concentrations of metals or pesticides and other endocrine disruptors can alter a diverse array of behaviours, including predator–prey behaviours, mating and social behaviours, communication and learning ( Clotfelter et al. 2004 ; Leduc et al. 2004 ; Zala and Penn 2004 ; Rohr et al. 2004 ; Fisher et al. 2006 ). Indeed, just as predators have major impacts on communities both through predation per se and nonconsumptive effects (e.g. because of costly shifts in prey behaviour; Preisser et al. 2005 ), chemical contaminants have impacts both through direct lethal effects and via changes in behaviour ( Rohr et al. 2006 ). Species (and genotypes) differ in their ability to cope physiologically (e.g. differences in LC50s or LD50s) and behaviourally (e.g.
Andres et al. 2000 ; McComb et al. 2008 ). Human activities often change environmental light levels (i.e., light pollution). On land, artificial light can cause sea turtle hatchlings to fatally move inland rather than to the ocean ( Tuxbury and Salmon 2005 ) and can result in reduced activity (and thus reduced foraging success) at night ( Kotler 1984 ; Baker and Richardson 2006 ), or shifts in the timing of activity ( Miller 2006 ). In the water, human activities often cause increased turbidity, which can reduce feeding rates ( Stuart‐Smith et al. 2004 ; Ljunggren and Sandstrom 2007 ) and disrupt mating patterns ( Seehausen et al. 1997 ; Candolin 2009 ). Importantly, this anthropogenic change can have either negative or positive impacts; for example, increased turbidity can provide enhanced safety for prey ( Vogel and Beauchamp 1999 ; Ferrari et al. 2010a ). Finally, humans make noise, which can have detrimental effects on animals via, for example, disruptions in acoustic communication. Notably, some (but not all) animals change their calls or calling patterns in an apparent attempt to maintain effective communication despite the disruptive background noise ( Miller et al. 2000 ; Slabberkoorn and Peet 2003 ; Foote et al. 2004 ; Sun and Narins 2005 ).

Climate change and shifts in timing

Among the many ways that global warming can impact organisms is the need to shift the seasonal timing of major life history events. With global warming, spring arrives earlier than before. Accordingly, many animals have shifted to begin breeding earlier, often by arriving at breeding grounds earlier in the spring (e.g. Forister and Shapiro 2003 ; Berteaux et al. 2004 ; Both et al. 2004 ; Pulido 2007 ; Lyon et al. 2008 ). Notably, for many taxa, these changes in life history timing are so rapid that they almost certainly represent primarily behavioural plasticity rather than evolutionary change ( Grieco et al. 2002 ; Gienapp et al. 2008 ). However, as with the other responses to HIREC discussed previously, not all animals are responding with suitable shifts. For example, studies have documented variation both between and within species in propensity or rate of shift in seasonal timing of reproduction ( Lyon et al. 2008 ; Reed et al. 2009 ). Animals that have not made an appropriate shift can suffer substantial population declines ( Both et al. 2006 ; Ludwig et al. 2006 ). Another notable consequence of climate change is the effect of warming on sex ratios in taxa that exhibit temperature‐dependent sex determination. In many reptiles, global warming could result in highly biased sex ratios that could negatively influence species persistence (e.g. Mitchell et al. 2008 ). A few studies have found that animals have responded to this risk by shifting their nest site choice (e.g. by preferring shaded nest sites in warmer regions; Doody et al. 2006 ). However, it is unclear whether other species are changing maternal behaviour rapidly enough to keep up with climate change ( Morjan 2003 ). The aforementioned examples illustrate the striking variation in the way species respond to human‐altered environments. The ability to display appropriate behaviours is crucial in determining immediate fitness and perhaps short‐term persistence in the face of HIREC. The key issue then is to better understand why some animals behave appropriately under novel conditions, while others do not.
Thus, in the following sections, we (i) suggest a mechanistic, sensory ecological approach for studying and understanding variation in behavioural responses to HIREC, emphasizing evolutionary history and especially the match or mismatch between past and new environments; (ii) present a theoretical framework (detection theory) for making detailed predictions about how organisms should respond to novel stimuli and for evaluating them experimentally; (iii) discuss how learning can affect responses to HIREC when organisms’ initial behaviours are inappropriate and note the role of evolutionary history in this aspect of environmental change; and (iv) note some complexities that might constrain the ability of species (or individuals) to exhibit suitable responses to HIREC. Finally, we discuss how behavioural responses to HIREC might affect longer‐term evolution and how both behaviour and evolution might influence long‐term species persistence.

Explaining variation in behavioural responses to HIREC

Conventional wisdom holds that behavioural flexibility per se helps species cope with HIREC. For example, the literature on traits associated with invasive species (which are thriving with HIREC) suggests that successful invaders might often be generalists with high phenotypic plasticity and behavioural flexibility ( Lodge 1993 ; Richards et al. 2006 ; Ghalambor et al. 2007 ). Indeed, experimental studies comparing closely related taxa that are invasive versus restricted in range have found evidence that invasive species are more flexible in their response to novel foods, predators or competitors (e.g. Rehage and Sih 2004 ; Rehage et al. 2005 ). In addition, broad, comparative analyses show that invasiveness in birds and mammals appears to be related to behavioural flexibility that is, in turn, associated with larger brains ( Sol 2005 ; Sol et al. 2008 ). Even more broadly, species or individuals that respond well to HIREC might either have a ‘personality type’ or behavioural syndrome (flexible, exploratory, bold, or aggressive behavioural tendencies) that makes them better suited to coping with novel conditions ( Sih et al. 2004 ; Cote et al. in press ) or might exhibit greater within‐species variation in behavioural tendencies that allows the species to cope well with environmental variation ( Fogarty et al. in press ). Conversely, species with low flexibility (e.g. specialists) often appear to be highly vulnerable to HIREC ( Colles et al. 2009 ). Another common idea is that an organism’s evolutionary history plays an important role in explaining its response to novel conditions. At one level, differences between species in behavioural flexibility (or in plasticity, in general) presumably depend on the species’ evolutionary history. Species that have evolved with high spatiotemporal variability should be more likely to be plastic ( Mayr 1974 ; Walther et al. 2002 ; Gabriel et al. 2005 ). With regard to particular traits and particular aspects of the organism’s evolutionary history, a key might be the match (or mismatch) between the new environment and the organisms’ traits that were shaped by past selection ( Ghalambor et al. 2007 ; Sih et al. 2010 ). For example, Blair (2001) suggested that (i) ‘urban avoiders’ that are sensitive to human disturbance might be species from habitats that are least like urban areas (e.g.
old forests), (ii) ‘urban adapters’ are species that previously used forest edges along with associated open areas and have thus evolved the flexibility to use a diverse range of habitats, and (iii) ‘urban exploiters’ tend to be generalist omnivores that are ‘pre‐adapted’ to live in human structures (e.g. rats, mice, pigeons). Along parallel lines, Fahrig (2007) suggested that understanding patterns of patch quality in a species’ evolutionary past could help us predict their abilities to cope with habitat fragmentation. Species that evolved with low‐risk matrix habitats (areas between main habitats) move readily between patches and are thus susceptible to high mortality while moving through matrix habitat that is now much more risky owing to habitat degradation. Conversely, species that evolved with high‐risk matrix habitats avoid moving between habitat patches and are thus susceptible to low immigration/colonization success in newly fragmented habitats. A key challenge for using evolutionary history to explain behavioural responses to HIREC, however, is the fundamental difficulty of predicting how organisms should respond to novel conditions that they have not seen in their evolutionary past. One simple option for models of plasticity in ‘extreme environments’ is to assume linear extrapolation of the reaction norm ( Chevin and Lande 2009 ). Alternatively, Ghalambor et al. (2007) noted that, while a reaction norm could be held taut like a string by selection across the range of current and relevant past environments, there is little more than pleiotropy holding it in place outside this range. Because of this lack of selection, we have little a priori basis for predicting what phenotype will be expressed in novel conditions. While this view provides the insight that many systems might harbour hidden, ‘pre‐adaptive’ genetic variation that can facilitate future adaptive evolution in response to HIREC, it makes few concrete predictions about why some species exhibit immediate appropriate behavioural responses to HIREC, while others do not. For this reason, we suggest that it will be valuable to apply a more mechanistic approach incorporating sensory and cognitive ecology of behavioural responses.

The sensory/cognitive ecology of behavioural responses to HIREC

We organize our discussion using the common approach in behavioural ecology of analysing behavioural processes as the outcome of three sequential stages: encounter (stage 1), detect, recognize and evaluate (stage 2), and respond (stage 3). Differences in overall response to HIREC can be understood by looking at variation (among individuals, populations or species) in exposure and response to HIREC in each of these three stages. While some organisms are not suffering negative impacts from HIREC because, by chance or choice, they simply do not encounter it, we focus on organisms that are, in fact, exposed to HIREC. What factors explain why some understand (detect, recognize and suitably evaluate) novel conditions, while others do not? For those that appear to recognize a novel stressor, why do some respond appropriately while others do not? Why do some prey recognize and respond appropriately to exotic predators while other prey do not? Why do some consumers recognize and successfully utilize new resources while others do not? Finally, how and why have some organisms, but not others, successfully evaluated climate change and shifted their phenologies accordingly?
The behavioural approach that we espouse draws from the literature on evolutionary traps ( Schlaepfer et al. 2002, 2005, 2010 ; Robertson and Hutto 2006 ; Gilroy and Sutherland 2007 ; Part et al. 2007 ). The basic idea is that organisms have evolved cue–response relationships that are adaptive in their natural environments. Following Cosmides and Tooby (1987) , Schlaepfer et al. (2005, 2010) refer to these cue–response, decision‐making rules as Darwinian algorithms. Animals use these cue–response algorithms to evaluate habitat quality, food quality, danger, or the appropriate time to begin breeding and respond accordingly. A problem arises, however, if under novel conditions, the previously adaptive cue–response relationship now results in a misevaluation of the environment or an inappropriate response. An ecological trap is the particular case of an evolutionary trap that involves habitat use – where organisms choose poor‐quality habitats (sinks) over better available habitats because of errors in the evaluation of habitat quality. Many examples of traps involve maladaptive habitat use (see Schlaepfer et al. 2002 ). Grassland birds nest in pastures that get mowed before chicks can fledge. Vultures are electrocuted when they perch on electric lines. Insects of various sorts attempt to oviposit on concrete or glass buildings that have visual properties that resemble those of water ( Kriska et al. 2008 ). Sea turtle hatchlings move away from their suitable ocean habitat when they follow human lights at night ( Tuxbury and Salmon 2005 ). Other examples involve attraction to inappropriate foods. Sea turtles die when they consume plastic refuse. Humans crave fatty foods. Yet other examples involve attraction to inappropriate mates. A classic, widely seen photograph shows a male beetle with its genitalia extended, apparently trapped into attempting to mate with a brown beer bottle. Using a similar principle, biological control programs use sex pheromones as an evolutionary trap to kill pest insects. Other organisms, however, respond appropriately to novel cues. Our goal is to develop a conceptual framework for thinking about how evolutionary history produces cue–response relationships that explain relative ability to cope with HIREC. We split our analysis into the two relevant stages stated above (assuming that organisms have indeed encountered HIREC). Stage 2 involves understanding the cues that organisms use to evaluate environmental conditions, and stage 3 focuses on how organisms then respond to those cues.

Stage 2: Detecting, recognizing and evaluating novel environmental conditions

Our sensory/cognitive framework for explaining variation in how organisms evaluate a novel situation includes the following key points. Animals are more likely to respond to a ‘novel’ cue if it is similar to a cue that their ancestors responded to in the evolutionary past. We refer to this as the ‘cue similarity’ hypothesis (see Fig. 2A ). Of course, ‘similarity’ is relative: whether a particular cue elicits a response depends, among other things, on whether organisms use broad, generalized criteria ( Fig. 2C ) as opposed to more specialized, precise criteria ( Fig. 2B ) for evaluating environmental conditions. While the notion of understanding how animals ‘think about’ their environments may seem suspect to many ecologists, it is a main topic of study for animal behaviourists and is often crucial for properly understanding behaviour ( Shettleworth 2001 ; DeWaal 2008 ).
Figure 2: Influence of cue similarity, and use of general versus specialized cues, on recognition of novel cues. Shown are two‐dimensional cue spaces. E = cues produced by a stimulus from a species’ evolutionary past; N = cues produced by a novel stimulus. The circle or oval around each E is the cue space that elicits a response. (A) The new cue is similar to cues from the species’ past, and the focal species uses specific cues to elicit a response. The species recognizes the novel stimulus. (B) The new cue is not similar to cues from the past, and the species uses specific cues. The species does not recognize the novel stimulus. (C) New and past cues are dissimilar, but because the species uses general cues, it recognizes the new stimulus. (D) Prey recognition of a predator depends on how they use multiple cues. Prey could be alarmed by either cue A or cue B (above a threshold level for either) or might require both cue A and cue B to be alarmed. Adapted from Sih et al. (2010) .

To flesh out an example of the cue similarity hypothesis, consider the issue of prey responses to novel predators. Invasive, exotic predators often have major negative impacts on naïve prey apparently because these prey often exhibit weak antipredator responses ( Cox and Lima 2006 ; Sih et al. 2010 ). Other naïve prey, however, respond appropriately enough to survive encounters with predators that they have never seen before – neither within their lifetime nor in their known evolutionary past. To explain this variation in prey response to novel predators, a first, and perhaps obvious, point is that prey response to novel predators should depend on whether the novel predators present cues that are similar to cues from predators that the organisms have experienced in their evolutionary past. Although the ecologist may know, for example, that a particular predatory fish is novel, if prey already coexist with a similar predatory fish, prey will likely show adaptive antipredator responses (e.g. Ferrari et al. 2007 ). In contrast, when exotic predatory fish invade (or are released) into habitats that lack any predatory fish, prey often show little or no adaptive response. Several studies indeed show that the magnitude of prey response to cues from novel predators is proportional to the similarity of the novel predators to native predators ( Griffin et al. 2001 ; Ferrari et al. 2007 ; Stankowich and Coss 2007 ). While cue similarity might often be related to taxonomic similarity, this need not always be true. Predators that are taxonomically related might put out quite different cues. For example, some prey avoid large, actively searching predators ( Dill 1974 ; Sih 1986 ). These prey might not respond adaptively to a closely related, but exotic, smaller ambush predator. Conversely, predators that are distantly related might release similar cues. Prey that avoid large, active native predators should also respond well to even distantly related exotic predators that have a similar appearance and predation style. The ‘cue similarity’ hypothesis clearly invokes an important role for evolutionary history and the match versus mismatch between past environments and the novel situations associated with HIREC. If prey have evolved with similar predators, they are likely to recognize a taxonomically exotic predator. Indeed, if prey use generalized cues to assess risk and if they have evolved with any predators, they appear to be pre‐adapted to recognize a broad range of exotic predators ( Blumstein 2006 ).
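The logic of Fig. 2 can be expressed in a few lines of code. The sketch below treats a stimulus as a point in a two‐dimensional cue space and asks whether a novel cue falls inside the response region around a cue from the species’ evolutionary past; all coordinates, radii and thresholds are hypothetical, chosen only to illustrate the specialist/generalist contrast and the single‐cue versus multi‐cue rules of Fig. 2D.

```python
import math

# A stimulus is a point in a two-dimensional cue space (cf. Fig. 2). A prey
# responds to a novel cue if it lies within its response radius around the
# cue it evolved with: a "generalist" has a large radius, a "specialist" a
# small one. All values are hypothetical.

def responds(evolved_cue, novel_cue, response_radius):
    """True if the novel cue is similar enough to the evolved cue."""
    return math.dist(evolved_cue, novel_cue) <= response_radius

evolved = (0.0, 0.0)   # cue from a predator in the species' evolutionary past
novel = (1.5, 0.5)     # cue from an exotic predator (distance ~1.58)

print(responds(evolved, novel, response_radius=1.0))  # specialist: False
print(responds(evolved, novel, response_radius=2.0))  # generalist: True

# Fig. 2D contrasts multi-cue rules: an OR rule (respond if either cue A or
# cue B exceeds its threshold) is harder to 'trap' than an AND rule
# (respond only if both do).
def alarmed(cue_a, cue_b, thr_a=0.5, thr_b=0.5, rule="or"):
    if rule == "or":
        return cue_a > thr_a or cue_b > thr_b
    return cue_a > thr_a and cue_b > thr_b  # AND misses partial matches

print(alarmed(0.9, 0.1, rule="or"))   # True: responds to a partial match
print(alarmed(0.9, 0.1, rule="and"))  # False: 'trapped' into not responding
```

In this toy model, the AND rule corresponds to the specific, multi‐cue prey described below, which are the most vulnerable to being trapped into ignoring an exotic predator.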
Cox and Lima (2006) took this a step further, suggesting that large‐scale patterns – such as the tendency of freshwater prey to be more susceptible than terrestrial prey to negative impacts from exotic predators – may be explained by general habitat‐driven differences in their evolutionary history with predators. Many freshwater habitats are ephemeral and/or fragmented (e.g. ephemeral or isolated ponds). Prey in these habitats typically lack an evolutionary history with vertebrate predators and, thus, often show little effective response to the introduction of exotic predatory fish. In contrast, until recently, terrestrial prey have lived in less ephemeral, less fragmented habitats that have generally contained important predators. As a result, these prey more often recognize and respond well to exotic predators. Beyond cue similarity per se , prey responses to particular novel cues depend also on whether prey use broad, general cues as opposed to narrow, precise, specific cues to gauge risk. An example of a general cue is a chemical cue released by damaged conspecific (or heterospecific) prey ( Chivers and Smith 1998 ; Ferrari et al. 2010b ). Prey that use these general ‘alarm cues’ will respond to any ‘sloppy’ predators, including exotic ones. However, prey that use general cues to identify risk might also inappropriately exhibit antipredator responses to nonpredatory sources of damage. Somewhat more specific, but still quite broad, is the response of many aquatic prey to fish chemical cues (e.g. Binckley and Resetarits 2003 ; Sih et al. 2003 ). Although prey using these cues should respond to exotic predatory fish, they might also respond unnecessarily to nonpredatory fish ( Langerhans and DeWitt 2002 ). Other general cues include avoidance of any large moving animal ( Dill 1974 ; Sih 1986 ; Wisenden and Harter 2001 ) or avoidance of particular habitats even without direct predator cues ( Verdolin 2006 ). In contrast to such general cues, many prey respond to either a more specific cue (e.g. Kotler et al. 1991 ; Jedrzejewski et al. 1993 ; Thorson et al. 1998 ) or a mixture of multiple cues (e.g. simultaneous detection of chemical cues from specific predators and damaged prey – Sih 1986 ; Chivers et al. 2002 ; Schoeppner and Relyea 2005 ; Brodin et al. 2006 – or both chemical and visual cues from a specific predator – Amo et al. 2004 ). Prey that rely on more specific cues or that require both cue A and cue B to elicit an antipredator response (see Fig. 2D ) should be more vulnerable to being ‘trapped’ into not responding to an exotic predator. Notably, studies have found within‐species variation in how prey respond to the same risk‐related cues ( Sih et al. 2003 ; Brodin et al. 2006 ). Quantifying cue similarity between exotic and native predators, and the variation in the type and precision of cues used by native prey, should help explain variation in immediate responses to exotic predators. To emphasize, in this stage, we are only looking at factors that might explain whether organisms respond to novel situations or not. We will consider variation in response suitability in a subsequent section. Evolutionary history can also help us understand why some organisms use general versus specific cues. Again, we use prey responses to exotic predators as a context for explaining some general ideas ( Sih 1992 ; Sih et al. 2010 ). Prey that use more general cues are not only more likely to respond to a novel predator but also more likely to respond unnecessarily to nondangerous stimuli.
In contrast, prey that rely on more specific cues are less likely to waste time and effort with unnecessary responses, but run the risk of not responding to an actually dangerous novel predator. The balance between these competing selection pressures depends in part on their relative benefits and costs. If, in their evolutionary history, prey have had effective means of escaping attack by their native predators, then the cost of using specific cues, and thus ignoring potential danger until the last second, has been relatively low. The benefit of using specific cues should be particularly large if the costs of over‐responding to risk are high (e.g. if food is scarce and only found outside of refuges). Thus, under these conditions, we expect prey to evolve the use of specific cues for evaluating risk. In contrast, prey that have difficulty escaping predators should favour more general cues because they cannot afford to make the mistake of under‐responding to predators. More generally, in an uncertain environment with imprecise cues, asymmetries in the costs and benefits of under‐responding versus over‐responding should help explain the use of general versus specific cues. Our basic sensory/cognitive framework for stage 2 can also be applied to other issues about responses to HIREC. For example, all around the world, humans have provided large amounts of novel resources for herbivores in the form of crops or ornamental plants. Only a small proportion of all the herbivores that encounter these novel plants shift to utilize them. Those that have made the shift sometimes become economically important pests. Although many studies have focused on the ecology of crop pests, surprisingly few studies have examined why some herbivores use a particular crop while others, sometimes closely related herbivores, do not (but see Samways and Lockwood 1998 ). A better understanding of this issue could be useful for pre‐emptive pest management. Our framework suggests that a first step is to look at the cues herbivores use to detect and recognize a plant as a suitable host. A large literature shows that herbivore diet choice often depends on a complex blend of chemical (and in some cases, visual or textural) attractants and deterrents ( Dethier 1980 ; Futuyma and Moreno 1988 ). Interestingly, if they lack the appropriate attractants or mix of attractants, herbivores often ignore plants that they could thrive on ( Dethier 1980 ; Bruce et al. 2005 ). Our framework predicts that herbivores should be more likely to shift to use novel plants if: (i) the novel plants share similar attractants/deterrents with hosts already used by a given herbivore; or (ii) the herbivore is a generalist in the sense of having broad, catholic criteria for plant acceptability as opposed to requiring a specific, narrow set of cues (see Fig. 2 ). These points may seem obvious; however, the framework still has value in focusing attention on behavioural mechanisms and provides a good starting point for further refinement in particular systems. More broadly, the basic framework of focusing on both cue similarity and generalized/specialized use of cues can help us understand variation in adoption of any novel potential resource – good or bad. For example, it can help us understand which parasites shift to use novel hosts, which predators consume exotic prey including toxic ones like cane toads and which consumers are susceptible to being ‘trapped’ into eating inappropriate foods like plastic garbage.
The framework also suggests issues to study to explain why some organisms use human‐altered habitats and why others do not ( Gilroy and Sutherland 2007 ). Finally, a cue‐based approach can help explain variation in response to climate change. For example, while some birds have shifted their seasonal timing of breeding, others have not ( Lyon et al. 2008 ). A simple cue‐based idea is that organisms that rely primarily on photoperiod cues to set their seasonal timing will not show a plastic response to changing temperatures, while species that use temperature per se , or a combination of temperature and photoperiod cues, should exhibit a more rapid plastic response. Evolutionary history, including both adaptation and the possibility of nonadaptive phylogenetic inertia, can help us understand variation among species in their cue–response systems relative to seasonal timing of reproduction ( Hahn and MacDougall‐Shackleton 2008 ).

A quantitative framework for evaluating responses to novel cues

Above, we outlined general concepts regarding the importance of various features of cues and how they translate from the environments of organisms’ evolutionary histories to those affected by HIREC. We next outline a framework that empiricists and theoreticians can use to quantify these concepts: signal detection theory, originally proposed by Tanner and Swets (1954) . Because this theory has been extended well beyond the interpretation of signals per se (e.g. when psychologists study memory), following Macmillan and Creelman (2005) , we simply refer to the body of theory as ‘detection theory’. This theory has helped ecologists clarify a number of issues, from trade‐offs involved in foraging ( Rechten et al. 1983 ) and antipredator behaviour ( Ings and Chittka 2008 ) to the maintenance of phenotypic plasticity ( Getty 1996 ). It has also helped address a broad range of questions that play a major role in understanding responses to HIREC: Does pollution change response rates by overwhelming the animal’s sensory systems or by changing the perceived costs and benefits of behaviours ( Bushnell 1997 )? How are cues integrated ( Massaro and Friedman 1990 )? How do changes in cue intensity affect detection/recognition versus response ( Terman 1970 )? What components of the sensory and decision‐making process change after learning ( Friedman et al. 1968 )? What roles do inattention and fatigue play ( Benjamin et al. 2009 )? How do organisms seem to weigh the relative costs of Type 1 and Type 2 error ( Getty and Krebs 1985 )? Thus, by helping us model and evaluate animals’ decisions, detection theory can also help explain and predict variation in responses to HIREC. Detection theory provides two sets of methods of interest to ecologists studying responses to HIREC: a family of statistical modelling techniques that enable inferences about animals’ decision‐making processes from experimental data, and a way of determining optimal behaviour and estimated fitness under information constraints. Ecologists often use this second form of detection theory on its own ( Getty 1996 ; Rodríguez‐Gironés and Lotem 1999 ; Trimmer et al. 2008 ), but it is most powerful when combined with experiments that show how organisms actually behave (e.g. Getty and Krebs 1985 ). In detection theory models, the organism’s response is ultimately determined by where perceived cue intensity falls in relation to one or more thresholds.
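As a concrete starting point, the sketch below implements the standard equal‐variance Gaussian version of such a model (following the general treatment in Macmillan and Creelman 2005, not any calculation from this paper): perceived intensity is normally distributed, with a mean that depends on whether the stimulus is present, and the organism responds whenever perceived intensity exceeds its threshold. The threshold value is hypothetical, but it is chosen so that the resulting parameters match the organism marked ‘X’ in Fig. 3 (discriminability 2, bias −0.4), with bias measured here as the criterion location c – one of several conventional bias measures.

```python
from statistics import NormalDist

# Equal-variance Gaussian detection model: perceived intensity ~ Normal(0, 1)
# for non-stimuli and Normal(d', 1) for stimuli; respond if it exceeds a
# threshold (criterion). A sketch, not the paper's own analysis.
Z = NormalDist()  # standard normal

def response_rates(d_prime, criterion):
    """Hit rate and false-alarm rate for a given threshold."""
    hit = 1.0 - Z.cdf(criterion - d_prime)  # P(response | stimulus present)
    fa = 1.0 - Z.cdf(criterion)             # P(response | stimulus absent)
    return hit, fa

def estimate_parameters(hit, fa):
    """Recover discriminability d' and bias c from observed response rates."""
    z_hit, z_fa = Z.inv_cdf(hit), Z.inv_cdf(fa)
    d_prime = z_hit - z_fa           # separation between the two distributions
    bias_c = -(z_hit + z_fa) / 2.0   # > 0: conservative; < 0: liberal
    return d_prime, bias_c

# Forward direction: assumed d' and threshold -> predicted response rates...
hit, fa = response_rates(d_prime=2.0, criterion=0.6)
print(f"hit rate {hit:.2f}, false-alarm rate {fa:.2f}")   # ~0.92, ~0.27

# ...and inverse: observed rates -> inferred parameters, as an experimenter
# would estimate them from repeated presentations of stimuli and blanks.
print(estimate_parameters(hit, fa))   # ~(2.0, -0.4)
```

The same two functions run in both directions: forward, from assumed discriminability and threshold to predicted response rates, and inverse, from observed hit and false‐alarm rates back to d′ and bias, which is essentially how the ROC analysis described next proceeds.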
After repeatedly exposing organisms to different cues (or to different combinations of cues) under different conditions and observing their overall response rates, the researcher then plots a receiver operating characteristic (ROC) curve through the observed response rates (Fig. 3). The shape of this curve allows us to infer two parameters, discriminability and bias, that describe decision-making mechanisms (Fig. 4).

Figure 3. Three receiver operating characteristic (ROC) curves with discriminabilities of 0, 0.5 and 2. When discrimination is impossible (discriminability = 0), stimuli cannot affect behaviour, and the rate of successful detections equals the background response rate (1:1 line). As discriminability improves, these two rates can diverge and the ROC curve bows up and to the left. Organisms' response probabilities are also influenced by their response bias (the level of confidence required to induce a response), which depends on the slope of the curve. The 'X' marks the organism considered in Figs 4 and 5, with discriminability = 2 and bias = −0.4.

Figure 4. The inferred distributions of perceived intensities from stimuli (right curve) and nonstimuli (left curve) for the organism marked in Fig. 3. Discriminability corresponds to the distance between the curves (greater distance means less overlap), while bias is the strength of evidence required to provoke a response, corresponding to the relative height of the curves. The hatched areas under each curve correspond to the organism's response rate in the corresponding scenario (i.e. its x and y coordinates in ROC space).

Discriminability tells us how well the organism is able to distinguish between two environmental states. High discriminability pulls the ROC curve towards the upper left-hand corner, where the organism is able to respond appropriately at much better than random rates. When discriminability is zero, the organism's response probabilities are equivalent regardless of context. Discriminability is thus not about how often the organism responds to the cue per se, but rather how well it can identify the cue and use it to influence its response rates. For a given level of discriminability, it remains up to the organism whether it responds to the strongest 1% of stimuli it observes or the strongest 90%. This is where the second parameter, bias, comes in. Bias tells us how much information is required from a cue to induce a response, that is, where the organism falls along the ROC curve determined by its sensory system (Fig. 3). An organism's optimal bias can be calculated as the point where the marginal benefit of increased sensitivity, in terms of increased detections (reduced Type 2 error), exactly balances the marginal cost of increased false alarms (increased Type 1 error). The organism's actual bias is the log of the slope of the ROC curve at a particular point, which means ROC analysis can allow experimentalists to evaluate the perceived costs of Type 1 and Type 2 errors for their organisms in novel environments (Wickens 2001).

With information about these two parameters, ecologists can make predictions about how organisms will respond to novel stimuli. For instance, organisms without an evolutionary history of dealing with predators are unlikely to discriminate well between predatory species introduced by humans and nonthreatening native species. Detection theory allows us to quantify this and to distinguish it from related hypotheses about bias (e.g. that predatory exotic species are perceived as different but are not perceived as especially dangerous).
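Concretely, under the same equal-variance Gaussian assumption, both parameters can be recovered from a single observed pair of hit and false-alarm rates using standard formulas (Macmillan and Creelman 2005): discriminability d′ is the difference of normal quantiles of the two rates, and bias can be expressed as the criterion c, with log β = c · d′ giving the log slope of the ROC curve at that point. A sketch, with hypothetical response rates:

```python
from scipy.stats import norm

def detection_parameters(hit_rate, false_alarm_rate):
    """Estimate discriminability (d') and bias from one point in ROC
    space under the equal-variance Gaussian model; norm.ppf is the
    inverse of the standard normal cumulative distribution."""
    z_hit = norm.ppf(hit_rate)
    z_fa = norm.ppf(false_alarm_rate)
    d_prime = z_hit - z_fa            # separation between the two curves
    criterion = -(z_hit + z_fa) / 2   # evidence required to respond
    log_beta = criterion * d_prime    # log slope of the ROC at this point
    return d_prime, criterion, log_beta

# Hypothetical animal: responds on 84% of stimulus trials and 31% of
# nonstimulus trials (numbers invented for illustration).
print(detection_parameters(0.84, 0.31))  # d' ≈ 1.49, criterion ≈ -0.25
```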
Well-defended species that paid a heavy foraging cost for hiding may have a stronger bias against responding to the predators they are able to detect than lightly defended species that paid a lower cost for time spent in refuge. More generally, we expect that novel cues will show a wider range of discriminabilities than the cues with which the organism has coevolved, with some novel cues acting as supernormal stimuli that dominate animals' decisions (high discriminability) and others (e.g. cues that were absent or unimportant during evolutionary history) being ignored entirely (little to no discriminability).

Detection theory can also sharpen our intuition for how organisms will modify their behaviour in response to these changes in information quality. For example, consider two possible responses to an exotic predator that is more difficult to distinguish from nonpredators than its native counterpart. If prey register their reduced ability to detect the novel predator and maintain their level of bias, they will begin to flee habitats they would previously have considered safe, striking a better balance between missed detections and false alarms. Alternatively, if prey maintain their threshold, they avoid increased false alarms at the cost of an increased predation rate. Both of these responses are depicted in Fig. 5. Maintaining a constant threshold as discriminability declines pushes a species' position in ROC space straight down. Alternatively, maintaining a constant bias pushes the species along a curved path towards the upper right or lower left corner, where it either always responds or never responds (negative or positive bias, respectively), regardless of the true state of the environment. These different strategies have fitness consequences: if exotic predators present the same danger as native ones (i.e. merit the same response bias), adjustments will allow prey to respond approximately optimally to these novel threats; otherwise, prey may pay a foraging cost for nothing.

Figure 5. Here, the ability of the organism from Figs 3 and 4 to discriminate stimuli from nonstimuli decreases from 2 to 0.5. If the organism maintains the threshold intensity required to induce a response, its background response rate remains unchanged as it moves down through ROC space. If the organism instead takes its poorer discriminability into account and maintains a constant bias, it must adjust its response rates by following the curved arrow, as described in the text.
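The two strategies can be computed directly. The sketch below (our calculation) uses the values from Figs 3 and 5 and one common convention in which bias is measured as the criterion's distance from the midpoint between the nonstimulus and stimulus distributions:

```python
from scipy.stats import norm

def rates(d_prime, threshold):
    """Hit and false-alarm rates for a Gaussian threshold responder."""
    return norm.cdf(d_prime - threshold), norm.cdf(-threshold)

d_old, d_new, bias = 2.0, 0.5, -0.4   # values from Figs 3 and 5
threshold_old = d_old / 2 + bias      # threshold implied by the old bias

# Strategy 1: keep the old threshold despite poorer discriminability
# (false-alarm rate unchanged, hit rate falls: straight down in ROC space).
print("constant threshold:", rates(d_new, threshold_old))

# Strategy 2: keep the bias constant, letting the threshold shift
# (with a negative bias, both rates rise: towards the upper right corner).
print("constant bias:     ", rates(d_new, d_new / 2 + bias))
```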
These two scenarios represent only a small sample of the possible outcomes that could be illuminated by detection theory. Cues vary along many axes (Rowe 1999; Hebets and Papaj 2004), and detection theory provides techniques for assessing the effects of different cue components on discriminability (Wiley 2006). More generally, ecologists could study how discriminability and bias change in different situations (e.g. different cue intensities, different relationships between multiple cues, different levels of background noise or available food). This could allow ecologists interested in HIREC to answer the questions above, and more. For instance, not only could we estimate an organism's perceived costs of Type 1 and Type 2 errors in possible encounters with a predator, but we could also see how those factors change with cue intensity, when cues are masked by pollutants or when other sources of resources or stress are varied. We could also evaluate whether the changes are adaptive given the information available or whether the organism could do better. Finally, models can address how detection might evolve across generations (e.g. Oaten et al. 1975), which could help ecologists make longer-term predictions about the effects of HIREC.

Stage 3: Effectiveness of postevaluation responses

After detecting, recognizing and evaluating a novel situation, organisms still face the challenge of exhibiting an appropriate response. For instance, recognizing that a non-native predator is dangerous is a necessary, but not sufficient, step to ensure prey survival. To survive, prey must also respond appropriately to the non-native predator. Some studies have documented inappropriate prey-escape responses to novel predators. For example, native water voles in Europe have an innate fear of introduced American mink and respond by hiding in burrows. However, this response is ineffective against female mink, which are small enough to enter the burrows, so water voles still suffer heavy predation (MacDonald and Harrington 2003). As with prey evaluation of predators, we predict that the similarity of novel predators to predators that prey have experienced in the past should be critical; at this stage, however, the important issue should be the predator's foraging/attack mode and thus the prey's appropriate escape response (Sih et al. 2010). Predators that use an attack mode that is new to naïve prey should be most dangerous. For example, while going under rocks and burrowing into the substrate can be an effective response for snails against predatory fish, crayfish readily forage under rocks and in the substrate. When crayfish invade areas that have had fish but not crayfish, some snails respond to chemical cues from crayfish, but because they respond inappropriately (by going under rocks and burrowing into the substrate), they still suffer high predation rates (J. Stapley, B. C. Ajie and A. Sih, unpublished data).

As with detection/recognition, a second key issue is whether prey use generalized or specialized antipredator responses (Lima 1992; Matsuda et al. 1994; Sih et al. 1998, 2010). Examples of specialized prey responses include microhabitat shifts or escape behaviours that are highly effective against some predators but unfortunately increase susceptibility to another species (Kotler et al. 1992; Warkentin 1995; Sih et al. 1998; Relyea 2003). For example, mayflies that flee bottom-foraging stonefly predators by entering the water column experience an increased chance of fish predation (Soluk and Collins 1988). Although prey might have evolved to adaptively balance the conflicting demands of responding to multiple native predators, it would not be surprising if prey often exhibit inappropriate specialized responses to an exotic predator. A generalized response might then be favoured, even if it is less effective than a given specialized defence, if it is at least somewhat effective against most predators. An example of a generalized antipredator response is reduced activity (along with hiding in refuge), which should generally reduce predator encounter rates. Prey that rely on more generalized antipredator behaviours may be more likely to respond effectively to a novel predator.

Parallel issues arise with other forms of HIREC. For example, naïve herbivores may recognize that novel plants are a potential resource, but this is only the first step in adopting the new host.
Novel plants may lack deterrents and present chemical attractants that induce naïve adult herbivores to oviposit on them, but whether a herbivore successfully adopts the new plant depends on several further conditions: whether the new host has the correct attractants (and lacks deterrents) that induce larvae to feed, whether larvae have the correct physiology and biochemistry to thrive on the new plant, and whether the new plant also provides safety and shelter from enemies and abiotic stressors. In essence, successful use of new hosts requires both the recognition that the new plants are potential hosts and a positive 'preference–performance correlation' (Bossart 2003). Herbivores often exhibit a positive preference–performance correlation with plants from their evolutionary past (Sih and Christensen 2001), but we would not be surprised to see mismatches (evolutionary traps) with novel hosts. As a generality, generalist herbivores might be likely to inappropriately use novel plants that they cannot handle, while most herbivores, particularly specialists, might often ignore plants on which they could, in fact, thrive.

Learning

Up to this point, our discussion has focused on variation in immediate behavioural responses to novel situations. Even if animals do not respond well immediately to a novel situation, as long as they survive the initial exposure, they have the opportunity to learn and thus improve their ability to cope with HIREC. Virtually all animal species can learn, that is, change their patterns of response to external cues through experience. Many studies have shown that learning allows individuals to identify new food sources (Galef 1988) and new predators (Brown and Chivers 2005), to differentiate suitable from unsuitable habitats or mates (Dugatkin and Godin 1992) and even to adjust their phenology (Grieco et al. 2002). Hence, the ability of species to adjust their behaviour under new environmental conditions will greatly affect their success.

Learning related to dangerous and potentially lethal situations (learned predator recognition and conditioned taste aversion) is widespread and usually highly efficient, often requiring only a single exposure (Garcia et al. 1966; Griffin 2004; Ferrari et al. 2010b). Such learning may allow enhanced recognition of many potential predators and noxious food items to avoid in the future. The downside, however, is that these learned responses are often generalized to similar predators or food items, potentially resulting in time wasted avoiding nondangerous stimuli or in lost opportunities to use valuable resources. Another point to consider with this type of learning is that while it allows for the recognition of novel stimuli, it does not necessarily teach the animal how to respond to them (Sih et al. 2010). For example, most of the literature on antipredator responses of prey to novel predators has focused on the ability of prey to learn to recognize novel predators, but very little is known about the ability of these individuals to successfully avoid or evade those predators. Learning to recognize an exotic predator is good, but not enough unless it is also paired with learning an effective way to avoid, escape or survive a predatory encounter. Whether this disconnect represents a research bias or an actual lack of connection between learning (information input) and behavioural repertoire (behavioural output) is unknown.
Learning through trial and error (associative learning, operant conditioning, peak shifts) and problem solving, often used in a foraging context, is costly in time and energy, but necessary for the discovery of new locally adaptive behaviours (Boyd and Richerson 1996). However, not all learning mechanisms allow for ongoing improvement within an individual lifetime. Imprinting, for example, often involves learned preferences that are acquired early in life, with no further adjustment later in life. In that case, trial-and-error adjustments can take place over multiple generations, but not within an individual's lifetime. This type of learning seems unlikely to allow species to adapt well to rapidly changing, novel environmental conditions. For example, imprinting often forms the basis of crucial behaviours such as the ability to distinguish conspecifics from heterospecifics. While not all species rely on imprinting for mate/conspecific recognition, some certainly do (e.g. many birds). The level of sophistication of the cues used for conspecific recognition should reflect the amount of selection for reproductive isolating mechanisms experienced by the species. Species that have evolved in a low-biodiversity environment may use general conspecific cues, which should be effective as long as no other species possessing similar characteristics are encountered in the habitat. However, recent human-induced range shifts and invasions have allowed new species to co-occur, and this has sometimes resulted in hybridization and biodiversity loss, as was the case between introduced mallards and the native Hawaiian duck (Rhymer and Simberloff 1996).

Learning to ignore novel stimuli, that is, habituating to novel yet nonthreatening cues, can also play a major role in determining which species can adapt to HIREC. Human disturbance associated, for example, with urbanization or eco-tourism is a well-known source of stress that can lead to decreased fitness through reduced foraging, nest abandonment or decreased parental care. While some species have learned to ignore humans, others do not seem to habituate to increased human disturbance, which leads to dramatic decreases in fitness (Kerley et al. 2002; Thomas et al. 2003; Yasue 2005).

While individual learning allows individuals to improve their responses to novel environmental conditions, the population as a whole may benefit more from this individual discovery if a new behaviour is transmitted horizontally to other conspecifics and vertically to the next generation. Cultural transmission (e.g. social learning) allows for the spread of a new behaviour or strategy at lower cost, assuming the learned behaviour is adaptive and the learner is in fact properly copying the tutor (Galef 1988, 2003). If environmental conditions change rapidly, game theory (Boyd and Richerson 1996) predicts that the most successful populations will be those that can 'inherit acquired information' through a mix of both individual learning, which maintains a source of new locally adaptive behaviours, and cultural transmission. Species with overlapping generations have the ability to transfer acquired information vertically, while species with discrete generations do not. Thus, more social species with overlapping generations and parental care [particularly with parent-offspring teaching (Caro and Hauser 1992)] might respond better to HIREC than less social species with discrete generations.
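The flavour of this prediction can be captured in a toy simulation in the spirit of Boyd and Richerson (1996); the model structure and all parameter values below are our own illustrative assumptions. Individual learners always match the current environment but pay a learning cost, while social learners cheaply copy a random member of the previous generation and so risk acquiring outdated behaviour when the environment changes quickly.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_payoffs(p_change, frac_social=0.5, pop=1000, generations=2000,
                 benefit=1.0, learning_cost=0.3):
    """Average payoffs of individual vs social learners when the
    environment flips state with probability p_change per generation.
    (Toy model; all parameters are arbitrary illustrations.)"""
    env = 0
    prev = np.zeros(pop, dtype=int)   # behaviours of the last generation
    ind_pay, soc_pay = [], []
    for _ in range(generations):
        if rng.random() < p_change:
            env = 1 - env                              # environment shifts
        n_social = int(frac_social * pop)
        social = prev[rng.integers(0, pop, n_social)]  # copy a random tutor
        individual = np.full(pop - n_social, env)      # learn directly
        ind_pay.append(benefit - learning_cost)
        soc_pay.append(benefit * np.mean(social == env))
        prev = np.concatenate([individual, social])
    return np.mean(ind_pay), np.mean(soc_pay)

print(mean_payoffs(p_change=0.05))  # slow change: copying pays
print(mean_payoffs(p_change=0.5))   # rapid change: copies are often outdated
```

With both strategies present, a population can track rapid change through its individual learners while spreading their discoveries cheaply through its social learners.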
The effect of evolutionary history on learning

Evolutionary history has shaped learning not only in the sense that some animals have evolved to be generally better at learning than others, but also in the sense that animals have evolved to learn more readily in some situations than in others and to learn some specific tasks or associations more easily than others. Habitat heterogeneity appears to play an important role in selecting for these learning abilities. In highly variable environments, new conditions may call for new locally adaptive behaviours, and the species best able to find those locally adaptive solutions will be the ones most likely to survive and thrive in altered environments. HIREC can be seen as a new source of heterogeneity, or challenge, for species.

Thorndike (1935) pointed out that trial-and-error learning occurs fastest when animals are motivated, prepared to learn and paying attention to the relevant cues, and thereby identified the importance of biologically prepared learning. This reflects the notion that all types of learning fall on a preparedness continuum, ranging from prepared learning (a predisposition that makes particular associations easy to learn) to contraprepared learning (predispositions that make particular associations difficult to learn). Such biases in the learning abilities of different species are a direct result of their evolutionary history (Seligman and Hager 1972) and can explain the inability of some species to learn to respond to cues presented in an evolutionarily novel context. For example, many migratory species have the ability to spatially shift their habitat preference if local environmental conditions are not optimal, but will rarely shift their timing of migration. Because of this predisposition to respond to spatial, rather than temporal, cues in the face of climate change, it is possible that shifts in phenology will be observed as a spatial shift (e.g. breeding grounds shifting poleward) rather than a temporal shift (e.g. breeding in the same place but 2 weeks later).

Intuitively, we might think that HIREC should favour the evolution of increased learning; however, the evolutionary forces behind selection for plastic learning are complex. For example, Grieco et al. (2002) showed that blue tits learned the seasonal timing of food peaks and adjusted their breeding accordingly: when they were provided with earlier or later food peaks, they laid their eggs correspondingly earlier or later the following year. This adjustment is favoured only if the conditions from one year carry over to the following year (e.g. a warm year is followed by another warm year). If this is not the case (if a warm year is followed by a cold year), animals may learn and shift their phenotype inappropriately. Visser (2008) argues that learning of this sort has evolved as a response to spatial variation, where there is strong year-to-year consistency. Two general points about evolutionary history and learning are that: (i) animals that evolved to learn and adjust their behaviour in response to predictable environments will likely exhibit inappropriate adjustments when exposed to environments where cues no longer predict future conditions well; and (ii) animals that evolved in inconsistent, unpredictable conditions will likely not learn and adjust using environmental cues, even if they are now in environments where learning should be favoured. In addition, even if conditions are predictable enough to favour learning, learning might not be favoured if collecting information is too costly (e.g. if sampling is dangerous; Sih 1992).
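A simple numerical sketch (our illustration, not a model from the studies cited above) makes the predictability point concrete: in an autocorrelated (AR(1)) environment, predicting this year's optimum from last year's conditions beats ignoring the cue and using the long-term mean only when year-to-year autocorrelation is strong (here, above 0.5).

```python
import numpy as np

rng = np.random.default_rng(2)

def prediction_errors(autocorr, years=100_000):
    """Mean squared error of two timing strategies in an AR(1)
    environment with stationary variance 1: 'learn' predicts this
    year's optimum from last year's, while 'fixed' always uses the
    long-term mean of 0. (Toy sketch with arbitrary scaling.)"""
    noise = rng.normal(0.0, np.sqrt(1.0 - autocorr ** 2), years)
    optimum = np.zeros(years)
    for t in range(1, years):
        optimum[t] = autocorr * optimum[t - 1] + noise[t]
    learn_error = np.mean((optimum[1:] - optimum[:-1]) ** 2)  # ~ 2(1 - autocorr)
    fixed_error = np.mean(optimum[1:] ** 2)                   # ~ 1
    return learn_error, fixed_error

for rho in (0.0, 0.5, 0.9):
    print(f"autocorrelation={rho}: {prediction_errors(rho)}")
```

When years are uncorrelated, the learner does roughly twice as badly as the fixed strategy, the inappropriate shifting described above; only strong temporal consistency makes last year's conditions a cue worth using.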
Multiple stressors, multiple traits and multiple responses

Above, we examined aspects of HIREC one at a time. In reality, however, organisms usually face the substantially more complex challenge of coping simultaneously with multiple stressors associated with multiple aspects of HIREC. Species declines are often caused by the combined negative impacts of these multiple stressors. Amphibian declines, for example, have been associated with habitat loss, a barrage of novel enemies (diseases, predators and parasites), contaminants and climate change (Blaustein and Kiesecker 2002; Blaustein and Bancroft 2007). Even if tadpoles exhibit adaptive responses to one novel stressor, they may decline because of poor performance relative to another (e.g. Rogell et al. 2009). Worse yet, the various novel anthropogenic stressors can have negative synergistic effects with each other and with natural stressors. For example, while many tadpoles have adaptations to cope with invertebrate predators and can cope physiologically with low concentrations of pesticides, when they are exposed to both, they show particularly poor survival (Relyea and Mills 2001; Rohr et al. 2006). Along similar lines, an adaptive response to one stressor can expose organisms to another stressor that then causes declines. In response to exotic goats that cause habitat degradation, a species of lark has shifted to feeding in human habitats, where it suffers increased exposure to disease (Carrete et al. 2009). In many cases, then, animals face conflicting demands from multiple stressors.

To explain why some organisms cope better than others with multiple stressors, we thus need a better understanding of the multiple traits and responses involved and of how these multiple responses interact. Ghalambor et al. (2007) called this the mosaic nature of plasticity and evolution. For salmon, for example, an adaptive response to climate change requires plasticity in timing of migration, spawning, egg and juvenile growth rates, thermal tolerance and disease resistance, with the possibility of conflicting selection pressures at different life history stages (Crozier et al. 2008). Given that evolution might have shaped an adaptive, integrated, multi-trait response to multiple natural stressors, when do we expect organisms to be able to co-opt their evolved integrated phenotype to cope well with a novel mix of old and novel stressors? In particular, while animals might have evolved to use a mix of responses to balance conflicting demands well in their natural habitats (Sih et al. 1998; Relyea 2001), their new challenge is to recognize and evaluate cues from multiple stressors and produce an integrated, multi-trait response that balances these multiple, often conflicting, demands. To date, there has been little work on this more complex response to HIREC.
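A first step in studying stressor combinations is a null model for independent action, against which synergy can be diagnosed. A minimal sketch with hypothetical survival values (the numbers below are ours, not data from the studies cited above): under independence, survival with both stressors should be roughly the product of the single-stressor survival rates.

```python
# Hypothetical single-stressor survival rates (illustrative only).
s_predator = 0.80    # survival with invertebrate predators alone
s_pesticide = 0.90   # survival with a low pesticide concentration alone

# Multiplicative null model: expected survival if the two stressors
# act independently.
expected_both = s_predator * s_pesticide   # 0.72

observed_both = 0.40   # hypothetical observed survival with both stressors
if observed_both < expected_both:
    print(f"synergy: observed {observed_both:.2f} < expected {expected_both:.2f}")
```

Observed survival well below the multiplicative expectation, as in the tadpole example above, is the signature of a negative synergistic interaction.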
Behaviour, future evolution and effects on species persistence

Up to this point, we have focused on immediate, short-term behavioural responses to HIREC and their role in allowing species to get through the initial crunch. What about future evolution (of behaviour and other traits) and long-term species persistence? With regard to the general issue of the evolution of plastic responses to environmental change, Ghalambor et al. (2007) outlined several possibilities. If organisms immediately exhibit optimal behavioural responses to HIREC, then there should be stabilizing selection on behaviour and no need for further adaptive evolution. Alternatively, many animals appear to show adequate behavioural responses to HIREC, but with room for further directional selection and evolution. Models of this situation (e.g. Price et al. 2003) indicate that intermediate levels of plasticity are often best for avoiding long-term extinction. This form of imperfect 'pre-adaptive' plasticity has also been shown to be important empirically, as when Yeh and Price (2004) demonstrated that plastic changes in reproductive effort contribute substantially to maintaining positive population growth rates. Yet other animals exhibit essentially maladaptive responses to HIREC (e.g. preferences for toxic habitats or food) that place species in danger of extinction. In these situations, populations presumably experience strong selection for improving their behavioural response to HIREC, but are also at risk of going extinct before they evolve adaptive responses.

What do empirical data say about contemporary evolution of improved behavioural responses to HIREC? While there is a reasonably extensive literature on contemporary evolution, much of it in response to HIREC (Strauss et al. 2006; Hendry et al. 2011; Lankau et al. 2011), relatively few studies have focused on the evolution of behavioural responses to HIREC. With regard to the more general issue of the evolution of plasticity in response to environmental change, Crispo et al. (2010) recently reviewed 20 studies and found that different taxa have evolved either increases or decreases in plasticity in response to HIREC, indicating that the links between environmental change and evolutionary response are context-dependent. Our focus, however, is not just on the evolution of the degree of plasticity, but in particular on the pattern of plasticity, including both immediate behavioural responses to HIREC and longer-term evolutionary changes in behaviour. In the context of artificial selection and domestication, it is well known that human-induced changes in selection regimes can drive rapid behavioural evolution, for example, rapid evolution of increased tameness, boldness or aggressiveness (Price 1984; Conrad et al. in press). More studies, however, are needed on how selection caused by other, inadvertent aspects of HIREC shapes behavioural evolution. In particular, despite some published theory (e.g. Price et al. 2003) and empirical examples (Schlaepfer et al. 2005; Visser 2008), more work is needed on the role of plasticity and plasticity evolution in shaping the overall response to HIREC and thus facilitating long-term species persistence in the face of HIREC.

To give species time to evolve, several management strategies have been suggested for enhancing species persistence while they evolve in response to HIREC (Schlaepfer et al. 2005). One possibility is to mitigate the negative impact of HIREC with spatial or temporal refugia that allow organisms to more effectively hide from or avoid novel environmental stressors. This could allow partially effective responses to evolve and also extend the time to extinction. Alternatively, managers might implement separate actions to increase population growth that could help stave off extinction and facilitate evolution (see Lankau et al. 2011).
Finally, genetically or behaviourally savvy individuals from other populations could be introduced to provide useful phenotypic variants. There will rarely be a single prescription that works across contexts: species with low genetic variation (a common problem after a population bottleneck associated with HIREC) might rely more heavily on plasticity to survive HIREC (Dybdahl and Kane 2005; Strauss et al. 2006), while other species may be able to wait for evolution to run its course.

Beyond the evolution of behavioural responses to HIREC per se, behaviour can also shape the overall evolutionary response to HIREC, that is, the evolution of both behaviour and other traits. One well-recognized possibility is that 'adaptive' behaviour can compensate for other suboptimal traits. Even if animals are not well defended morphologically against predators, they may be able to compensate by hiding effectively, resulting in little or no selection favouring morphological evolutionary responses to predation (DeWitt et al. 1999). Adaptive compensatory behaviour can thus slow the evolution of other traits in response to HIREC. Alternatively, behavioural compensation (and other forms of plasticity) can enhance the evolution of other traits via the Baldwin effect (Baldwin 1896; Wcislo 1989; Crispo 2007), where adaptive behaviour compensates for other nonadaptive or maladaptive traits enough to allow these other traits to evolve. In theory, the process of genetic assimilation can then convert nonheritable plasticity into heritable variation that allows further evolution (Price et al. 2003). Behavioural plasticity can not only allow species to persist better in their current (changed) habitat, but can also facilitate colonization of new habitats; that is, it can enhance gene flow (Crispo 2008), which then has further evolutionary effects. The colonization of new areas or niches can then result in either new opportunities for speciation (West-Eberhard 2003) or increased hybridization and the breakdown of existing species barriers (Taylor et al. 2006). Given that initial behaviour can produce such diverse and important evolutionary outcomes, there is clearly a need for a better, ideally predictive, understanding of variation in behavioural responses to HIREC.

The processes we have described here are often quite complex, and exploring them in individual systems can require multiple detailed studies. One good example is Visser's (2008) discussion of the evolution of plastic responses to climate change, which relied on insights and data from multiple long-term research programs studying different bird species and their environments, including temperature, diet, pedigree and other data. The pay-off for such studies can be great, however; Visser was able to assess the relative importance of plasticity, selection, maternal effects and immigration in the systems he studied, and to put together a cohesive picture of their likely responses to future climate change. This level of detail takes work, but it can improve both the confidence and the sophistication of our predictions.

Evolution and species persistence can be tightly linked. Strong selection in the sense of high mortality can drive rapid evolution, but, as noted previously, it can also rapidly drive species towards extinction. In this scenario, common in the modern world, species will persist only if adaptive evolution is fast enough to save them from extinction. Existing models examine factors that affect the likelihood that this will occur (e.g. Gomulkiewicz and Holt 1995).
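The core logic of such models can be sketched in a few lines; the following caricature (ours, with entirely illustrative parameter values, not the published model) tracks a population that declines after an abrupt environmental shift and persists only if trait evolution restores positive growth before numbers fall below a critical threshold.

```python
from math import exp

def evolutionary_rescue(n0=500, optimum=3.0, h2=0.5, width=2.0,
                        max_growth=1.5, n_crit=50, generations=100):
    """Deterministic caricature of evolutionary rescue: Gaussian
    stabilizing selection around a shifted optimum, density-independent
    growth, and a crude trait response proportional to the distance
    from the optimum (a loose stand-in for the breeder's equation)."""
    n, z = float(n0), 0.0            # population size, mean trait value
    for t in range(generations):
        w = max_growth * exp(-(z - optimum) ** 2 / (2 * width ** 2))
        n *= w                        # mean fitness sets this generation's growth
        if n < n_crit:
            return t, 'extinct'       # dropped below the critical size
        z += h2 * (optimum - z) / width ** 2   # trait evolves towards the optimum
        if n >= n0:
            return t, 'rescued'       # recovered to the initial size
    return generations, 'undecided'

print(evolutionary_rescue())          # fast evolution: rescue in time
print(evolutionary_rescue(h2=0.05))   # slow evolution: extinction comes first
```

Whether the trajectory ends in rescue or extinction depends on the race between the decline in numbers and the rate of adaptive evolution, which is precisely where plasticity enters the newer models discussed next.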
Recent models of these joint ecological and evolutionary dynamics have finally included plasticity (Chevin and Lande 2009; Chevin et al. 2010). Future models of these dynamics that incorporate more realistic behaviours and behavioural evolution should prove insightful, while further discussions with decision-makers and applied ecologists (e.g. Schlaepfer et al. 2010) will contribute to our ability to influence these processes on the ground.

Concluding remarks

Our overall goal is to enhance our understanding of how evolutionary history has shaped animal sensory/cognitive systems, in order to better predict which species will have problems coping with specific aspects of HIREC and, conversely, which have the potential to become pests. For any given system, studies should: (i) focus on key limiting aspects of HIREC; (ii) identify key behaviours that explain the animal's ability to cope with those novel, limiting factors (Sih and Gleeson 1995); (iii) analyse the sensory/cognitive ecology underlying the key behaviours; and (iv) determine how evolutionary history might explain variation among and within species in both the key behaviours and their underlying sensory/cognitive ecology. Further development of a framework like detection theory is needed to give us quantitative tools for relating cues and cue use to responses to novel conditions. More comparative studies are needed to test ideas on how evolutionary history might shape sensory/cognitive ecology, learning and the resulting responses to HIREC. With regard to future evolution in response to HIREC, more data are needed on genetic variation within and between populations and on selection on key behavioural responses to HIREC. Ideally, this would include information on genetic correlations with, and selection on, other traits in a multi-stressor context. The goal would be to understand how the overall suite of traits was shaped by past evolution, and how that suite explains both the initial ability to cope with multiple aspects of HIREC and the ongoing evolution of an integrated response to ongoing change. Finally, more models and empirical studies are needed to better understand how behavioural plasticity influences evolution and how evolution might, in turn, help species escape extinction. Clearly, all of these are difficult intellectual challenges for evolutionary behavioural ecologists; however, they represent exciting, major contributions that the field can offer to applied evolution and ecology in the modern world.
