Sweet Emotion: The Role of Odor-induced Context in the Search Advantage for Happy Facial Expressions

Abstract

The current study investigated the extent to which the concurrent presentation of pleasant and unpleasant odors could modulate the perceptual saliency of happy facial expressions in an emotional visual search task. Whilst a search advantage for happy faces was found in the no odor and unpleasant odor conditions, it was abolished under the pleasant odor condition. Furthermore, phasic properties of visual search performance revealed the malleable nature of this happiness advantage. Specifically, attention towards happy faces was optimized at the start of the visual search task for participants presented with pleasant odors, but diminished towards the end. This pattern was reversed for participants in the unpleasant odor condition. These patterns occur through the emotion-inducing capacities of odors and highlight the circumstances in which top-down factors can override perceptually salient facial features in emotional visual search.

Introduction

Odors have been shown to interact centrally with a range of biological and cognitive processes (Bensafi et al. 2002; Li et al. 2007; Moss et al. 2008; Moss et al. 2010), including their potent ability to unlock our seemingly forgotten memories (Chu and Downes 2000). Their subjective pleasantness or unpleasantness affects not only how we feel (Black 2001), but also the quality of our emotional attachments to other humans (Sookoian et al. 2011). Their ability to evoke approach and avoidance affective reactions helps to mobilize an organism for “fight or flight” action through the neural interconnections between the olfactory receptors and the brain’s emotion processing hub, the amygdala, which is located only one synapse away (Boesveldt et al. 2010). In humans, further pathways from the amygdala feed into an intricate network implicated in the visual analysis of faces in the occipitotemporal cortex (inferior occipital gyrus and superior temporal sulcus; Adolphs 2002; Damjanovic et al. 2017), thus allowing for the rapid, emotional appraisal of the most socially important visual cue in our environment: the human face.

Whilst the insula, amygdala, primary sensory, and orbitofrontal cortical regions in the human brain are involved in the perception of aversive stimuli across all 5 sensory modalities, a particularly elusive issue is whether olfaction and face perception influence each other in emotion-specific ways (Phillips and Heining 2002). Leppänen and Hietanen (2003) provide one of the first attempts to systematically test the level of specificity in olfactory-visual processing of affective information. In their study, healthy participants completed a forced-choice decision task categorizing facial expressions of happiness and disgust taken from the Ekman and Friesen (1976) database. Each facial expression was presented individually at central fixation on the computer screen, and participants were required to identify, by button press, as quickly as possible the emotional expression portrayed. In a between-subjects design, one group of participants performed the task whilst exposed to a pleasant odor, a second group whilst exposed to an unpleasant odor, and a third group under neutral (i.e., no odor) conditions.
Whilst there was an overall advantage for categorizing happy facial expressions, this varied as a function of the odor context, such that the categorization of happy faces was facilitated in the context of a pleasant odor relative to the no odor control condition, but impaired in the presence of an unpleasant odor. The different odor contexts did not affect the processing of facial expressions of disgust. These findings suggest that whilst some facial expressions may be easier to recognize on the basis of unique low-level features, such as the brightness of a smile in happy facial expressions, their perception is nonetheless affected by the context in which they are encountered. In the case of Leppänen and Hietanen’s (2003) findings, the authors propose that the improved recognition of happy faces in the pleasant odor context is achieved by increasing the accessibility of positive emotions, which in turn enhances the perceptual processing of emotion-congruent aspects of the facial signal.

Building on this important work, Leleu et al. (2015) discovered that some emotional expressions are affected more strongly by different odor contexts than others. For instance, facial expressions of anger and disgust were perceived correctly at lower stimulus intensities when presented in an aversive odor context (i.e., butyric acid) than in both the pleasant (i.e., strawberry) and no odor contexts. The perception of happiness was achieved at lower stimulus intensities when presented in the pleasant odor context than in the control and aversive contexts. However, participants were not significantly influenced by the different odor contexts in their perceptual judgments of fear and sadness. Whilst the facilitative effects found for happy facial expressions paired with the pleasant odor are consistent with the view that odor contexts can improve access to conceptual and/or emotional structures of affective stimuli at ambiguous low stimulus intensities, such access may not always operate in a category-specific way. Thus, in some instances aversive odor contexts can facilitate the perception of some negative emotional expressions, such as anger and disgust, but not others, such as sadness and fear.

However, investigations focusing on the neural basis of this response facilitation by odorant primes have produced somewhat equivocal results. For example, Seubert et al. (2010a, 2010b) utilized a repeated measures design for the administration of pleasant, unpleasant, and neutral odors whilst participants categorized happy, disgusted, and neutral facial expressions. In contrast to Leppänen and Hietanen’s findings, the study by Seubert and colleagues found facilitated response times for facial expressions of disgust irrespective of the emotional valence of the odorant prime. For happy faces, the effects of the different odorant primes were less consistent: in some instances the effects on reaction times were nonsignificant (2010a), yet in others reaction times to happy faces were considerably impaired for both pleasant and unpleasant relative to the neutral odorant (2010b). This behavioral facilitation for facial expressions of disgust corresponded to neural modulations in the fusiform gyrus, middle frontal gyrus, and middle cingulate gyrus, with category-specific modulations found for disgust face-unpleasant odor pairings in the anterior insula.
Thus, whether the odorant is pleasant or unpleasant, its effect on vision appears to be highly specialized, facilitating the perception of social cues that literally convey “bad taste” (i.e., disgust). Determining whether odor contexts can modulate emotion perception in a category-specific manner is likely to be influenced by a range of factors, from the experimental design and the dependent variables of interest within a given study (e.g., accuracy, response times, self-report ratings, etc.) to the ontological properties of the odorants themselves. For instance, Zhou and Chen (2009) created their odor contexts from sweat samples collected from participants whilst they watched video segments selected to induce fear, and found that participants were more likely to judge an ambiguous facial expression as displaying fear when they were exposed to the chemosignal of fearful sweat, as compared to the control pad. Thus, the perception of low intensity fearful expressions appears to be susceptible to odor facilitation when the context is created from fear-related chemosensory stimuli with socio-communicative functions (i.e., body odors) rather than common odors. This may partly be due to differential processing between common odors and body odors, with research by Lundström et al. (2008) showing how body odors activate brain networks consisting of the posterior cingulate cortex, occipital gyrus, angular gyrus, and the anterior cingulate cortex, a network typically implicated in the processing of emotional stimuli and the regulation of attentional resources (e.g., Botvinick et al. 1999; Maddock 1999), whilst deactivating other regions that have previously been linked to olfactory perception of common odors (e.g., piriform cortex and orbitofrontal cortex).

Whilst a number of methodological issues could account for the discrepancy in results between the work of Leppänen and Hietanen (2003) and Seubert and colleagues (2010a, 2010b), a key issue that both studies agree on is the need to test the effects of odorant primes under more complex face processing tasks. Asking participants to categorize a single facial expression presented at a fixed central location is not likely to exert a particularly demanding constraint on attentional resources, especially when response categories are explicitly primed (see also Leleu et al. 2015). Indeed, such experimental tasks result in ceiling levels of performance which are likely to mask any contextual effects provided by the odorant primes. To further clarify the role of odorant primes in face processing, it is important to investigate how they affect the spatial distribution of attentional resources. This is an important issue to address given that a considerable amount of our everyday attentional processing of facial expressions occurs in the context of surrounding facial expressions.

The obvious ecological appeal of studying how we detect positive and negative facial expressions in a crowd of faces has been measured with the face in the crowd effect (FICE) paradigm. Modelled on classic principles of visual search (e.g., Treisman and Gelade 1980), the FICE paradigm involves the presentation of a target face against an array of competing distracter faces on the computer screen. On some trials all the faces in the display show the same emotional expression, whereas in others, one face differs in emotion from the remaining faces in the crowd. Participants are instructed to discriminate between the “same” and “different” trials via a response key.
The main independent variable of interest is the manipulation of the target face on these “different” display trials. Using response time and accuracy to detect the discrepant face in the display, some of the earliest FICE findings showed that participants were faster and more accurate in shifting their attentional resources towards the target face when it portrayed an angry facial expression than a happy one (e.g., Hansen and Hansen 1988; Öhman et al. 2001). Referred to in the FICE literature as the anger or threat superiority effect (e.g., Fox and Damjanovic 2006; Pinkham et al. 2010), this detection advantage is often attributed to an evolutionarily driven neural mechanism that enables rapid deployment of attentional resources to stimuli that signal immediate danger and attack in the observer’s visual environment (Öhman et al. 2001; Öhman and Mineka 2001), and it can be heightened even further through threat-relevant training (e.g., Damjanovic et al. 2014). Thus, when attentional resources are relatively fixed, as in the categorization tasks used by Leppänen and Hietanen, happy faces show a processing advantage over negative expressions such as disgust. However, under greater attentional competition, as measured by the FICE, angry faces, not happy ones, yield the processing advantage.

The predominance of threat superiority findings using the FICE has, however, waned in recent years. This has been mainly due to an increasing number of studies documenting how happy faces, not angry ones, are detected faster and with greater accuracy, yielding what is referred to in the literature as the happiness advantage or the happiness superiority effect (e.g., Juth et al. 2005; Calvo and Nummenmaa 2008; Lipp et al. 2009; Damjanovic et al. 2010; Damjanovic and Santiago 2016). Although more and more studies are trying to increase the ecological validity of FICE tasks by using photographic facial expressions from established databases rather than schematic drawings, it appears that most of the inconsistencies found in such studies can be accounted for by the presence or absence of low-level facial features across different database types (e.g., Savage et al. 2013). Recently referred to as the “teeth visibility hypothesis” (Horstmann et al. 2012), this perceptual account states that the presence of exposed teeth is a salient facial feature that drives the advantage of happy faces over angry faces, so much so that systematic manipulations of this facial component can reliably predict which target face is detected most efficiently: when happy facial expressions are conveyed with a “toothy grin” whilst angry faces are conveyed with a closed mouth, happy faces are detected more efficiently. Conversely, when angry facial expressions are conveyed with a “toothy snarl” whilst happy faces are conveyed with a closed smile, angry faces are detected more efficiently.

However, further studies on this specific issue have questioned the extent to which a perceptual account can exclusively accommodate the detection advantage for happy facial expressions. Using computer-generated facial expressions of anger and happiness embedded within the FICE, Becker and colleagues (2011) reported more efficient detection times for happy face targets even when the amount of perceptual information was identical between angry and happy faces. Providing the following evolutionary and affective accounts, Becker et al.
argue that happy facial expressions have evolved to be highly visually salient in our environment, as a means of alerting us to important social affiliation cues required to facilitate group membership and integration. Happy facial expressions have therefore become serviceable for the specific purpose of signaling friendship under a range of circumstances, including their detection across long distances (e.g., Hager and Ekman 1979) and in instances when the emotion is less intensely expressed (Becker et al. 2011; Becker et al. 2012). As such, happy faces are encountered with greater frequency in our social environment relative to negative emotions (e.g., Bond and Siddle 1996; Whalen 1998), which in turn biases our expectancy for positive outcomes over negative ones (e.g., Diener and Diener 1996; Chipchase and Chapman 2007). A direct consequence of such frequency effects is that positively laden affective information becomes preferentially processed over negative information. However, when the competing negative affect is overly arousing, any attentional bias towards happy faces may diminish, opening up the prioritization of threat-specific cues, such as angry faces, instead.

To summarize, categorization tasks focus on emotion perception under fixed attentional demands, whereas FICE tasks are mainly concerned with how attention is distributed across several facial expressions. However, both methodologies have attracted considerable theoretical debate in terms of whether the processing of facial expressions of emotion can be more appropriately accounted for by perceptual-based explanations or affective ones. The current study makes a new contribution to this area by using the contextual cues created by odors, which have been extensively applied in categorization tasks, but never included in tasks measuring spatial attention performance with emotional faces. The main aim of the study is to investigate for the first time the effects of different odorant primes on the happiness superiority effect using the FICE task.

The present study

As noted in the above review, the happiness superiority effect can to a large extent be determined by the type of stimuli used in FICE display trials (e.g., Juth et al. 2005; Becker et al. 2011; Becker et al. 2012). The FICE task used in the current study was selected to satisfy 2 important aims: to elicit a consistent happiness superiority effect within the participant sample recruited for the study, and to be sufficiently complex in task demands to allow the different odorant primes to take effect (e.g., Leppänen and Hietanen 2003; Seubert et al. 2010a, 2010b). The FICE task developed in some of our earlier work (Damjanovic et al. 2010) satisfies these criteria, demonstrating a robust happiness superiority effect in native English-speaking Caucasian participants across 3 experiments, although for some variants of the FICE task the detection of happy face targets was easier than for others. The current study used the “crowd” variant of Damjanovic et al.’s FICE task, with angry, happy, and neutral face stimuli taken from the Caucasian set of Matsumoto and Ekman’s Japanese and Caucasian Facial Expressions of Emotion (JACFEE) database. Developed in 1988, the database was validated using the Facial Action Coding System (FACS), a technique that enables the objective measurement of facial muscle innervations specific to the emotion portrayed (Ekman et al. 2002).
This allowed the facial expressions to be carefully matched for signal clarity and intensity across the different emotional categories (Matsumoto 2002). The happy facial expressions in Matsumoto and Ekman’s database were posed with “toothy grins”, whilst all of the angry face exemplars were posed with a downturned, closed mouth; thus the happiness superiority effect found in Damjanovic et al.’s study with their Caucasian participants could be accounted for in terms of the teeth visibility hypothesis (Horstmann et al. 2012). The key research question addressed in the current study is whether such a perceptual-based explanation of the happiness superiority effect measured by the FICE can operate independently of an affectively valenced environment. Specifically, we hypothesized that if the underlying mechanism of the happiness superiority effect is a perceptual one, the effect should remain stable across the different odor contexts (Juth et al. 2005; Calvo and Nummenmaa 2008; Calvo and Marrero 2009; Damjanovic et al. 2010; Becker et al. 2011). This hypothesis was addressed by comparing FICE performance in a no odor (i.e., control) group with that of participants who performed the task under different affectively valenced environments created by the concurrent presentation of pleasant or unpleasant odorant primes. The experiment used a between-subjects design and long-term odorant exposure in order to assess whether exposure to the odorants influenced the emotional state of the participant during the FICE task.

Whilst obtaining faster detection times for happy face targets in the pleasant odor condition would be consistent with the findings obtained in Leppänen and Hietanen’s (2003) work, the exact mechanism for such odor effects remains elusive. On the one hand, Leppänen and Hietanen suggest that pleasant odorants may operate in a mood-congruent manner, by activating positive emotions within the participants, which in turn facilitates access to conceptual knowledge about the target emotion (e.g., smiling faces); yet whether this cognitive facilitation is achieved independently of any emotional change within the participant remains unknown. Indeed, the majority of studies that have examined the emotion-inducing properties of different odorants have found significant changes within the participant across a variety of measures (Krippl 2003). For example, at a physiological level, the affective properties of odors have been shown to exert a direct influence on a participant’s level of autonomic nervous system activity, such that an increase in an odor’s subjective pleasantness leads to a decrease in the participant’s heart rate (Bensafi et al. 2002). Furthermore, exposure to different types of odorants, such as ylang-ylang, has been found to increase self-reported levels of calmness and reduce anxiety (Moss et al. 2008; Moss et al. 2010; Moss and Oliver 2012). Based on these observations, it is highly plausible that the facilitative effects observed in Leppänen and Hietanen’s study occurred as a consequence of a change in the emotional state of the participant. Therefore, it is important to establish whether the effects of odors on cognitive performance in the FICE task can occur independently of the emotional state of the observer. This was achieved in the current study by administering a measure of self-reported anxiety, the State-Trait Anxiety Inventory (STAI; Spielberger et al. 1983), in a pre- versus post-test design.
This type of measure has been applied effectively in previous research with FICE tasks (e.g., Damjanovic et al. 2014) and utilized in the perception of chemosensory stimuli (Zhou and Chen 2009) to assess self-reported anxiety in participants. As such, utilizing the STAI in a pre- versus post-test design allows us to assess whether the odorants in the current study influence anxiety and, if so, what the implications of such modulations are for attentional performance. Specifically, we hypothesized that pleasant odorants would reduce self-reported measures of anxiety, whilst unpleasant odorants would increase them. Furthermore, if these hypotheses were confirmed in our analyses, we would examine the extent to which such changes in self-reported anxiety mediate attentional performance in detecting happy facial expressions.

Finally, given that differences in experimental design and odor exposure intervals are both likely to influence the mood-inducing capacities of odors and other associated affective states (see Seubert et al. 2010b), a further consideration for the present study is to determine whether odors exert a specific time course on emotion perception. This is because the olfactory modality is particularly vulnerable to habituation; with repeated or prolonged exposure to an odorant stimulus, neural sensitivity is reduced, consequently reducing its saliency and priming potential (Dalton 2000). For instance, Moessnang et al. (2011) showed that participants were initially slower to locate a target shape presented on the same side as an odorant cue, an effect that disappeared over the course of the experiment. Performing a similar time course split on reaction time will help to establish to what extent the search for happy faces remains stable over the course of the experiment.

Method

Participants

A total of 54 undergraduate and postgraduate students from the University of Chester were randomly allocated in equal groups to the control (female = 13, male = 5; mean age: 23.72 years, range: 19–39 years), pleasant odor (female = 15, male = 3; mean age: 21.94 years, range: 18–40 years), and unpleasant odor conditions (female = 15, male = 3; mean age: 23.11 years, range: 18–48 years).

Ethics statement

The work with human participants complies with the Declaration of Helsinki for Medical Research involving Human Subjects. The study was also approved by the Department of Psychology Ethics Committee at the University of Chester, United Kingdom. All participants gave written informed consent and were paid £5.00 for participation. Participants self-reported that they had normal or corrected-to-normal vision and a normal sense of smell, had no nasal or food allergies, and were not experiencing any respiratory problems. Female participants who were pregnant or thought that they might be pregnant were excluded from participating in the study to minimize the risk of nausea. Once each participant’s testing date and time was confirmed, they were reminded to refrain from habits that could affect their ability to smell, such as smoking, drinking coffee, and using scented products, on the day of testing. They were also reminded of these restrictions 24 h before their day of testing.
Stimuli and apparatus

Based on previous research by Leppänen and Hietanen (2003), which utilized a design unbalanced towards positive odors, we selected strawberry (contains: aldehyde C16 (strawberry pure), methyl cinnamate, alpha-iso-methyl ionone, amyl cinnamic aldehyde), vanilla (contains: vanillin, limonene, coumarin, ethyl maltol, Tonalid), and orange zest odors (contains: linalyl acetate, citral, limonene, linalool) for the pleasant odor condition, and a fish odor (contains: pine tar oil, alpha-cedrene) for the unpleasant condition (Boesveldt et al. 2010). All odorants were manufactured and supplied by Dale Air™ in the UK. For the main experiment, the odors were supplied in aerosol form and distributed by a purpose-built dispenser, supplied by Dale Air™, positioned 2 m above floor level. The odor release mechanism was set to 20-min intervals. Cotton wads soaked in the liquid form of the odors and presented in containers were used to collect ratings of arousal and pleasantness in a separate rating study and as part of the odor selection stage of the main experiment.

Odor rating study

A separate group of 36 student participants, drawn from the same population and matched for male-female split to the sample recruited for the main experiment, were randomly allocated in equal groups to the pleasant (female = 15, male = 3; mean age: 30 years, range: 18–53 years) and unpleasant odor conditions (female = 15, male = 3; mean age: 22 years, range: 18–42 years). Each participant in the pleasant odor group was presented with 3 individual containers holding the odors and asked to rate each container on pleasantness and arousal using a 5-point scale. Thus, all participants in the pleasant odor group smelled all of the pleasant odors. As per Leppänen and Hietanen’s (2003) pleasantness ratings, participants were instructed to sniff each container and evaluate it on a 5-point Likert scale ranging from 1 (extremely unpleasant) through 3 (neutral) to 5 (extremely pleasant). Measures of arousal were obtained by adapting the instructions and response categories used by Bensafi et al. (2002, p. 705), whereby participants were instructed to “Please judge your feeling when you smelled the odorant by circling the relevant number between 1 (not at all arousing) to 3 (neutral) to 5 (extremely arousing)”. The mean values provided for each odorant on both the pleasantness and arousal measures were used to obtain a further measure of perceived intensity by conducting a series of one-sample t-tests with the scale mid-point as the hypothetical neutral value.

Measures of pleasantness

A 1-way independent groups ANOVA revealed a significant difference in pleasantness ratings between strawberry (M = 4.28, SD = 0.96), vanilla (M = 4.28, SD = 0.75), orange zest (M = 3.94, SD = 0.73), and the fish odor (M = 2.17, SD = 1.10), F (3, 68) = 22.94, MSE = 0.80, P < 0.001, ηp2 = 0.50. Planned comparison t-tests showed that whilst there were no significant differences in pleasantness ratings between strawberry, vanilla, and orange zest (P > 0.05), each pleasant odor was associated with significantly higher ratings than the fish odor; strawberry, t (34) = 6.15, P < 0.001, d = 2.05; vanilla, t (34) = 6.73, P < 0.001, d = 2.27; orange zest, t (34) = 5.73, P < 0.001, d = 1.92.
Furthermore, one-sample t-tests confirmed that both the pleasant odor ratings and the fish odor ratings differed significantly from the neutral mid-point (P < 0.001), with the pleasant odors rated significantly towards the pleasant end of the scale and the fish odor rated significantly towards the unpleasant end of the scale.

Measures of arousal

Strawberry (M = 3.44, SD = 1.20), vanilla (M = 3.11, SD = 1.28), orange zest (M = 2.94, SD = 1.47), and the fish odor (M = 2.89, SD = 1.32) did not differ significantly from each other in terms of perceived arousal, F (3, 68) = 0.64, MSE = 1.75, P = 0.590, ηp2 = 0.03, or from the neutral mid-point. Thus, the odors selected for the main experiment differed significantly in terms of their affective valence (pleasant vs. unpleasant), but were not confounded by differences in stimulus arousal.

Main experiment

Participants in the main experiment were required to rate each odor for perceived pleasantness (1 = extremely unpleasant to 5 = extremely pleasant Likert scale). Each participant in the pleasant odor group was presented with 3 individual containers holding the pleasant odors, whilst participants in the unpleasant odor condition were given the fish odor to rate. Thus, all participants in the pleasant odor group smelled all of the pleasant odors. A 1-way independent groups ANOVA revealed significant differences in pre-experimental ratings between strawberry (M = 4.44, SD = 0.78), vanilla (M = 4.17, SD = 0.62), orange zest (M = 4.11, SD = 0.76), the overall mean for the selected pleasant odor (M = 4.33, SD = 0.59), and the fish odor (M = 2.00, SD = 0.59), F (4, 85) = 41.20, MSE = 0.46, P < 0.001, ηp2 = 0.66. Mirroring the pattern of results found in the odor rating study, the pleasant odors did not differ significantly from each other (P > 0.05), but each pleasant odor was associated with significantly higher ratings than the fish odor; strawberry, t (34) = 10.55, P < 0.001, d = 3.54; vanilla, t (34) = 10.72, P < 0.001, d = 3.56; orange zest, t (34) = 9.30, P < 0.001, d = 3.10; and the overall mean for the selected (see Procedure) odor, t (34) = 11.78, P < 0.001, d = 3.95. Furthermore, the pleasant odor and the unpleasant fish odor ratings differed significantly from the neutral mid-point (P < 0.001).

To establish that these differences in pleasantness ratings between the 2 odor conditions were significant at the end of the experiment as well as at the beginning, further between-groups comparisons were conducted, as per Leppänen and Hietanen (2003), using a 20-cm visual analogue scale with the word unpleasant at the left end, neutral in the middle, and pleasant at the right, on which participants evaluated the pleasantness of the odor in the room. Responses to the odor evaluations were recorded from 0 to 10 for pleasant responses and from 0 to −10 for unpleasant responses. Participants in the pleasant odor condition rated the odor as significantly more pleasant (M = 6.81, SD = 2.12) than participants with the fish odor (M = −5.59, SD = 4.53), t (34) = 10.52, P < 0.001, d = 3.72. Both ratings differed significantly from the mid-point (i.e., neutral), as revealed by one-sample t-tests for the fish odor, t (17) = −5.23, P < 0.001, d = 1.23, and the selection of pleasant odors, t (17) = 13.64, P < 0.001, d = 3.21, respectively. Thus, the unpleasant and pleasant evaluations associated with the fish and pleasant odors at the start of the experiment were maintained towards the end of the experiment.
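For readers wishing to reproduce this style of rating analysis, the following is a minimal sketch in Python. The data file and column names are hypothetical, not part of the published materials; it simply illustrates the 1-way ANOVA across odors and the one-sample t-tests against the scale mid-point described above.

```python
# Minimal sketch of the odor-rating analyses, assuming a hypothetical
# long-format file with columns 'odor' and 'pleasantness' (1-5 scale).
import pandas as pd
from scipy import stats

ratings = pd.read_csv("odor_ratings.csv")  # hypothetical file name

# 1-way independent groups ANOVA across the 4 odors
groups = [g["pleasantness"].to_numpy() for _, g in ratings.groupby("odor")]
f_val, p_val = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_val:.2f}, P = {p_val:.3f}")

# One-sample t-tests against the neutral mid-point of the 5-point scale
MIDPOINT = 3
for odor, g in ratings.groupby("odor"):
    t, p = stats.ttest_1samp(g["pleasantness"], MIDPOINT)
    print(f"{odor}: t = {t:.2f}, P = {p:.3f}")
```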
Facial expression stimuli

Four angry (E1–E4), 4 happy (E33–E36), and 8 neutral (N6, N8, N11, N13, N17, N22, N26, N27) faces were selected from the Caucasian set of Matsumoto and Ekman’s (1988) database. Each color image was converted to grayscale in Adobe Photoshop, and an oval template (125 pixels wide by 168 pixels high) was applied to remove external features (e.g., hair, ears, neckline). Mean luminance and contrast were matched for all faces such that each face generated an intensity value of 190. Stimulus presentation and data recording were controlled by SuperLab 4.0 running on a Mac G4 OSX computer.

Design

The happiness superiority effect was measured using reaction times (RTs), recorded from the onset of each visual search display to the participant’s response, and error rates on “different” display trials (Damjanovic et al. 2010). Participants were randomly allocated to 1 of 3 groups: control, pleasant, or unpleasant. Type of target (angry and happy) and type of distracter (neutral and emotional) were administered as repeated measures variables. The anxiety-inducing properties of the odors were assessed by comparing state anxiety scores pre and post odor exposure. Participants in the control condition were also required to provide self-report measures of their state anxiety, once before completing the visual search task and immediately after its completion.

Procedure

The procedure involved several measures administered in the following order: rating of the odor(s), state anxiety, visual search task, rating of the odor, and state anxiety. The odor rating measures were not applicable to participants in the control (i.e., no odor) condition. For participants in the pleasant odor condition, the odor that they rated the highest for pleasantness was selected for the visual search task, whereas for participants in the unpleasant odor condition it was the fish odor. Thus, participants in the odor conditions were exposed to one odorant for the visual search task. Participants then completed the state (S) component of the STAI (Spielberger et al. 1983) as an index of their baseline anxiety.

The visual search task was taken from Experiment 1 in Damjanovic et al.’s (2010) study. Briefly, this consisted of same display trials of 4 different individuals displaying the same emotional expression (i.e., all angry, all happy, or all neutral) and 4 types of different display trials: 1 angry, 3 neutral; 1 angry, 3 happy; 1 happy, 3 neutral; and 1 happy, 3 angry. The visual search experiment consisted of 96 same-display trials (32 angry, 32 happy, 32 neutral expressions) presented randomly with 128 different-display trials (32 in each of the 4 conditions). A fully counterbalanced design in which each poser provides each expression was not possible to implement in the current study because each poser contributed only one facial expression of emotion and one neutral expression to the database (Matsumoto and Ekman 1988; Matsumoto 2002). Each trial began with a fixation cross in the centre of the screen for 500 ms, followed by a display of 4 faces surrounding the central fixation point for 800 ms. The 4 faces were arranged in an imaginary circle, occupying top, right, bottom, and left locations on the computer screen, with a fixation cross at the centre, viewed at a distance of 60 cm. Each face subtended a visual angle of 3.1° horizontally by 4.1° vertically. The centre of each face was 6.2° of visual angle from fixation. The inter-trial interval was set to 2000 ms.
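To make the trial structure concrete, the sketch below builds the 224-trial sequence with the timing constants reported above. It is an illustration only; the original experiment was implemented in SuperLab 4.0, and the dictionary keys and condition labels are our own.

```python
# Illustrative reconstruction of the trial list (not the original SuperLab
# script). Timings are taken from the text; all labels are assumptions.
import random

FIXATION_MS, DISPLAY_MS, ITI_MS = 500, 800, 2000  # fixation, search display, inter-trial interval

# 96 "same" display trials: 32 each of all-angry, all-happy, all-neutral crowds
same_trials = [{"display": "same", "emotion": emotion}
               for emotion in ("angry", "happy", "neutral") for _ in range(32)]

# 128 "different" display trials: 32 per target/distracter pairing
pairings = [("angry", "neutral"), ("angry", "happy"),
            ("happy", "neutral"), ("happy", "angry")]
different_trials = [{"display": "different", "target": t, "distracter": d}
                    for t, d in pairings for _ in range(32)]

trials = same_trials + different_trials
random.shuffle(trials)  # same and different trials presented in random order
assert len(trials) == 224
```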
Participants were instructed to respond as quickly and as accurately as possible whether the 4 faces in the display showed the “same” emotion or whether one was “different” in emotion from the remaining 3 faces by pressing the “x” or “.” key on the keyboard. Response mapping was reversed for half the participants, and feedback in the form of a 1000 ms beep was provided on incorrect trials. Although participants performed the visual search task without a break, our previous work with these tasks has indicated that this does not necessarily induce severe fatigue effects. After the visual search task, participants in the experimental groups used a 20-cm visual analogue scale with the word unpleasant at the left end, neutral in the middle, and pleasant at the right to evaluate the pleasantness of the odor in the room as a post-experiment rating measure. This change in rating method from a 5-point Likert scale to a visual analogue scale follows similar procedural approaches (e.g., Leppänen and Hietanen 2003) and was implemented in the current study to minimize the impact of participants’ response styles on their odor ratings. Finally, participants were provided with the STAI (S) component to complete. Participants were required to complete the STAI (S) after completing the odor rating in order to replicate the administration of the rating scales used in Leppänen and Hietanen’s (2003) procedure. To assess whether asking participants to give a positive or negative rating for an environmental factor might subsequently raise awareness of this factor and influence their STAI (S) scores, we correlated participants’ pleasantness ratings with their post-experiment STAI (S) scores. The relationship between the 2 measures was weak and non-significant, r = 0.07, P = 0.689, providing little evidence to suggest that rating an odorant’s pleasantness is significantly associated with self-reported ratings of state anxiety. Once the STAI was completed, participants were debriefed and thanked for their time.

Results

Anxiety-inducing properties of odors

To test our specific hypotheses that the pleasant odorant would produce a decrease in self-reported anxiety, that the unpleasant odorant would produce an increase, and that the control condition would show no significant change, a 3 (group: control, pleasant, or unpleasant) × 2 (time: before and after) mixed ANOVA with repeated measures on the last factor was applied to the participants’ STAI-S scores (see Table 1). There was no significant effect of group, F (2, 51) = 0.04, MSE = 132.02, P = 0.962, ηp2 = 0.00, or time, F (1, 51) = 0.02, MSE = 26.77, P = 0.882, ηp2 = 0.00. However, the group × time interaction was significant, F (2, 51) = 7.05, MSE = 26.77, P = 0.002, ηp2 = 0.22. There were no significant group differences in state anxiety at baseline, F (2, 102) = 1.35, MSE = 79.39, P = 0.265, ηp2 = 0.03, or post-test, F (2, 102) = 1.10, MSE = 79.39, P = 0.339, ηp2 = 0.02. However, self-reported anxiety levels changed within each odor group, such that towards the end of the experiment, anxiety levels significantly decreased in participants exposed to the pleasant odors, F (1, 51) = 8.59, MSE = 26.77, P = 0.005, ηp2 = 0.14, but increased for participants exposed to the unpleasant odor, F (1, 51) = 4.25, MSE = 26.77, P = 0.044, ηp2 = 0.08. Pre versus post-test changes in anxiety did not differ significantly for participants in the control group, F (1, 51) = 1.27, P = 0.264, ηp2 = 0.02.

Table 1. Mean STAI (S) pre- and post-test measures as a function of odor context.
                Control         Pleasant        Unpleasant
Measure         Mean    (SE)    Mean    (SE)    Mean    (SE)
Pre-STAI (S)    33.56   (2.01)  36.78   (2.58)  32.00   (2.02)
Post-STAI (S)   35.50   (2.12)  31.72   (1.57)  35.56   (2.17)

Standard errors are presented in parentheses.

Visual search performance

As per Damjanovic et al. (2010), only performance on discrepant trials was examined. Reaction times (RTs) for correct responses on different display trials were filtered (<100 ms or >2000 ms) prior to analysis. To test the hypothesis that the underlying mechanism of the happiness superiority effect is a perceptual one, a 3 (group: control, pleasant, or unpleasant) × 2 (target: angry and happy) × 2 (distracter: emotional and neutral) mixed ANOVA with repeated measures on the last 2 factors was conducted. The main effect of group was not significant, F (2, 51) = 0.40, MSE = 196657.53, P = 0.676, ηp2 = 0.02. The initial results replicated a happiness superiority effect, F (1, 51) = 30.46, MSE = 3871.58, P < 0.001, ηp2 = 0.37, and participants were faster to detect a target when it was surrounded by emotional than neutral distracter faces, F (1, 51) = 73.58, MSE = 3974.86, P < 0.001, ηp2 = 0.59. Type of target and type of distracter interacted significantly with each other, F (1, 51) = 61.48, MSE = 2881.73, P < 0.001, ηp2 = 0.55, with the happiness superiority effect occurring with neutral distracters, F (1, 102) = 86.51, MSE = 3376.66, P < 0.001, ηp2 = 0.46, but not with emotional distracters, F (1, 102) = 0.89, MSE = 3376.66, P = 0.348, ηp2 = 0.01. Angry face targets were found faster overall when they were surrounded by emotional distracters (i.e., happy faces) than when surrounded by neutral distracters, F (1, 102) = 134.90, MSE = 3428.29, P < 0.001, ηp2 = 0.60, whereas overall response times for happy face targets were equivalent across emotional and neutral distracters, F (1, 102) = 2.10, MSE = 3428.29, P = 0.151, ηp2 = 0.02. The effect of target interacted significantly with group, F (2, 51) = 4.58, MSE = 3871.58, P = 0.015, ηp2 = 0.15, producing the happiness superiority effect in the control, F (1, 51) = 24.83, MSE = 3871.58, P < 0.001, ηp2 = 0.33 (see Figure 1A), and unpleasant groups, F (1, 51) = 14.11, MSE = 3871.58, P < 0.001, ηp2 = 0.22 (see Figure 1C), but it was eliminated in the pleasant group, F (1, 51) = 0.67, MSE = 3871.58, P = 0.416, ηp2 = 0.01 (see Figure 1B). The 3-way interaction between group × target × distracter did not reach significance, F (2, 51) = 0.45, MSE = 2881.73, P = 0.644, ηp2 = 0.02.

Figure 1. Visual search data for different displays. The left panels show mean reaction time and the right panels show mean error rates to detect the angry and happy facial expression targets against emotional (E) and neutral (N) distracters for the control (A), pleasant odor (B), and unpleasant odor (C) conditions. Error bars correspond to the standard errors of the mean of each condition individually.
Analysis of error rates revealed significantly lower error rates for happy targets compared with angry targets, F (1, 51) = 167.60, MSE = 118.18, P < 0.001, ηp2 = 0.77 (see Figure 1), and for emotional distracters compared with neutral ones, F (1, 51) = 135.33, MSE = 103.66, P < 0.001, ηp2 = 0.73. A significant target × distracter interaction, F (1, 51) = 86.73, MSE = 132.72, P < 0.001, ηp2 = 0.63, revealed lower error rates for happy targets with both emotional distracters, F (1, 102) = 245.18, MSE = 125.45, P < 0.001, ηp2 = 0.71, and neutral distracters, F (1, 102) = 4.46, MSE = 125.45, P = 0.037, ηp2 = 0.04. For angry targets, detection accuracy was considerably better with emotional distracters, F (1, 102) = 215.56, MSE = 118.19, P < 0.001, ηp2 = 0.68, whereas there was no significant effect of surrounding distracter context on error rates for happy targets, F (1, 102) = 0.53, MSE = 118.19, P = 0.470, ηp2 = 0.01. The 3-way interaction between group × target × distracter did not reach significance, F (2, 51) = 0.61, MSE = 132.72, P = 0.549, ηp2 = 0.02. These results show that whilst there is an overall search advantage favoring happy facial expressions, this advantage is modulated by affectively valenced environmental cues. Furthermore, the presence of the group × target interaction on response times indicates that such cues exert a stronger effect on processing speed than on accuracy. The emotionality of the competing distracter faces produces similar effects on both search time and accuracy measures.

Stability of the search advantage for happy faces

Habituation effects were investigated by computing, for each participant and each distracter context, a happiness superiority index (HSI): the mean RT for all angry targets minus the mean RT for all happy targets, calculated over the first and last 25% of trials for each participant, with a positive value indicating faster detection times for happy faces. A 3 (group: control, pleasant, or unpleasant) × 2 (distracter: emotional and neutral) × 2 (phase: first quarter and last quarter) mixed ANOVA with repeated measures on the last 2 factors (see Figure 2) revealed no significant effects of group, F (2, 51) = 0.15, MSE = 32876.65, P = 0.859, ηp2 = 0.01, or phase, F (1, 51) = 0.10, MSE = 24868.75, P = 0.752, ηp2 = 0.00. Greater levels of happiness superiority were observed with neutral relative to emotional distracters, F (1, 51) = 53.26, MSE = 20381.67, P < 0.001, ηp2 = 0.51 (see Figure 2).

Figure 2. Superiority index for happiness for the first and last 25% of search trials in milliseconds (ms) with emotional and neutral distracters across the 3 experimental contexts. A positive score indicates faster detection of happy over angry face targets on different display trials. Error bars correspond to the standard errors of the mean of each condition individually.
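As an illustration of the two analysis steps just described, the following sketch shows the RT filter and the HSI computation on hypothetical trial-level data. The DataFrame columns are assumed names, and for brevity the sketch collapses over distracter context, which the reported analysis kept separate.

```python
# Minimal sketch (assumed column names) of the RT filter and the happiness
# superiority index: mean RT to angry targets minus mean RT to happy targets,
# within the first and last 25% of each participant's different-display trials.
import pandas as pd

df = pd.read_csv("search_trials.csv")  # hypothetical trial-level data file
df = df[df["correct"] & (df["rt"] > 100) & (df["rt"] < 2000)]  # filter from the text

def hsi(trials: pd.DataFrame) -> float:
    """Positive values indicate faster detection of happy targets."""
    return (trials.loc[trials["target"] == "angry", "rt"].mean()
            - trials.loc[trials["target"] == "happy", "rt"].mean())

rows = []
for pid, p in df[df["display"] == "different"].groupby("participant"):
    p = p.sort_values("trial_number")
    quarter = len(p) // 4
    rows.append({"participant": pid,
                 "hsi_first": hsi(p.head(quarter)),  # first 25% of trials
                 "hsi_last": hsi(p.tail(quarter))})  # last 25% of trials
hsi_df = pd.DataFrame(rows)
```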
The only significant higher order interaction to emerge from the analyses was group × phase, F (2, 51) = 6.05, MSE = 24868.75, P = 0.004, ηp2 = 0.19, with simple main effects revealing higher levels of happiness superiority in the last quarter of the experiment, F (2, 102) = 3.62, MSE = 28872.02, P = 0.030, ηp2 = 0.07, an effect which was limited to the unpleasant odor group (Tukey P < 0.05). For participants in the pleasant odor group, the magnitude of the happiness superiority effect was stronger at the start of the visual search task than towards the end, F (1, 51) = 4.44, MSE = 24868.75, P = 0.040, ηp2 = 0.08. This pattern was reversed for participants in the unpleasant group, F (1, 51) = 7.74, MSE = 24868.75, P = 0.010, ηp2 = 0.13. The happiness superiority effect did not differ between the start and end of the visual search task for participants in the control group, F (1, 51) = 0.02, MSE = 24868.75, P = 0.901, ηp2 = 0.00.

The role of self-reported anxiety in mediating the impact of scent pleasantness on the happiness superiority effect

Given that our hypotheses relating to the anxiety-modulating properties of the odorants were supported, the following analyses investigated whether the significant changes in self-reported anxiety in the 2 odor groups (see Table 1) could affect the HSI at the start and towards the end of the visual search task (see Figure 2). To achieve this, a change in state anxiety variable was computed (STAI-CHANGE: STAI-Safter − STAI-Sbefore), with a negative value indicating a decrease in anxiety and a positive value indicating an increase in anxiety as a function of odor exposure. Focusing on the significant effects found with neutral distracters, odor type (coded: 0 = pleasant, 1 = unpleasant) correlated positively with changes in self-reported anxiety, rpb = 0.54, P < 0.001, such that the unpleasant odor was strongly associated with increased levels of self-reported anxiety. Furthermore, odor type showed a marginal relationship with the HSI at the start of the experiment, with a greater HSI for pleasant than unpleasant odors, rpb = −0.31, P = 0.065. The relationship between changes in anxiety and the HSI was negligible, r = −0.02, P = 0.913. In contrast to the patterns observed at the start, as participants approached the end of the visual search task, odor type was found to correlate significantly with the HSI, such that the unpleasant odor was moderately associated with higher levels of happiness detection, rpb = 0.37, P = 0.026. The relationship between changes in anxiety and HSI performance was weak and not significant, rpb = 0.28, P = 0.104.

Given the small sample sizes (Shrout and Bolger 2002; Preacher and Hayes 2008), 2 separate bootstrapped hierarchical regressions on search times at the start and at the end of the visual search task were performed to test the degree to which odor type and changes in state anxiety could predict the magnitude of the HSI. These results are summarized in Table 2. At the start of the experiment, type of odor (block 1) accounted for 9.7% of the variation in detecting happy faces, F (1, 34) = 3.64, MSE = 21702.49, P = 0.065. The inclusion of changes in state anxiety in block 2 accounted for a further 3.1%, but this did not significantly improve the ability of the model to predict happiness detection performance, F (2, 33) = 2.41, MSE = 21594.90, P = 0.105.
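The bootstrapped regression approach can be sketched as follows, here with statsmodels and synthetic stand-in data. The variable names, the simulated values, and the use of ordinary least squares with case resampling are assumptions for illustration, not the authors' original analysis code.

```python
# Sketch of a bootstrapped hierarchical regression (case resampling, 1000
# samples) on synthetic stand-in data; not the authors' original code.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Hypothetical data for the 36 odor-group participants
odor_type = rng.integers(0, 2, 36).astype(float)           # 0 = pleasant, 1 = unpleasant
stai_change = rng.normal(0, 5, 36) + 4 * odor_type         # STAI-S after minus before
hsi_first = 160 - 90 * odor_type + rng.normal(0, 145, 36)  # HSI, first quarter (ms)

def bootstrap_ols(X, y, n_boot=1000):
    """Return bootstrapped unstandardized coefficients (one row per sample)."""
    X = sm.add_constant(X)
    n = len(y)
    coefs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)  # resample participants with replacement
        coefs.append(sm.OLS(y[idx], X[idx]).fit().params)
    return np.asarray(coefs)

block1 = bootstrap_ols(odor_type[:, None], hsi_first)                         # odor type only
block2 = bootstrap_ols(np.column_stack([odor_type, stai_change]), hsi_first)  # + STAI-CHANGE
print("Block 2 bootstrap SEs (constant, odor type, STAI-CHANGE):", block2.std(axis=0))
```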
Inspecting the bootstrapped unstandardized b-values, odor type in block 1 approached significance (P = 0.058), but gained full predictive value when included with changes in state anxiety in block 2 (P = 0.027), whilst changes in state anxiety failed to gain any significant value in predicting happiness detection (P = 0.257). The configuration of these effects is consistent with classical suppression found in regression analyses (e.g., Horst 1941; McFatter 1979; Paulhus et al. 2004; MacKinnon et al. 2007) and demonstrates that knowing how a participant responds emotionally to the odorant during initial stages of exposure significantly improves the prediction of their detection of emotional facial cues. In the case of the current study, reductions in anxiety created by exposure to pleasant odors facilitated the detection of happy face targets.

Table 2. Bootstrapped hierarchical mediation models of the effect of odor type on the happiness superiority effect (block 1) as mediated through changes in self-reported state anxiety (block 2) for the first quarter and last quarter of the visual search trials.

                First quarter         Last quarter
                B          SE B       B          SE B
Block 1
 Constant       164.31     39.22      65.15      37.88
 Odor type      −93.64     47.87      122.34*    53.08
Block 2
 Constant       183.39     41.15      75.77      44.20
 Odor type      −127.40*   51.10      103.55     67.96
 STAI-CHANGE    3.77       3.59       2.10       3.61

Note: Estimates are unstandardized; odor type is coded 0 = pleasant, 1 = unpleasant; 1000 bootstrap samples; *P < 0.05.

In a second hierarchical regression conducted on performance towards the end of the experiment, type of odor (block 1) significantly predicted happiness detection, accounting for 13.8% of the variance in performance, F (1, 34) = 5.44, MSE = 24754.00, P = 0.026, but adding change in anxiety scores (block 2) increased the variance accounted for by only a non-significant 0.8%, F (2, 33) = 2.82, MSE = 25267.18, P = 0.074. Inspecting the bootstrapped unstandardized b-values, odor type in block 1 significantly predicted the detection of happiness, such that the presentation of the unpleasant odor improved the magnitude of the HSI (P = 0.028). However, when changes in state anxiety were controlled for (block 2), neither odor type (P = 0.14) nor anxiety change (P = 0.531) significantly predicted HSI performance. In this instance, adding changes in anxiety to the model resulted in a redundancy situation (Paulhus et al. 2004), accounting for less than 1% of performance. Thus, towards the end of the experiment, the detection of happy faces was driven by the odorant itself, independently of any emotional changes occurring within the participant as a result of the odorant prime. In this instance, the role of the odorant prime was reversed, such that participants who were exposed to the unpleasant odor showed improved detection of happy faces relative to participants who were exposed to the pleasant odors.

Discussion

In the present study, we investigated the influence of olfactory environmental context on the perception of facial expressions of emotion conveying happiness and anger.
Whilst some of the earlier work in this area appears to suggest category-specific facilitative effects of pleasant odorant primes on the processing of happy facial expressions (e.g., Leppänen and Hietanen 2003), very little is known about the generalizability of these findings under more complex visual processing tasks or about the possible underlying mechanism that may support the cross-modal integration of affective cues. To address these issues, we used the FICE task to examine whether the concurrent presentation of pleasant and unpleasant odorant cues affects the spatial distribution of attentional resources towards happy face targets, and we also compared self-reported measures of anxiety to evaluate the extent to which these odors might alter the emotional state of the participant. In the control condition, participants were significantly faster and less error prone in detecting a discrepant happy face target in a crowd of competing distracter faces, a finding which is consistent with earlier work with this particular version of the FICE task (Damjanovic et al. 2010). However, whilst this overall search advantage for happy faces was observed in participants exposed to the unpleasant odor, it was abolished for participants exposed to the pleasant odors.

An initial consideration of these patterns may at first appear difficult to reconcile with the category-specific facilitative effects reported in Leppänen and Hietanen’s emotion categorization study. However, further analyses taking on board the recommendations made in Moessnang et al.’s (2011) work reveal 2 important characteristics of the time course of odor effects on the perception of happy facial expressions: (i) pleasant odors facilitate the detection of happiness, but the benefits are short-lived, and (ii) unpleasant odors help in the detection of happy faces, but only towards the end of the visual search task. The reversal of such phasic optimization effects across the 2 different odorants could be accounted for in terms of the emotional state of the participant (Leppänen and Hietanen 2003). The significant differences between the pleasantness ratings for the odors were maintained at the start of the experiment as well as towards the end, yet the odors were matched for overall arousal, thus ruling out arousal-based differences between the 2 experimental contexts. Furthermore, the odors produced differential effects on participants’ levels of self-reported anxiety as measured by the STAI, such that individuals who were exposed to the pleasant odor showed a reduction in their overall anxiety, whilst participants exposed to the unpleasant odor demonstrated an increase in their anxiety.

Previous considerations of the participant’s emotional state in facial expression processing have either been made indirectly, in the form of potential mood congruency effects as per Leppänen and Hietanen’s (2003) work, or directly, by collecting ratings from participants about the extent of their current experiences of happiness and sadness (e.g., Niedenthal et al. 2000). For example, mood induction techniques leading to higher levels of self-reported happiness or sadness resulted in participants perceiving the mood-congruent facial expressions for a longer time than control participants. Furthermore, work identifying the beneficial effects of certain odorants has predominantly focused on their anxiety-reducing capacities, rather than specifically identifying whether they improve an individual’s own happiness (e.g., Moss et al.
2008; Zhou and Chen 2009; Moss et al. 2010; Moss and Oliver 2012). In line with this work, emotional change in this study was operationalized in terms of changes in self-reported state anxiety using the STAI, and whilst a reduction in anxiety would be viewed as a positive feature of the odorant, it is not possible to establish from the present data the extent to which pleasant odorants directly improved the participant’s own levels of happiness, nor any subsequent role this may have had in detecting happiness in the FICE task. At the start of the visual search task, the anxiety-modulating capacities of the odorants allowed their priming capacities to take effect, such that participants performing the task with the concurrent presentation of a pleasant odorant, where overall levels of anxiety were reduced, showed enhanced detection of happy face targets relative to participants exposed to the unpleasant odorant. This particular pattern extends Leppänen and Hietanen’s work by demonstrating how positive changes within the participant, indexed in the current study as reductions in self-reported anxiety, can facilitate the perception of happy facial expressions during more complex attentional tasks. However, the benefits of this reduction in anxiety for the perception of happiness are short-lived; towards the end of the visual search task, the anxiety-modulating capacities of odors become redundant in predicting the detection of happy targets. Surprisingly, unpleasant odors rather than pleasant ones then facilitate the detection of happy faces. We argue that the unpleasant odor in the latter stage of the search process may serve to create a “pop-out” environmental context for the participants, directing their attention to environmentally incongruent emotional information (i.e., happy faces) as they engage in avoidance-based strategies in response to inhaling the unpleasant odor (e.g., Fannes et al. 2008; Boesveldt et al. 2010). Thus, the increase in the happiness superiority effect towards the end of the experiment, and its overall preservation in the unpleasant condition, may result from successful negative affect repair processes that offset this increase in anxiety and any threat-related cognitive biases associated with it (Byrne and Eysenck 1995). The fact that the unpleasant odor plays a more salient role towards the end of the experiment may reflect differences in habituation patterns between positive and negative odorant cues, with unpleasant odors taking considerably more time to habituate, especially in experimental designs in which participants are explicitly primed by rating an odorant for its level of unpleasantness (Dalton 1996; see also Seubert et al. 2014).

Current explanations of the happiness superiority effect focus on the role of low-level perceptual advantages afforded by the ‘toothy grin’ in happy facial expressions (e.g., Calvo and Nummenmaa 2008; Calvo and Marrero 2009; Horstmann et al. 2012). In the current study, the facial expressions used to measure the happiness superiority effect were taken from a database in which the happy face exemplars had visually salient smiles, whilst the angry face exemplars lacked an equivalently salient ‘toothy snarl’ feature (Lipp et al. 2009; Horstmann et al. 2012).
As such, angry faces would have shared a greater degree of perceptual overlap with neutral faces, which also included this closed-mouth feature, manifesting in increased error rates and response times for angry target-neutral distracter crowds across the different conditions. The perceptual disadvantage of angry targets relative to happy targets was reduced when the surrounding crowd consisted of happy distracters; in these conditions, search performance was comparable to the happy target-angry crowd search conditions, thus resulting in a happiness superiority effect that was only found in the neutral distracter context. In some instances, such interactions between target and distracters point towards the involvement of an attentional disengagement mechanism, whereby response times to detect happy targets are delayed when the targets are surrounded by angry rather than neutral distracter faces (e.g., Fox et al. 2000; Fox et al. 2001; Fox et al. 2002; Damjanovic et al. 2014). However, inferential analyses do not indicate that such a mechanism was involved in the current visual search task, as participants' response times for happy target detection did not differ significantly between the angry and neutral distracter crowds. Whilst the stability of the happiness superiority effect in the control condition would have been compatible with such perceptual-based accounts of emotion detection, the time course analysis in the odor groups indicates that affective factors play a much more important role (e.g., Becker et al. 2011). Our early socialization experiences (Bond and Siddle 1996; Kotsoni et al. 2001) may help to reinforce an expectancy bias towards positive stimuli (Cacioppo and Gardner 1999; Cacioppo et al. 1999; Chipchase and Chapman 2007), yet the phasic characteristics of participants' visual search times for happy facial expression targets reveal how easily this bias can be reset. We propose that the emotional state of the participant plays an important role in the perception of facial expressions of happiness (e.g., Niedenthal et al. 2000; Leppänen and Hietanen 2003; Becker et al. 2011), supporting the cross-modal interaction of affective cues in a time-dependent manner (Walla 2008). Phasic analyses such as the ones performed in this study not only serve to highlight the complexity of such cross-modal interactions, but may also help pave the way for a better understanding of how affective and perceptual accounts of emotion detection can be disentangled. Furthermore, such phasic analyses prove to be particularly helpful in reconciling differences between studies that would otherwise have been masked if the analyses and interpretation of olfactory-visual processing focused exclusively on overall reaction time performance (Moessnang et al. 2011). Future efforts to validate affective accounts of the happiness superiority effect may attempt to increase the arousal value of unpleasant odorants, for example, by using human chemosignals (e.g., Lundström et al. 2008; Zhou and Chen 2009), to establish whether they can open up the prioritization of threat-specific cues. Thus, when environmental contexts differ not only in their pleasantness value, but also in terms of their heightened arousal levels, participants may then revert to searching for angry facial expressions instead (Becker et al. 2011; Chipchase and Chapman 2013; Lundqvist et al. 2013). 
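To make the phasic analysis referred to above concrete, the following sketch illustrates one way a time course split of visual search performance could be computed. It is a minimal Python illustration: the column names (subject, trial, target, rt) and the 2-epoch split are assumptions for exposition, not the authors' actual analysis code.

```python
import pandas as pd

def happiness_advantage_by_epoch(trials: pd.DataFrame, n_epochs: int = 2) -> pd.DataFrame:
    """Split each participant's trials into consecutive epochs and compute the
    happy-vs-angry target RT advantage within each epoch. Assumes one row per
    correct 'different'-display trial with hypothetical columns: subject,
    trial (presentation order), target ('happy' or 'angry'), and rt (ms)."""
    trials = trials.sort_values(["subject", "trial"]).copy()
    # Assign each trial to an epoch, e.g., first vs. second half of the task.
    trials["epoch"] = trials.groupby("subject")["trial"].transform(
        lambda t: pd.qcut(t, q=n_epochs, labels=list(range(1, n_epochs + 1)))
    )
    # Mean RT per subject, epoch, and target emotion.
    cells = (trials.groupby(["subject", "epoch", "target"], observed=True)["rt"]
                   .mean()
                   .unstack("target"))
    # Positive values indicate faster detection of happy than of angry targets.
    cells["happy_advantage_ms"] = cells["angry"] - cells["happy"]
    return cells.reset_index()
```

Comparing the sign and size of this advantage across epochs and odor groups is the logic behind the phasic pattern described above.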
Systematic manipulations of an odor's pleasantness and arousal values along these lines would also have important implications for our understanding of anxiety-based models of attention (e.g., Eysenck 1992; Eysenck et al. 2007), and for how such models may differ functionally from a general expectancy bias favoring positive information (e.g., Diener and Diener 1996; Fox 2013). Indeed, it is worth noting that the mean STAI (S) values for all 3 groups fell within the low-anxiety range reported in visual search studies of this kind. For instance, some studies using a split-groups design pre-select individuals scoring high in trait anxiety (>48), and have identified this anxiety component as playing a more important role than high levels of state anxiety in facilitating the detection of angry facial expressions (e.g., Byrne and Eysenck 1995). Other studies have pre-screened participants on the basis of their state anxiety scores, allocating scores above 40 on the STAI to the high anxious group and scores below 35 to the low anxious group, and have found this component of anxiety to play a stronger role in disrupting threat disengagement attentional processes (e.g., Fox et al. 2001). Thus, future studies should systematically consider both intrinsic and extrinsic facets of anxiety and their emotional weighting in terms of the attentional capture of angry and happy facial expressions. Our work indicates that the anxiety-modulating capacities of pleasant and unpleasant odors may serve as a useful tool towards achieving this aim in sub-clinical populations (Krippl 2003). Beyond the differences in attentional processing demands between the current study and the work by Leppänen and Hietanen, other aspects of our methodology may explain why the facilitative effects of the pleasant odor condition on the detection of happiness did not materialize in the initial analysis of overall reaction time performance. For example, the correspondence between the pleasantness value of the olfactory cues and the facial stimulus may have been more strongly primed in Leppänen and Hietanen's study than in the current one: our participants were only required to make "same" versus "different" judgments on each experimental trial, whereas Leppänen and Hietanen's instructions required participants to categorize each face using emotion-labelled response keys. Indeed, providing participants with emotion labels during a categorization task can also influence the percentage of intrusion errors, that is, the false categorization of emotional expressions that were not presented in the target face, such as perceiving sadness in an ambiguous neutral-disgust face (Leleu et al. 2015). Thus, the presence of verbal cues may have primed the cognitive system to search for a restricted set of emotion categories, amplifying the perception of salient congruent expressions such as happiness, whilst blurring the boundaries between less salient expressions such as disgust (Leleu et al. 2015). Furthermore, unlike Leppänen and Hietanen's categorization task, the participants in the current study performed over 200 visual search trials without a break. Whilst this could in principle have induced fatigue effects, no phasic effects were observed in the control condition, indicating that fatigue was not a particular cause for concern in the current study. 
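For illustration, the anxiety-screening conventions cited above can be expressed as a simple classification rule. This is a sketch only, assuming the cutoffs reported by Byrne and Eysenck (1995) and Fox et al. (2001); the present study did not pre-select participants in this way.

```python
def classify_anxiety(trait_score: int, state_score: int) -> dict:
    """Illustrative grouping rules based on the cutoffs cited above: trait
    anxiety above 48 (cf. Byrne and Eysenck 1995), and state anxiety above 40
    for the high anxious group or below 35 for the low anxious group
    (cf. Fox et al. 2001). Hypothetical helper, not the study's procedure."""
    trait_group = "high" if trait_score > 48 else "unselected"
    if state_score > 40:
        state_group = "high"
    elif state_score < 35:
        state_group = "low"
    else:
        state_group = "intermediate"  # typically excluded from split-group designs
    return {"trait": trait_group, "state": state_group}

print(classify_anxiety(trait_score=50, state_score=33))
# {'trait': 'high', 'state': 'low'}
```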
Nevertheless, on the issue of breaks, the inclusion of a break in Leppänen and Hietanen's study may have helped to increase the saliency of the pleasant odorant prime and its association with the happy facial expression in the emotion categorization task, thus resulting in the overall category-specific facilitative effects found. In interpreting these results, some limitations must be considered. First, rather than relying exclusively on self-report measures to establish functional olfaction, such measures should be combined with a screening test such as the Sniffin' Sticks battery (Hummel et al. 2001) to ensure all participants could perceive the applied odors at normative levels. Furthermore, tighter control of adaptation mechanisms would help to provide a more informative picture of odorant-specific reductions in sensitivity and perceived intensity during odor exposure over the course of the experiment and their impact on cognitive performance (Ekman et al. 1967; Dalton and Wysocki 1996). The addition of post-experiment interviews to assess each participant's awareness of the odors in their testing environment would also provide fruitful insight into the perceived intensity of each odorant in future research designs. For example, adding questions about the explicit ability to perceive the odor in the room would help determine whether the contextual effects of odors on face detection require explicit awareness or can occur implicitly. Second, alternative self-report measures such as the Positive and Negative Affect Schedule (PANAS; Watson et al. 1988) would provide a more detailed account of the types of emotions experienced by participants in response to pleasant and unpleasant odor contexts, above and beyond their anxiety-modulating capacities, and would allow the category-specific basis of odor-emotion perception interactions to be assessed more fully. Notwithstanding these limitations, our study shows that affective factors, in the form of changes in the emotional state of the participant, play a more significant role in facilitating the detection of target facial expressions of emotion than the perceptual salience of the face's features (Juth et al. 2005; Calvo and Nummenmaa 2008; Calvo and Marrero 2009; Becker et al. 2011). Indeed, contextual factors are part of the multisensory nature of our emotional interactions with others, and the dynamic nature of the emotional state of the participant needs to be given a more active role in future research on attentional modulation, rather than limiting such investigations to an individual differences framework (Frischen et al. 2008). Along with music induction experiments (e.g., Rowe et al. 2007; Garon et al. 2012), our study reveals how odors may provide another useful tool for researchers to examine the role of affective factors in visual search tasks with emotionally salient stimuli. Funding This work was supported by the Departmental Seed Corn Fund and the Research Development Fund awarded to the first and second authors from the Research and Knowledge Transfer Office at the University of Chester. The research of L Damjanovic has been partly supported by the British Academy, funded under the Skills Acquisition Award scheme SQ130002. Acknowledgements We are grateful to an anonymous reviewer for recommending the mediation analysis during an earlier review of this manuscript. 
The authors wish to thank Panos Athanasopoulos for helpful discussions on earlier versions of the manuscript and Sam Roberts for advice on mediation analyses. We also wish to thank Bryan Hiller and Rhian McHugh for their assistance with data collection.

References

Adolphs R. 2002. Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav Cogn Neurosci Rev. 1:21–62.
Becker DV, Anderson US, Mortensen CR, Neufeld SL, Neel R. 2011. The face in the crowd effect unconfounded: happy faces, not angry faces, are more efficiently detected in single- and multiple-target visual search tasks. J Exp Psychol Gen. 140:637–659.
Becker DV, Neel R, Srinivasan N, Neufeld S, Kumar D, Fouse S. 2012. The vividness of happiness in dynamic facial displays of emotion. PLoS One. 7:e26551.
Bensafi M, Rouby C, Farget V, Bertrand B, Vigouroux M, Holley A. 2002. Autonomic nervous system responses to odours: the role of pleasantness and arousal. Chem Senses. 27:703–709.
Black SL. 2001. Does smelling granny relieve depressive mood? Commentary on 'Rapid mood change and human odors'. Biol Psychol. 5:215–218.
Boesveldt S, Frasnelli J, Gordon AR, Lundström JN. 2010. The fish is bad: negative food odors elicit faster and more accurate reactions than other odors. Biol Psychol. 84:313–317.
Bond NW, Siddle DAT. 1996. The preparedness account of social phobia: some data and alternative explanations. In: Rapee RM, editor. Current controversies in the anxiety disorders. London: Guilford Press. p. 291–316.
Botvinick M, Nystrom LE, Fissell K, Carter CS, Cohen JD. 1999. Conflict monitoring versus selection-for-action in anterior cingulate cortex. Nature. 402:179–181.
Byrne A, Eysenck MW. 1995. Trait anxiety, anxious mood, and threat detection. Cognition Emotion. 9:549–562.
Cacioppo JT, Gardner WL. 1999. Emotion. Annu Rev Psychol. 50:191–214.
Cacioppo JT, Gardner WL, Berntson GG. 1999. The affect system has parallel and integrative processing components: form follows function. J Pers Soc Psychol. 76:839–855.
Calvo MG, Marrero H. 2009. Visual search of emotional faces: the role of affective content and featural distinctiveness. Cognition Emotion. 22:1–25.
Calvo MG, Nummenmaa L. 2008. Detection of emotional faces: salient physical features guide effective visual search. J Exp Psychol Gen. 137:471–494.
Chipchase S, Chapman P. 2007. The mere exposure effect with emotionally valenced stimuli: analytic and non-analytic processing. P Exptl Psych Soc. 60:1697–1715.
Chipchase S, Chapman P. 2013. Trade-offs in visual attention and the enhancement of memory specificity for positive and negative emotional stimuli. Q J Exp Psychol. 66:77–298.
Chu S, Downes JJ. 2000. Odour-evoked autobiographical memories: psychological investigations of Proustian phenomena. Chem Senses. 25:111–116.
Dalton P. 1996. Odor perception and beliefs about risk. Chem Senses. 21:447–458.
Dalton P. 2000. Psychophysical and behavioral characteristics of olfactory adaptation. Chem Senses. 25:487–492.
Dalton P, Wysocki CJ. 1996. The nature and duration of adaptation following long-term odor exposure. Percept Psychophys. 58:781–792.
Damjanovic L, Meyer M, Sepulveda F. 2017. Raising the alarm: individual differences in the perceptual awareness of masked facial expressions. Brain Cogn. 114:1–10.
Damjanovic L, Pinkham AE, Clarke P, Phillips J. 2014. Enhanced threat detection in experienced riot police officers: cognitive evidence from the face-in-the-crowd effect. Q J Exp Psychol (Hove). 67:1004–1018.
Damjanovic L, Roberson D, Athanasopoulos P, Kasai C, Dyson M. 2010. Searching for happiness across cultures. J Cogn Cult. 10:85–107.
Damjanovic L, Santiago J. 2016. Contrasting vertical and horizontal representations of affect in emotional visual search. Psychon Bull Rev. 23:62–73.
Diener E, Diener C. 1996. Most people are happy. Psychol Sci. 7:181–185.
Ekman G, Berglund B, Berglund U, Lindvall T. 1967. Perceived intensity of odor as a function of time of adaptation. Scand J Psychol. 8:177–186.
Ekman P, Friesen WV. 1976. Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Ekman P, Friesen WV, Hager JC. 2002. The facial action coding system. 2nd ed. Salt Lake City, UT: Research Nexus eBook.
Eysenck MW. 1992. Anxiety: the cognitive perspective. Hillsdale, NJ: Lawrence Erlbaum Associates.
Eysenck MW, Derakshan N, Santos R, Calvo MG. 2007. Anxiety and cognitive performance: attentional control theory. Emotion. 7:336–353.
Fannes S, Van Diest I, Meulders A, De Peuter S, Vansteenwegen D, Van den Bergh O. 2008. To inhale or not to inhale: conditioned avoidance in breathing behavior in an odor–20% CO2 paradigm. Biol Psychol. 78:87–92.
Fox E. 2013. Rainy brain, sunny brain: the new science of optimism and pessimism. London: Arrow Books.
Fox E, Lester V, Russo R, Bowles RJ, Pichler A, Dutton K. 2000. Facial expressions of emotion: are angry faces detected more efficiently? Cogn Emot. 14:61–92.
Fox E, Damjanovic L. 2006. The eyes are sufficient to produce a threat superiority effect. Emotion. 6:534–539.
Fox E, Russo R, Bowles R, Dutton K. 2001. Do threatening stimuli draw or hold visual attention in subclinical anxiety? J Exp Psychol Gen. 130:681–700.
Fox E, Russo R, Dutton K. 2002. Attentional bias for threat: evidence for delayed disengagement from emotional faces. Cogn Emot. 16:355–379.
Frischen A, Eastwood JD, Smilek D. 2008. Visual search for faces with emotional expressions. Psychol Bull. 134:662–676.
Garon M, Sirois S, Blanchette I. 2012. L'humeur induite par la musique influence le traitement visuel de la menace [Mood induced by music influences the visual processing of threat]. Congrès annuel de la Société Québécoise pour la Recherche en Psychologie, Sherbrooke, QC.
Hager JC, Ekman P. 1979. Long-distance transmission of facial affect signals. Ethol Sociobiol. 1:77–82.
Hansen CH, Hansen RD. 1988. Finding the face in the crowd: an anger superiority effect. J Pers Soc Psychol. 54:917–924.
Horst P. 1941. The role of predictor variables which are independent of the criterion. Soc Sci Res Bull. 48:431–436.
Horstmann G, Lipp OV, Becker SI. 2012. Of toothy grins and angry snarls–open mouth displays contribute to efficiency gains in search for emotional faces. J Vis. 12:7.
Hummel T, Konnerth CG, Rosenheim K, Kobal G. 2001. Screening of olfactory function with a four-minute odor identification test: reliability, normative data, and investigations in patients with olfactory loss. Ann Otol Rhinol Laryngol. 110:976–981.
Juth P, Lundqvist D, Karlsson A, Öhman A. 2005. Looking for foes and friends: perceptual and emotional factors when finding a face in the crowd. Emotion. 5:379–395.
Kotsoni E, de Haan M, Johnson MH. 2001. Categorical perception of facial expressions by 7-month-old infants. Perception. 30:1115–1125.
Krippl M. 2003. Induction of emotions and motivations by odours. Chem Senses. 28:71–77.
Leleu A, Demily C, Franck N, Durand K, Schaal B, Baudouin JY. 2015. The odor context facilitates the perception of low-intensity facial expressions of emotion. PLoS One. 10:e0138656.
Leppänen JM, Hietanen JK. 2003. Affect and face perception: odors modulate the recognition advantage of happy faces. Emotion. 3:315–326.
Li W, Moallem I, Paller KA, Gottfried JA. 2007. Subliminal smells can guide social preferences. Psychol Sci. 18:1044–1049.
Lipp OV, Price SM, Tellegen CL. 2009. No effect of inversion on attentional and affective processing of facial expressions. Emotion. 9:248–259.
Lundqvist D, Juth P, Öhman A. 2013. Using facial emotional stimuli in visual search experiments: the arousal factor explains contradictory results. Cogn Emot. 28:1012–1029.
Lundström JN, Boyle JA, Zatorre RJ, Jones-Gotman M. 2008. Functional neuronal processing of body odors differs from that of similar common odors. Cereb Cortex. 18:1466–1474.
MacKinnon DP, Fairchild AJ, Fritz MS. 2007. Mediation analysis. Annu Rev Psychol. 58:593–614.
Maddock RJ. 1999. The retrosplenial cortex and emotion: new insights from functional neuroimaging of the human brain. Trends Neurosci. 22:310–316.
Matsumoto D. 2002. Methodological requirements to test a possible in-group advantage in judging emotions across cultures: comment on Elfenbein and Ambady (2002) and evidence. Psychol Bull. 128:236–242.
Matsumoto D, Ekman P. 1988. Japanese and Caucasian facial expressions of emotion (JACFEE) and (JACNeuf). San Francisco, CA: San Francisco State University.
McFatter RM. 1979. The use of structural equation models in interpreting regression equations including suppressor and enhancer variables. Appl Psych Meas. 3:123–135.
Moessnang C, Finkelmeyer A, Vossen A, Schneider F, Habel U. 2011. Assessing implicit odor localization in humans using a cross-modal spatial cueing paradigm. PLoS One. 6:e29614.
Moss M, Hewitt S, Moss L, Wesnes K. 2008. Modulation of cognitive performance and mood by aromas of peppermint and ylang-ylang. Int J Neurosci. 118:59–77.
Moss M, Oliver L. 2012. Plasma 1,8-cineole correlates with cognitive performance following exposure to rosemary essential oil aroma. Ther Adv Psychopharmacol. 2:103–113.
Moss L, Rouse M, Wesnes KA, Moss M. 2010. Differential effects of the aromas of Salvia species on memory and mood. Hum Psychopharmacol. 25:388–396.
Niedenthal PM, Halberstadt JB, Margolin J, Innes-Ker ÅH. 2000. Emotional state and the detection of change in facial expression of emotion. Eur J Soc Psychol. 30:211–222.
Öhman A, Lundqvist D, Esteves F. 2001. The face in the crowd revisited: a threat advantage with schematic stimuli. J Pers Soc Psychol. 80:381–396.
Öhman A, Mineka S. 2001. Fears, phobias, and preparedness: toward an evolved module of fear and fear learning. Psychol Rev. 108:483–522.
Paulhus DL, Robins RW, Trzesniewski KH, Tracy JL. 2004. Two replicable suppressor situations in personality research. Multivariate Behav Res. 39:303–328.
Phillips ML, Heining M. 2002. Neural correlates of emotion perception: from faces to taste. In: Rouby C, Schaal B, Dubois D, Gervais R, Holley A, editors. Olfaction, taste, and cognition. Cambridge, UK: Cambridge University Press. p. 196–208.
Pinkham AE, Griffin M, Baron R, Sasson NJ, Gur RC. 2010. The face in the crowd effect: anger superiority when using real faces and multiple identities. Emotion. 10:141–146.
Preacher KJ, Hayes AF. 2008. Asymptotic and resampling strategies for assessing and comparing indirect effects in multiple mediator models. Behav Res Methods. 40:879–891.
Rowe G, Hirsh JB, Anderson AK. 2007. Positive affect increases the breadth of attentional selection. Proc Natl Acad Sci USA. 105:383–388.
Savage RA, Lipp OV, Craig BM, Becker SI, Horstmann G. 2013. In search of the emotional face: anger versus happiness superiority in visual search. Emotion. 13:758–768.
Seubert J, Gregory KM, Chamberland J, Dessirier JM, Lundström JN. 2014. Odor valence linearly modulates attractiveness, but not age assessment, of invariant facial features in a memory-based rating task. PLoS One. 9:e98347.
Seubert J, Kellermann T, Loughead J, Boers F, Brensinger C, Schneider F, Habel U. 2010a. Processing of disgusted faces is facilitated by odor primes: a functional MRI study. Neuroimage. 53:746–756.
Seubert J, Loughead J, Kellermann T, Boers F, Brensinger CM, Habel U. 2010b. Multisensory integration of emotionally valenced olfactory-visual information in patients with schizophrenia and healthy controls. J Psychiatry Neurosci. 35:185–194.
Shrout PE, Bolger N. 2002. Mediation in experimental and nonexperimental studies: new procedures and recommendations. Psychol Methods. 7:422–445.
Sookoian S, Burgueño A, Gianotti TF, Marillet G, Pirola CJ. 2011. Odor perception between heterosexual partners: its association with depression, anxiety, and genetic variation in odorant receptor OR7D4. Biol Psychol. 86:153–157.
Spielberger CD, Gorsuch RL, Lushene R, Vagg P, Jacobs G. 1983. Manual for the State-Trait Anxiety Inventory. Palo Alto, CA: Consulting Psychologists Press.
Treisman AM, Gelade G. 1980. A feature-integration theory of attention. Cogn Psychol. 12:97–136.
Walla P. 2008. Olfaction and its dynamic influence on word and face processing: cross-modal integration. Prog Neurobiol. 84:192–209.
Watson D, Clark LA, Tellegen A. 1988. Development and validation of brief measures of positive and negative affect: the PANAS scales. J Pers Soc Psychol. 54:1063–1070.
Whalen PJ. 1998. Fear, vigilance, and ambiguity: initial neuroimaging studies of the human amygdala. Curr Dir Psychol Sci. 7:177–188.
Zhou W, Chen D. 2009. Fear-related chemosignals modulate recognition of fear in ambiguous facial expressions. Psychol Sci. 20:177–183.

Thus, whether the odorant is pleasant or unpleasant, its effect on vision appears to be highly specialized, facilitating the perception of social cues that literally convey "bad taste" (i.e., disgust). Determining whether odor contexts can modulate emotion perception in a category-specific manner is likely to be influenced by a range of factors, from the experimental design and the dependent variables of interest within a given study (e.g., accuracy, response times, self-report ratings, etc.) to the ontological properties of the odorants themselves. For instance, Zhou and Chen (2009) created their odor contexts from sweat samples collected from participants whilst they watched video segments selected to induce fear, and found that participants were more likely to judge an ambiguous facial expression as displaying fear when they were exposed to the chemosignal of fearful sweat than when exposed to the control pad. Thus, the perception of low-intensity fearful expressions appears to be susceptible to odor facilitation when the context is created from fear-related chemosensory stimuli with socio-communicative functions (i.e., body odors) rather than common odors. This may partly be due to differential processing between common odors and body odors, with research by Lundström et al. (2008) showing how body odors activate brain networks consisting of the posterior cingulate cortex, occipital gyrus, angular gyrus, and the anterior cingulate cortex, a network typically implicated in the processing of emotional stimuli and the regulation of attentional resources (e.g., Botvinick et al. 1999; Maddock 1999), whilst deactivating other regions that have previously been linked to olfactory perception of common odors (e.g., piriform cortex and orbitofrontal cortex). Whilst a number of methodological issues could account for the discrepancy in results between the work of Leppänen and Hietanen (2003) and Seubert and colleagues (2010a, 2010b), a key issue that both studies agree on is the need to test the effects of odorant primes under more complex face processing tasks. Asking participants to categorize a single facial expression presented at a fixed central location is not likely to exert a particularly demanding constraint on attentional resources, especially when response categories are explicitly primed (see also Leleu et al. 2015). Indeed, such experimental tasks result in ceiling levels of performance, which are likely to mask any contextual effects provided by the odorant primes. To further clarify the role of odorant primes in face processing, it is important to investigate how they affect the spatial distribution of attentional resources. This is an important issue to address given that a considerable amount of our everyday attentional processing of facial expressions occurs in the context of surrounding facial expressions. How we detect positive and negative facial expressions in a crowd of faces has obvious ecological appeal and has typically been measured with the face-in-the-crowd effect (FICE) paradigm. Modelled on classic principles of visual search (e.g., Treisman and Gelade 1980), the FICE paradigm involves the presentation of a target face against an array of competing distracter faces on the computer screen. On some trials all the faces in the display show the same emotional expression, whereas on others, one face differs in emotion from the remaining faces in the crowd. Participants are instructed to discriminate between these "same" and "different" trials via a response key. 
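For concreteness, the sketch below shows how a FICE trial list of this kind might be generated. It is a hypothetical Python illustration: the trial counts match those reported later in the Method, but the function and its randomization details are assumptions, not the software used in the study.

```python
import random

def make_fice_trials(n_same_per_emotion: int = 32, n_diff_per_condition: int = 32):
    """Build a randomized FICE trial list: 'same' displays show 4 faces with a
    single emotion; 'different' displays embed 1 discrepant target among 3
    distracters. Counts follow the design described later in the Method."""
    trials = []
    for emotion in ("angry", "happy", "neutral"):
        trials += [{"type": "same", "faces": [emotion] * 4}
                   for _ in range(n_same_per_emotion)]
    # The 4 'different' conditions: target emotion crossed with distracter type.
    conditions = [("angry", "neutral"), ("angry", "happy"),
                  ("happy", "neutral"), ("happy", "angry")]
    for target, distracter in conditions:
        for _ in range(n_diff_per_condition):
            faces = [distracter] * 3 + [target]
            random.shuffle(faces)  # target appears at a random display position
            trials.append({"type": "different", "target": target,
                           "distracter": distracter, "faces": faces})
    random.shuffle(trials)  # 'same' and 'different' trials are intermixed
    return trials

assert len(make_fice_trials()) == 96 + 128
```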
The main independent variable of interest is the manipulation of the target face on the "different" display trials. Using response time and accuracy to detect the discrepant face in the display, some of the earliest FICE findings showed that participants were faster and more accurate in shifting their attentional resources towards the target face when it portrayed an angry facial expression than a happy one (e.g., Hansen and Hansen 1988; Öhman et al. 2001). Referred to in the FICE literature as the anger or threat superiority effect (e.g., Fox and Damjanovic 2006; Pinkham et al. 2010), this detection advantage is often attributed to an evolutionarily driven neural mechanism that enables rapid deployment of attentional resources to stimuli that signal immediate danger and attack in the observer's visual environment (Öhman et al. 2001; Öhman and Mineka 2001), and it can be heightened even further through threat-relevant training (e.g., Damjanovic et al. 2014). Thus, when attentional resources are relatively fixed, as in the categorization tasks used by Leppänen and Hietanen, happy faces show a processing advantage over negative expressions such as disgust. However, under greater attentional competition, as measured by the FICE, angry faces, not happy ones, yield the processing advantage. The predominance of threat superiority findings using the FICE has, however, waned in recent years. This has been mainly due to an increasing number of studies documenting how happy faces, not angry ones, are detected faster and with greater accuracy, yielding what is referred to in the literature as the happiness advantage or the happiness superiority effect (e.g., Juth et al. 2005; Calvo and Nummenmaa 2008; Lipp et al. 2009; Damjanovic et al. 2010; Damjanovic and Santiago 2016). Although more and more studies are trying to increase the ecological validity of FICE tasks by using photographic facial expression stimuli from established databases rather than schematic drawings, it appears that most of the inconsistencies found in such studies can be accounted for by the presence or absence of low-level facial features across different database types (e.g., Savage et al. 2013). Recently referred to as the "teeth visibility hypothesis" (Horstmann et al. 2012), this perceptual account states that the presence of exposed teeth is a salient facial feature that drives the advantage of happy faces over angry faces, so much so that systematic manipulations of this facial component can reliably predict which target face is detected most efficiently: when happy facial expressions are conveyed with a "toothy grin" whilst angry faces are conveyed with a closed mouth, happy faces are detected more efficiently. Conversely, when angry facial expressions are conveyed with a "toothy snarl" whilst happy faces are conveyed with a closed smile, angry faces are detected more efficiently. However, further studies on this specific issue have questioned the extent to which a perceptual account can exclusively accommodate the detection advantage for happy facial expressions. Using computer-generated facial expressions of anger and happiness embedded within the FICE, Becker and colleagues (2011) reported more efficient detection times for happy face targets even when the amount of perceptual information was identical between angry and happy faces. Providing an evolutionary and affective account of these findings, Becker et al. 
argue that happy facial expressions have evolved to be highly visually salient in our environment, as a means of alerting us to important social affiliation cues required to facilitate group membership and integration. Happy facial expressions have therefore become serviceable for the specific purpose of signaling friendship under a range of circumstances, including their detection across long distances (e.g., Hager and Ekman 1979) and in instances when the emotion is less intensely expressed (Becker et al. 2011; Becker et al. 2012). As such, happy faces are encountered with greater frequency in our social environment relative to negative emotions (e.g., Bond and Siddle 1996; Whalen 1998), and this in turn biases our expectancy for positive outcomes over negative ones (e.g., Diener and Diener 1996; Chipchase and Chapman 2007). A direct consequence of such frequency effects is that positively laden affective information becomes preferentially processed over negative information. However, when competing negative affect is overly arousing, any attentional bias towards happy faces may diminish, opening up the prioritization of threat-specific cues, such as angry faces, instead. To summarize, categorization tasks focus on emotion perception under fixed attentional demands, whereas FICE tasks are mainly concerned with how attention is distributed across several facial expressions. However, both methodologies have attracted considerable theoretical debate in terms of whether the processing of facial expressions of emotion can be more appropriately accounted for by perceptual-based explanations or affective ones. The current study makes a new contribution to this area by using the contextual cues created by odors, which have been extensively applied in categorization tasks, but never included in tasks measuring spatial attention performance with emotional faces. The main aim of the study is to investigate for the first time the effects of different odorant primes on the happiness superiority effect using the FICE task. The present study As noted in the above review, the happiness superiority effect can to a large extent be determined by the type of stimuli used in FICE display trials (e.g., Juth et al. 2005; Becker et al. 2011; Becker et al. 2012). The FICE task used in the current study was selected to satisfy 2 important aims: to elicit a consistent happiness superiority effect within the participant sample recruited for the study, and to be sufficiently complex in its task demands to allow the different odorant primes to take effect (e.g., Leppänen and Hietanen 2003; Seubert et al. 2010a, 2010b). The FICE task developed in some of our earlier work satisfies these criteria, demonstrating a robust happiness superiority effect in native English-speaking Caucasian participants across 3 experiments, although for some variants of the FICE task the detection of happy face targets was easier than for others. The current study used the "crowd" variant of Damjanovic et al.'s FICE task, using angry, happy, and neutral face stimuli taken from the Caucasian set of Matsumoto and Ekman's Japanese and Caucasian Facial Expressions of Emotion (JACFEE) database. Developed in 1988, the database was validated using the Facial Action Coding System (FACS), a technique that enables the objective measurement of the facial muscle innervations specific to the emotion portrayed (Ekman et al. 2002). 
This allowed the facial expressions to be carefully matched for signal clarity and intensity across the different emotional categories (Matsumoto 2002). The happy facial expressions in Matsumoto and Ekman's database were posed with "toothy grins", whilst all of the angry face exemplars were posed with a closed, downturned mouth; thus, the happiness superiority effect found in Damjanovic et al.'s study with their Caucasian participants could be accounted for in terms of the teeth visibility hypothesis (Horstmann et al. 2012). The key research question addressed in the current study is whether such a perceptual-based explanation of the happiness superiority effect measured by the FICE can operate independently of an affectively valenced environment. Specifically, we hypothesized that if the underlying mechanism of the happiness superiority effect is a perceptual one, the effect should remain stable across the different odor contexts (Juth et al. 2005; Calvo and Nummenmaa 2008; Calvo and Marrero 2009; Damjanovic et al. 2010; Becker et al. 2011). This hypothesis was addressed by comparing the FICE performance of participants in a no odor (i.e., control) group with that of participants who performed the task under different affectively valenced environments created by the concurrent presentation of pleasant or unpleasant odorant primes. The experiment used a between-subjects design and long-term odorant exposure in order to assess whether exposure to the odorants influenced the emotional state of the participant during the FICE task. Whilst obtaining faster detection times for happy face targets in the pleasant odor condition would be consistent with the findings obtained in Leppänen and Hietanen's (2003) work, the exact mechanism for such odor effects remains elusive. On the one hand, Leppänen and Hietanen suggest that pleasant odorants may operate in a mood-congruent manner, by activating positive emotions within the participants, which in turn facilitates access to conceptual knowledge about the target emotion (e.g., smiling faces); yet whether this cognitive facilitation is achieved independently of any emotional change within the participant remains unknown. Indeed, the majority of studies that have examined the emotion-inducing properties of different odorants have found significant changes within the participant across a variety of measures (Krippl 2003). For example, at a physiological level, the affective properties of odors have been shown to exert a direct influence on a participant's level of autonomic nervous system activity, such that an increase in an odor's subjective pleasantness leads to a decrease in the participant's heart rate (Bensafi et al. 2002). Furthermore, exposure to different types of odorants, such as ylang-ylang, has been found to increase self-reported levels of calmness and reduce anxiety (Moss et al. 2008; Moss et al. 2010; Moss and Oliver 2012). Based on these observations, it is highly plausible that the facilitative effects observed in Leppänen and Hietanen's study occurred as a consequence of a change in the emotional state of the participant. Therefore, it would be important to establish whether the effects of odors on cognitive performance in the FICE task can occur independently of the emotional state of the observer. This was achieved in the current study by administering a measure of self-reported anxiety, the State-Trait Anxiety Inventory (STAI; Spielberger et al. 1983), in a pre- versus post-test design. 
This type of measure has been applied effectively in previous research with FICE tasks (e.g., Damjanovic et al. 2014) and in work on the perception of chemosensory stimuli (Zhou and Chen 2009) to assess self-reported anxiety in participants. As such, utilizing the STAI in a pre- versus post-test design allows us to assess whether the odorants in the current study influence anxiety and, if so, what implications such modulations have for attentional performance. Specifically, we hypothesized that pleasant odorants would reduce self-reported measures of anxiety, whilst unpleasant odorants would increase them. Furthermore, if these hypotheses were confirmed in our analyses, we would examine the extent to which such changes in self-reported anxiety mediate attentional performance in detecting happy facial expressions. Finally, given that differences in experimental design and odor exposure intervals are both likely to influence the mood-inducing capacities of odors and other associated affective states (see Seubert et al. 2010b), a further consideration for the present study is to determine whether odors exert a specific time course on emotion perception. This is because the olfactory modality is particularly vulnerable to habituation; with repeated or prolonged exposure to an odorant stimulus, neural sensitivity is reduced, consequently reducing the odorant's saliency and priming potential (Dalton 2000). For instance, Moessnang et al. (2011) showed that the effect of an odorant cue on participants' spatial attention (slower responses to locate a target shape presented on the same side as the odorant cue) was present at the start of the experiment, but disappeared over its course. Performing a similar time course split on reaction time will help to establish to what extent the search for happy faces remains stable over the course of the experiment. Method Participants A total of 54 undergraduate and postgraduate students from the University of Chester were randomly allocated in equal groups to the control (female = 13, male = 5; mean age: 23.72 years, range: 19–39 years), pleasant odor (female = 15, male = 3; mean age: 21.94 years, range: 18–40 years), and unpleasant odor conditions (female = 15, male = 3; mean age: 23.11 years, range: 18–48 years). Ethics statement The work with human participants complies with the Declaration of Helsinki for Medical Research involving Human Subjects. The study was also approved by the Department of Psychology Ethics Committee at the University of Chester, United Kingdom. All participants gave written informed consent and were paid £5.00 for participation. Participants self-reported that they had normal or corrected-to-normal vision, a normal sense of smell, and no nasal or food allergies, and that they were not experiencing any respiratory problems. Female participants who were pregnant or thought that they might be pregnant were excluded from participating in the study to minimize the risk of nausea. Once each participant's testing date and time was confirmed, they were asked to refrain from habits that could affect their ability to smell, such as smoking, drinking coffee, and using scented products, on the day of testing. They were also reminded of these restrictions 24 h before their day of testing. 
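For illustration, the mediation question raised above could be addressed with a percentile bootstrap over participants, in the spirit of Preacher and Hayes (2008). The sketch below is a minimal Python implementation with hypothetical inputs; it is not the analysis code used in this study.

```python
import numpy as np

def bootstrap_indirect_effect(x, m, y, n_boot=5000, seed=0):
    """Percentile-bootstrap estimate of the indirect effect a*b for a single
    mediator (cf. Preacher and Hayes 2008). Hypothetical inputs: x = odor
    condition coded 0/1, m = change in STAI (S), y = happy-target detection
    performance. A sketch only, not the study's reported analysis."""
    rng = np.random.default_rng(seed)
    x, m, y = map(np.asarray, (x, m, y))
    n = len(x)
    estimates = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)      # resample participants with replacement
        xb, mb, yb = x[idx], m[idx], y[idx]
        a = np.polyfit(xb, mb, 1)[0]     # path a: x -> m (simple regression slope)
        # Path b: m -> y controlling for x (multiple regression by least squares).
        design = np.column_stack([np.ones(n), xb, mb])
        b = np.linalg.lstsq(design, yb, rcond=None)[0][2]
        estimates[i] = a * b
    lower, upper = np.percentile(estimates, [2.5, 97.5])
    return estimates.mean(), (lower, upper)
```

An indirect effect whose 95% bootstrap interval excludes zero would support the mediating role of anxiety change.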
Stimuli and apparatus Based on previous research by Leppänen and Hietanen (2003), which utilized a design weighted towards positive odors, we selected strawberry (contains: ALDEHYDE C16 (STRAWBERRY PURE), METHYL CINNAMATE, alpha iso methyl ionone, amyl cinnamic aldehyde), vanilla (contains: VANILLIN, limonene, coumarin, ETHYL MALTOL, Tonalid), and orange zest (contains: Linalyl Acetate, citral, limonene, linalool) odors for the pleasant odor condition; for the unpleasant condition we selected a fish odor (contains: PINE TAR OIL, Alpha-Cedrene) (Boesveldt et al. 2010). All odorants were manufactured and supplied by Dale Air™ in the UK. For the main experiment, the odors were supplied in aerosol form and distributed by a purpose-built dispenser, supplied by Dale Air™, positioned 2 m above floor level. The odor release mechanism was set to 20-min intervals. Cotton wads soaked in the liquid form of the odors and presented in containers were used to collect ratings of arousal and pleasantness in a separate rating study and as part of the odor selection stage of the main experiment. Odor rating study A separate group of 36 student participants from the same population, matched for the male-female split of the main experiment, were randomly allocated in equal groups to the pleasant (female = 15, male = 3; mean age: 30 years, range: 18–53 years) and unpleasant odor conditions (female = 15, male = 3; mean age: 22 years, range: 18–42 years). Each participant in the pleasant odor group was presented with 3 individual containers holding the odors and asked to rate each container on pleasantness and arousal using a 5-point scale; thus, all participants in the pleasant odor group smelled all of the pleasant odors. As per Leppänen and Hietanen's (2003) pleasantness ratings, participants were instructed to sniff each container and evaluate it on a 5-point Likert scale ranging from 1 (extremely unpleasant) through 3 (neutral) to 5 (extremely pleasant). Measures of arousal were obtained by adapting the instructions and response categories used by Bensafi et al. (2002, p. 705), whereby participants were instructed to "Please judge your feeling when you smelled the odorant by circling the relevant number between 1 (not at all arousing) to 3 (neutral) to 5 (extremely arousing)". The mean values provided for each odorant on both the pleasantness and arousal measures were also tested against the scale mid-point, treated as a hypothetical neutral value, in a series of one-sample t-tests to obtain a further measure of perceived intensity. Measures of pleasantness A 1-way independent groups ANOVA revealed a significant difference in pleasantness ratings between strawberry (M = 4.28, SD = 0.96), vanilla (M = 4.28, SD = 0.75), orange zest (M = 3.94, SD = 0.73), and the fish odor (M = 2.17, SD = 1.10), F(3, 68) = 22.94, MSE = 0.80, P < 0.001, ηp2 = 0.50. Planned comparison t-tests showed that whilst there were no significant differences in pleasantness ratings between strawberry, vanilla, and orange zest (P > 0.05), each pleasant odor was associated with significantly higher ratings than the fish odor: strawberry, t(34) = 6.15, P < 0.001, d = 2.05; vanilla, t(34) = 6.73, P < 0.001, d = 2.27; orange zest, t(34) = 5.73, P < 0.001, d = 1.92. 
Furthermore, one-sample t-tests confirmed that both the pleasant odor ratings and the unpleasant fish odor ratings differed significantly from the neutral mid-point (P < 0.001), with the pleasant odors rated significantly towards the pleasant end of the scale and the fish odor rated significantly towards the unpleasant end. Measures of arousal Strawberry (M = 3.44, SD = 1.20), vanilla (M = 3.11, SD = 1.28), orange zest (M = 2.94, SD = 1.47), and the fish odor (M = 2.89, SD = 1.32) did not differ significantly from each other in terms of perceived arousal, F(3, 68) = 0.64, MSE = 1.75, P = 0.590, ηp2 = 0.03, or from the neutral mid-point. Thus, the odors selected for the main experiment differed significantly in terms of their affective valence (pleasant vs. unpleasant), but were not confounded by differences in stimulus arousal. Main experiment Participants in the main experiment were required to rate each odor for perceived pleasantness on the same 5-point Likert scale (1 = extremely unpleasant to 5 = extremely pleasant). Each participant in the pleasant odor group was presented with 3 individual containers holding the pleasant odors, whilst participants in the unpleasant odor condition were given the fish odor to rate; thus, all participants in the pleasant odor group smelled all of the pleasant odors. A 1-way independent groups ANOVA revealed significant differences in pre-experimental ratings between strawberry (M = 4.44, SD = 0.78), vanilla (M = 4.17, SD = 0.62), orange zest (M = 4.11, SD = 0.76), the overall mean for the selected pleasant odor (M = 4.33, SD = 0.59), and the fish odor (M = 2.00, SD = 0.59), F(4, 85) = 41.20, MSE = 0.46, P < 0.001, ηp2 = 0.66. Mirroring the pattern of results found in the odor rating study, the pleasant odors did not differ significantly from each other (P > 0.05), but each pleasant odor was associated with significantly higher ratings than the fish odor: strawberry, t(34) = 10.55, P < 0.001, d = 3.54; vanilla, t(34) = 10.72, P < 0.001, d = 3.56; orange zest, t(34) = 9.30, P < 0.001, d = 3.10; and the overall mean for the selected (see Procedure) odor, t(34) = 11.78, P < 0.001, d = 3.95. Furthermore, the pleasant odor ratings and the unpleasant fish odor ratings differed significantly from the neutral mid-point (P < 0.001). To establish that these differences in pleasantness ratings between the 2 odor conditions were significant at the end of the experiment as well as at the beginning, further between-groups comparisons were conducted, as per Leppänen and Hietanen (2003), using a 20-cm visual analogue scale with the word unpleasant at the left end, neutral in the middle, and pleasant at the right, on which participants evaluated the pleasantness of the odor in the room. Responses to the odor evaluations were recorded from 0 to 10 for pleasant responses and from 0 to −10 for unpleasant responses. Participants in the pleasant odor condition rated the odor as significantly more pleasant (M = 6.81, SD = 2.12) than participants with the fish odor (M = −5.59, SD = 4.53), t(34) = 10.52, P < 0.001, d = 3.72. Both ratings also differed significantly from the mid-point (i.e., neutral), as revealed by one-sample t-tests for the fish odor, t(17) = −5.23, P < 0.001, d = 1.23, and the selection of pleasant odors, t(17) = 13.64, P < 0.001, d = 3.21, respectively. Thus, the unpleasant and pleasant evaluations associated with the fish and pleasant odors at the start of the experiment were maintained towards the end of the experiment. 
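The rating analyses reported above follow a simple recipe: a 1-way ANOVA across odors plus one-sample t-tests against the scale mid-point. A minimal sketch, assuming hypothetical rating vectors rather than the study's raw data:

```python
from scipy import stats

def odor_rating_tests(ratings: dict, midpoint: float = 3.0, alpha: float = 0.05):
    """One-way independent-groups ANOVA across odors plus one-sample t-tests
    of each odor's ratings against the scale mid-point, mirroring the analyses
    reported above. `ratings` maps odor name -> list of 1-5 Likert scores
    (hypothetical input; the study's raw data are not reproduced here)."""
    f_stat, p_anova = stats.f_oneway(*ratings.values())
    midpoint_tests = {}
    for odor, scores in ratings.items():
        t_stat, p = stats.ttest_1samp(scores, popmean=midpoint)
        midpoint_tests[odor] = {"t": t_stat, "p": p,
                                "differs_from_neutral": p < alpha}
    return {"anova": {"F": f_stat, "p": p_anova}, "midpoint": midpoint_tests}

# Example with made-up scores for two odors:
print(odor_rating_tests({"vanilla": [5, 4, 4, 5, 3, 4],
                         "fish": [2, 1, 3, 2, 2, 1]}))
```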
Facial expression stimuli

Four angry (E1–E4), 4 happy (E33–E36), and 8 neutral (N6, N8, N11, N13, N17, N22, N26, N27) faces were selected from the Caucasian set of Matsumoto and Ekman's (1988) database. Each color image was converted to grayscale in Adobe Photoshop, and an oval template (125 pixels wide by 168 pixels high) was applied to remove external features (e.g., hair, ears, neckline). Mean luminance and contrast were matched across all faces such that each face generated an intensity value of 190. Stimulus presentation and data recording were controlled by SuperLab 4.0 running on a Mac G4 OS X computer.

Design

The happiness superiority effect was measured using reaction times (RTs), recorded from the onset of each visual search display to the participant's response, and error rates on "different" display trials (Damjanovic et al. 2010). Participants were randomly allocated to 1 of 3 groups: control, pleasant, or unpleasant. Type of target (angry and happy) and type of distracter (neutral and emotional) were administered as repeated measures variables. The anxiety-inducing properties of the odors were assessed by comparing state anxiety scores before and after odor exposure. Participants in the control condition were likewise required to provide self-report measures of their state anxiety, once before completing the visual search task and immediately after its completion.

Procedure

The procedure involved several measures administered in the following order: rating of the odor(s), state anxiety, visual search task, rating of the odor, and state anxiety. The odor rating measures were not applicable to participants in the control (i.e., no odor) condition. For participants in the pleasant odor condition, the odor they rated highest for pleasantness was selected for the visual search task, whereas for participants in the unpleasant odor condition it was the fish odor. Thus, participants in the odor conditions were exposed to one odorant for the visual search task. Participants then completed the state (S) component of the STAI (Spielberger et al. 1983) as an index of their baseline anxiety. The visual search task was taken from Experiment 1 of Damjanovic et al.'s (2010) study. Briefly, same-display trials consisted of 4 different individuals displaying the same emotional expression (i.e., all angry, all happy, or all neutral). There were 4 types of different-display trials: 1 angry, 3 neutral; 1 angry, 3 happy; 1 happy, 3 neutral; and 1 happy, 3 angry. The visual search experiment consisted of 96 same-display trials (32 angry, 32 happy, 32 neutral expressions) presented randomly with 128 different-display trials (32 in each of the 4 conditions). A fully counterbalanced design in which each poser provides each expression was not possible to implement in the current study because each poser contributed only one facial expression of emotion and one neutral expression to the database (Matsumoto and Ekman 1988; Matsumoto 2002). Each trial began with a fixation cross in the centre of the screen for 500 ms, followed by a display of 4 faces surrounding the central fixation point for 800 ms. The 4 faces were arranged in an imaginary circle, occupying the top, right, bottom, and left locations on the computer screen, with a fixation cross at the centre, viewed at a distance of 60 cm. Each face subtended a visual angle of 3.1° horizontally by 4.1° vertically, and the centre of each face was 6.2° of visual angle from fixation. The intertrial interval was set to 2000 ms.
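As a quick worked check of this display geometry, the sketch below converts the reported visual angles back into approximate on-screen sizes at the 60-cm viewing distance, using the standard relation size = 2 × d × tan(angle/2); the function name is ours, for illustration only.

```python
import math

def visual_angle_to_size(angle_deg: float, distance_cm: float = 60.0) -> float:
    """Return the on-screen extent (cm) subtending `angle_deg` at `distance_cm`."""
    return 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)

face_w = visual_angle_to_size(3.1)        # ~3.2 cm wide
face_h = visual_angle_to_size(4.1)        # ~4.3 cm high
eccentricity = visual_angle_to_size(6.2)  # face centres ~6.5 cm from fixation

print(f"face: {face_w:.1f} x {face_h:.1f} cm, "
      f"centre offset from fixation: {eccentricity:.1f} cm")
```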
Participants were instructed to respond as quickly and as accurately as possible whether the 4 faces in the display showed the "same" emotion or whether one was "different" in emotion from the remaining 3 faces by pressing the "x" and "." keys on the keyboard. Response mapping was reversed for half the participants, and feedback in the form of a 1000-ms beep was provided on incorrect trials. Although participants performed the visual search task without a break, our previous work with these tasks has indicated that this does not necessarily induce severe fatigue effects. After the visual search task, participants in the experimental groups used a 20-cm visual analogue scale, with the word unpleasant at the left end, neutral in the middle, and pleasant at the right, to evaluate the pleasantness of the odor in the room as a post-experiment rating measure. This change in rating method from a 5-point Likert scale to a visual analogue scale follows similar procedural approaches (e.g., Leppänen and Hietanen 2003) and was implemented in the current study to minimize the impact of participants' response styles on their odor ratings. Finally, participants were provided with the STAI (S) component to complete. Participants completed the STAI (S) after the odor rating in order to replicate the administration of the rating scales used in Leppänen and Hietanen's (2003) procedure. To assess whether asking participants to give a positive or negative rating for an environmental factor might subsequently raise awareness of this factor and influence their STAI (S) scores, we correlated participants' pleasantness ratings with their post-experiment STAI (S) scores. The relationship between the 2 measures was weak and non-significant, r = 0.07, P = 0.689, providing little evidence that rating an odorant's pleasantness is significantly associated with self-reported state anxiety. Once the STAI was completed, participants were debriefed and thanked for their time.

Results

Anxiety-inducing properties of odors

To test our specific hypotheses that the pleasant odorant would decrease self-reported anxiety, that the unpleasant odorant would increase it, and that the control condition would show no significant change, a 3 (group: control, pleasant, or unpleasant) × 2 (time: before and after) mixed ANOVA with repeated measures on the last factor was applied to participants' STAI-S scores (see Table 1). There was no significant effect of group, F(2, 51) = 0.04, MSE = 132.02, P = 0.962, ηp² = 0.00, or time, F(1, 51) = 0.02, MSE = 26.77, P = 0.882, ηp² = 0.00. However, the group × time interaction was significant, F(2, 51) = 7.05, MSE = 26.77, P = 0.002, ηp² = 0.22. There were no significant group differences in state anxiety at baseline, F(2, 102) = 1.35, MSE = 79.39, P = 0.265, ηp² = 0.03, or at post-test, F(2, 102) = 1.10, MSE = 79.39, P = 0.339, ηp² = 0.02. However, self-reported anxiety levels changed within each group, such that towards the end of the experiment, anxiety levels significantly decreased in participants exposed to the pleasant odors, F(1, 51) = 8.59, MSE = 26.77, P = 0.005, ηp² = 0.14, but increased for participants exposed to the unpleasant odor, F(1, 51) = 4.25, MSE = 26.77, P = 0.044, ηp² = 0.08. Pre- versus post-test changes in anxiety did not differ significantly for participants in the control group, F(1, 51) = 1.27, P = 0.264, ηp² = 0.02.
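The software used for this analysis is not reported in the text; as one hedged illustration, a 3 × 2 mixed ANOVA of this form could be run in Python with the pingouin package, as sketched below on simulated STAI-S scores. The column names, group sizes, and data are placeholder assumptions.

```python
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(1)

# 54 participants (18 per group), each measured pre and post exposure.
groups = np.repeat(["control", "pleasant", "unpleasant"], 18)
subject = np.arange(54)

df = pd.DataFrame({
    "subject": np.tile(subject, 2),
    "group": np.tile(groups, 2),
    "time": np.repeat(["pre", "post"], 54),
    "stai_s": rng.normal(34, 8, size=108),  # placeholder anxiety scores
})

# Group is the between-subjects factor, time the within-subjects factor.
aov = pg.mixed_anova(data=df, dv="stai_s", within="time",
                     subject="subject", between="group")
print(aov[["Source", "DF1", "DF2", "F", "p-unc", "np2"]])
```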
Table 1. Mean STAI (S) pre- and post-test measures as a function of odor context.

Measure          Control        Pleasant       Unpleasant
Pre-STAI (S)     33.56 (2.01)   36.78 (2.58)   32.00 (2.02)
Post-STAI (S)    35.50 (2.12)   31.72 (1.57)   35.56 (2.17)

Standard errors are presented in parentheses.

Visual search performance

As per Damjanovic et al. (2010), only performance on discrepant trials was examined. Reaction times (RTs) for correct responses on different-display trials were filtered (<100 ms or >2000 ms) for analysis; a sketch of this trimming step follows the Figure 1 caption below. To test the hypothesis that the underlying mechanism of the happiness superiority effect is a perceptual one, a 3 (group: control, pleasant, or unpleasant) × 2 (target: angry and happy) × 2 (distracter: emotional and neutral) mixed ANOVA with repeated measures on the last 2 factors was conducted. The main effect of group was not significant, F(2, 51) = 0.40, MSE = 196657.53, P = 0.676, ηp² = 0.02. The initial results replicated a happiness superiority effect, F(1, 51) = 30.46, MSE = 3871.58, P < 0.001, ηp² = 0.37, and participants were faster to detect a target when it was surrounded by emotional rather than neutral distracter faces, F(1, 51) = 73.58, MSE = 3974.86, P < 0.001, ηp² = 0.59. Type of target and type of distracter interacted significantly, F(1, 51) = 61.48, MSE = 2881.73, P < 0.001, ηp² = 0.55, with the happiness superiority effect occurring with neutral distracters, F(1, 102) = 86.51, MSE = 3376.66, P < 0.001, ηp² = 0.46, but not with emotional distracters, F(1, 102) = 0.89, MSE = 3376.66, P = 0.348, ηp² = 0.01. Angry face targets were found faster overall when surrounded by emotional distracters (i.e., happy faces) than when surrounded by neutral distracters, F(1, 102) = 134.90, MSE = 3428.29, P < 0.001, ηp² = 0.60, whereas overall response times for happy face targets were equivalent for emotional and neutral distracters, F(1, 102) = 2.10, MSE = 3428.29, P = 0.151, ηp² = 0.02. The effect of target interacted significantly with group, F(2, 51) = 4.58, MSE = 3871.58, P = 0.015, ηp² = 0.15, producing the happiness superiority effect in the control, F(1, 51) = 24.83, MSE = 3871.58, P < 0.001, ηp² = 0.33 (see Figure 1A), and unpleasant groups, F(1, 51) = 14.11, MSE = 3871.58, P < 0.001, ηp² = 0.22 (see Figure 1C), but it was eliminated in the pleasant group, F(1, 51) = 0.67, MSE = 3871.58, P = 0.416, ηp² = 0.01 (see Figure 1B). The 3-way group × target × distracter interaction did not reach significance, F(2, 51) = 0.45, MSE = 2881.73, P = 0.644, ηp² = 0.02.

Figure 1. Visual search data for different displays. The left panels show mean reaction times and the right panels show mean error rates to detect the angry and happy facial expression targets against emotional (E) and neutral (N) distracters for the control (A), pleasant odor (B), and unpleasant odor (C) conditions. Error bars correspond to the standard errors of the mean of each condition individually.
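As a concrete illustration of the trimming step described above, the following minimal sketch filters a toy trial log; the column names are our own assumptions, not those of the original SuperLab output.

```python
import pandas as pd

# Toy trial log with illustrative column names.
trials = pd.DataFrame({
    "display": ["different", "different", "same", "different"],
    "correct": [True, True, True, False],
    "rt_ms": [452.0, 2350.0, 610.0, 534.0],
})

# Keep correct responses on different-display trials with plausible
# latencies (responses <100 ms or >2000 ms are excluded).
valid = trials[
    (trials["display"] == "different")
    & trials["correct"]
    & trials["rt_ms"].between(100, 2000)
]
print(valid)  # only the 452 ms trial survives
```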
Analysis of error rates revealed significantly fewer errors for happy targets than for angry targets, F(1, 51) = 167.60, MSE = 118.18, P < 0.001, ηp² = 0.77 (see Figure 1), and for emotional distracters compared with neutral ones, F(1, 51) = 135.33, MSE = 103.66, P < 0.001, ηp² = 0.73. A significant target × distracter interaction, F(1, 51) = 86.73, MSE = 132.72, P < 0.001, ηp² = 0.63, revealed lower error rates for happy targets with both emotional, F(1, 102) = 245.18, MSE = 125.45, P < 0.001, ηp² = 0.71, and neutral distracters, F(1, 102) = 4.46, MSE = 125.45, P = 0.037, ηp² = 0.04. For angry targets, detection accuracy was considerably better with emotional distracters, F(1, 102) = 215.56, MSE = 118.19, P < 0.001, ηp² = 0.68, but there was no significant effect of surrounding distracter context on error rates for happy targets, F(1, 102) = 0.53, MSE = 118.19, P = 0.470, ηp² = 0.01. The 3-way group × target × distracter interaction did not reach significance, F(2, 51) = 0.61, MSE = 132.72, P = 0.549, ηp² = 0.02. These results show that whilst there is an overall search advantage favoring happy facial expressions, this advantage is modulated by affectively valenced environmental cues. Furthermore, the presence of the group × target interaction on response times indicates that such cues exert a stronger effect on processing speed than on accuracy. The emotionality of the competing distracter faces produces similar effects on both search speed and accuracy measures.

Stability of the search advantage for happy faces

Habituation effects were investigated by computing, for each participant and each distracter context, a happiness superiority index (HSI): the mean RT for all angry targets minus the mean RT for all happy targets, calculated separately over the first and last 25% of trials, with a positive value indicating faster detection times for happy faces. A 3 (group: control, pleasant, or unpleasant) × 2 (distracter: emotional and neutral) × 2 (phase: first quarter and last quarter) mixed ANOVA with repeated measures on the last 2 factors (see Figure 2) revealed no significant effects of group, F(2, 51) = 0.15, MSE = 32876.65, P = 0.859, ηp² = 0.01, or phase, F(1, 51) = 0.10, MSE = 24868.75, P = 0.752, ηp² = 0.00. Greater levels of happiness superiority were observed with neutral relative to emotional distracters, F(1, 51) = 53.26, MSE = 20381.67, P < 0.001, ηp² = 0.51 (see Figure 2).

Figure 2. Superiority index for happiness for the first and last 25% of search trials in milliseconds (ms) with emotional and neutral distracters across the 3 experimental contexts. A positive score indicates faster detection of happy over angry face targets on different display trials. Error bars correspond to the standard errors of the mean of each condition individually.
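To make the HSI computation concrete, the sketch below implements the index as described: mean angry-target RT minus mean happy-target RT, over the first and last quarters of a participant's trials. The data frame layout and column names are illustrative assumptions, not the authors' code.

```python
import pandas as pd

def hsi(block: pd.DataFrame) -> float:
    """Mean RT to angry targets minus mean RT to happy targets (ms).

    A positive value indicates faster detection of happy faces.
    """
    by_target = block.groupby("target")["rt_ms"].mean()
    return by_target["angry"] - by_target["happy"]

def phase_hsi(trials: pd.DataFrame) -> pd.Series:
    """HSI over the first and last 25% of one participant's trials."""
    ordered = trials.sort_values("trial_index")
    q = len(ordered) // 4  # number of trials in a quarter
    return pd.Series({
        "hsi_first_quarter": hsi(ordered.head(q)),
        "hsi_last_quarter": hsi(ordered.tail(q)),
    })

# Hypothetical usage, one index per participant and distracter context:
# indices = data.groupby(["participant", "distracter"]).apply(phase_hsi)
```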
The only significant higher-order interaction to emerge from the analyses was group × phase, F(2, 51) = 6.05, MSE = 24868.75, P = 0.004, ηp² = 0.19, with simple main effects revealing higher levels of happiness superiority in the last quarter of the experiment, F(2, 102) = 3.62, MSE = 28872.02, P = 0.030, ηp² = 0.07, an effect which was limited to the unpleasant odor group (Tukey P < 0.05). For participants in the pleasant odor group, the magnitude of the happiness superiority effect was stronger at the start of the visual search task than towards the end, F(1, 51) = 4.44, MSE = 24868.75, P = 0.040, ηp² = 0.08. This pattern was reversed for participants in the unpleasant group, F(1, 51) = 7.74, MSE = 24868.75, P = 0.010, ηp² = 0.13. The happiness superiority effect did not differ between the start and end of the visual search task for participants in the control group, F(1, 51) = 0.02, MSE = 24868.75, P = 0.901, ηp² = 0.00.

The role of self-reported anxiety in mediating the impact of scent pleasantness on the happiness superiority effect

Given that our hypotheses concerning the anxiety-modulating properties of the odorants were supported, the following analyses investigated whether the significant changes in self-reported anxiety observed in the 2 odor groups (see Table 1) could affect the HSI at the start and towards the end of the visual search task (see Figure 2). To achieve this, a change-in-state-anxiety variable was computed (STAI-CHANGE = post-exposure STAI-S minus pre-exposure STAI-S), with a negative value indicating a decrease, and a positive value an increase, in anxiety as a function of odor exposure. Focusing on the significant effects found with neutral distracters, odor type (coded: 0 = pleasant, 1 = unpleasant) correlated positively with changes in self-reported anxiety, rpb = 0.54, P < 0.001, such that the unpleasant odor was strongly associated with increased levels of self-reported anxiety. Furthermore, odor type showed a marginal relationship with the HSI at the start of the experiment, with a greater HSI under pleasant than unpleasant odors, rpb = −0.31, P = 0.065. The relationship between changes in anxiety and the HSI was negligible, r = −0.02, P = 0.913. In contrast to the patterns observed at the start, as participants approached the end of the visual search task, odor type correlated significantly with the HSI, such that the unpleasant odor was moderately associated with higher levels of happiness detection, rpb = 0.37, P = 0.026. The relationship between changes in anxiety and HSI performance was weak and non-significant, rpb = 0.28, P = 0.104. Given the small sample sizes (Shrout and Bolger 2002; Preacher and Hayes 2008), 2 separate bootstrapped hierarchical regressions on search times at the start and at the end of the visual search task were performed to test the degree to which odor type and changes in state anxiety could predict the magnitude of the HSI. These results are summarized in Table 2. At the start of the experiment, type of odor (block 1) accounted for 9.7% of the variation in detecting happy faces, F(1, 34) = 3.64, MSE = 21702.49, P = 0.065. The inclusion of changes in state anxiety in block 2 accounted for a further 3.1%, but this did not significantly improve the ability of the model to predict happiness detection performance, F(2, 33) = 2.41, MSE = 21594.90, P = 0.105.
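For illustration, a bootstrapped hierarchical regression of this general form can be sketched as follows; this is not the authors' code, and the data and variable names are stand-in assumptions (odor type coded 0 = pleasant, 1 = unpleasant, as in the text, with 1000 bootstrap resamples, as in Table 2).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Illustrative stand-in data: 18 participants per odor group.
df = pd.DataFrame({
    "odor": np.repeat([0, 1], 18),        # 0 = pleasant, 1 = unpleasant
    "stai_change": rng.normal(0, 5, 36),  # post minus pre STAI-S
    "hsi": rng.normal(80, 140, 36),       # happiness superiority index (ms)
})

def coefs(data, predictors):
    """Fit OLS of HSI on the given predictors; return unstandardized b-values."""
    X = sm.add_constant(data[predictors])
    return sm.OLS(data["hsi"], X).fit().params

# Percentile bootstrap of the block 2 coefficients (block 1 would pass
# ["odor"] alone; block 2 adds the change-in-anxiety term).
boot = pd.DataFrame(
    [coefs(df.sample(frac=1, replace=True), ["odor", "stai_change"])
     for _ in range(1000)]
)
print(boot.quantile([0.025, 0.975]))  # 95% CIs for const, odor, stai_change
```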
Inspecting the bootstrapped unstandardized b-values, odor type in block 1 approached significance (P = 0.058) but gained full predictive value when entered alongside changes in state anxiety in block 2 (P = 0.027), whilst changes in state anxiety failed to gain any significant value in predicting happiness detection (P = 0.257). The configuration of these effects is consistent with classical suppression in regression analyses (e.g., Horst 1941; McFatter 1979; Paulhus et al. 2004; MacKinnon et al. 2007) and demonstrates that knowing how a participant responds emotionally to the odorant during the initial stages of exposure significantly improves the prediction of emotional facial cue detection. In the case of the current study, reductions in anxiety created by exposure to pleasant odors facilitated the detection of happy face targets.

Table 2. Bootstrapped hierarchical mediation models of the effect of odor type on the happiness superiority effect (block 1) as mediated through changes in self-reported state anxiety (block 2) for the first quarter and last quarter of the visual search trials.

               First quarter         Last quarter
               B          SE B       B          SE B
Block 1
  Constant     164.31     39.22      65.15      37.88
  Odor type    −93.64     47.87      122.34*    53.08
Block 2
  Constant     183.39     41.15      75.77      44.20
  Odor type    −127.40*   51.10      103.55     67.96
  STAI-CHANGE  3.77       3.59       2.10       3.61

Note: Estimates are unstandardized; odor type is coded 0 = pleasant, 1 = unpleasant; 1000 bootstrap samples; *P < 0.05.

In a second hierarchical regression conducted on performance towards the end of the experiment, type of odor (block 1) significantly predicted happiness detection, accounting for 13.8% of the variance in performance, F(1, 34) = 5.44, MSE = 24754.00, P = 0.026, but adding change in anxiety scores (block 2) increased the variance accounted for by only a non-significant 0.8%, F(2, 33) = 2.82, MSE = 25267.18, P = 0.074. Inspecting the bootstrapped unstandardized b-values, odor type in block 1 significantly predicted the detection of happiness, such that the presentation of the unpleasant odor improved the magnitude of the HSI (P = 0.028). However, when changes in state anxiety were controlled for (block 2), neither odor type (P = 0.140) nor anxiety change (P = 0.531) was able to significantly predict HSI performance. In this instance, adding changes in anxiety to the model resulted in a redundancy situation (Paulhus et al. 2004), with the mediator accounting for less than 1% of performance. Thus, towards the end of the experiment, the detection of happy faces was predicted by the odorant itself, independently of any emotional changes occurring within the participant as a result of the odorant prime. In this instance, the role of the odorant prime was reversed, such that participants exposed to the unpleasant odor showed improved detection of happy faces relative to participants exposed to the pleasant odors.

Discussion

In the present study, we investigated the influence of olfactory environmental context on the perception of facial expressions of emotion conveying happiness and anger.
Whilst some of the earlier work in this area appears to suggest category-specific facilitative effects of pleasant odorant primes on the processing of happy facial expressions (e.g., Leppänen and Hietanen 2003), very little is known about the generalizability of these findings to more complex visual processing tasks, or about the possible underlying mechanisms that may support the cross-modal integration of affective cues. To address these issues, we used the face-in-the-crowd effect (FICE) task to examine whether the concurrent presentation of pleasant and unpleasant odorant cues affects the spatial distribution of attentional resources towards happy face targets, and we also compared self-reported measures of anxiety to evaluate the extent to which these odors might alter the emotional state of the participant. In the control condition, participants were significantly faster and less error prone in detecting a discrepant happy face target in a crowd of competing distracter faces, a finding consistent with earlier work using this particular version of the FICE task (Damjanovic et al. 2010). However, whilst this overall search advantage for happy faces was also observed in participants exposed to the unpleasant odor, it was abolished for participants exposed to the pleasant odors. At first glance, these patterns may appear difficult to reconcile with the category-specific facilitative effects reported in Leppänen and Hietanen's emotion categorization study. However, further analyses, taking on board the recommendations made in Moessnang et al.'s (2011) work, reveal 2 important characteristics of the time course of odor effects on the perception of happy facial expressions: (i) pleasant odors facilitate the detection of happiness, but the benefits are short-lived, and (ii) unpleasant odors help the detection of happy faces, but only towards the end of the visual search task. The reversal of such phasic optimization effects across the 2 odorants can be accounted for in terms of the emotional state of the participant (Leppänen and Hietanen 2003). The significant differences between the pleasantness ratings for the odors were maintained at the start of the experiment as well as towards the end, yet the odors were matched for overall arousal, thus ruling out accounts of the differences between the 2 experimental contexts based on arousal. Furthermore, the odors produced differential effects on participants' levels of self-reported anxiety as measured by the STAI, such that individuals exposed to the pleasant odor showed a reduction in their overall anxiety, whilst participants exposed to the unpleasant odor demonstrated an increase. Previous considerations of the participant's emotional state in facial expression processing have either been made indirectly, in the form of potential mood congruency effects as per Leppänen and Hietanen's (2003) work, or directly, by collecting ratings from participants about the extent of their current experiences of happiness and sadness (e.g., Niedenthal et al. 2000). For example, mood induction techniques leading to higher levels of self-reported happiness or sadness resulted in participants perceiving mood-congruent facial expressions for longer than control participants. Furthermore, work identifying the beneficial effects of certain odorants has predominantly focused on their anxiety-reducing capacities, rather than on specifically identifying whether they improve an individual's own happiness (e.g., Moss et al.
2008; Zhou and Chen 2009; Moss et al. 2010; Moss and Oliver 2012). In line with this work, emotional change in this study was operationalized as change in self-reported state anxiety using the STAI, and whilst a reduction in anxiety would be viewed as a positive effect of the odorant, it is not possible to establish from the present data the extent to which pleasant odorants directly improved the participant's own level of happiness, nor any subsequent role this may have played in detecting happiness in the FICE task. Taking the anxiety-modulating capacities of the odorants into account, at the start of the visual search task the priming capacities of the odorant take effect, such that participants performing the task with the concurrent presentation of a pleasant odorant, where overall levels of anxiety are reduced, showed enhanced detection of happy face targets relative to participants exposed to the unpleasant odorant. This particular pattern extends Leppänen and Hietanen's work by demonstrating how positive changes within the participant, indexed in the current study as reductions in self-reported anxiety, can facilitate the perception of happy facial expressions during more complex attentional tasks. However, the benefits of this reduction in anxiety for the perception of happiness are short-lived; towards the end of the visual search task, the anxiety-modulating capacities of the odors become redundant in predicting the detection of happy targets. Surprisingly, at this later stage, unpleasant odors rather than pleasant ones facilitate the detection of happy faces. We argue that the unpleasant odor in the latter stage of the search process may serve to create a "pop-out" environmental context for the participants, directing their attention to environmentally incongruent emotional information (i.e., happy faces) as they engage in avoidance-based strategies in response to inhaling the unpleasant odor (e.g., Fannes et al. 2008; Boesveldt et al. 2010). Thus, the increase in the happiness superiority effect towards the end of the experiment, and its overall preservation in the unpleasant condition, may result from successful negative affect repair processes offsetting this increase in anxiety and any threat-related cognitive biases associated with it (Byrne and Eysenck 1995). The fact that the unpleasant odor plays a more salient role towards the end of the experiment may reflect differences in habituation patterns between positive and negative odorant cues, with unpleasant odors taking considerably longer to habituate, especially in experimental designs in which participants are explicitly primed by rating an odorant for its level of unpleasantness (Dalton 1996; see also Seubert et al. 2014). Current explanations of the happiness superiority effect focus on the role of low-level perceptual advantages afforded by the 'toothy grin' in happy facial expressions (e.g., Calvo and Nummenmaa 2008; Calvo and Marrero 2009; Horstmann et al. 2012). In the current study, the facial expressions used to measure the happiness superiority effect were taken from a database whose happy face exemplars have visually salient smiles, whilst the angry face exemplars lack an equivalently salient 'toothy snarl' feature (Lipp et al. 2009; Horstmann et al. 2012).
As such, angry faces would have shared a greater degree of perceptual overlap with the neutral faces, which also featured a closed mouth, materializing in increased error rates and response times for angry target-neutral distracter crowds across the different conditions. The perceptual disadvantage of angry face targets relative to happy targets was reduced when the surrounding crowd consisted of happy distracters; in these conditions, search performance was comparable to happy target-angry crowd search conditions, thus resulting in a happiness superiority effect that was found only in the neutral distracter context. In some instances, such interactions between targets and distracters point towards the involvement of an attentional disengagement mechanism, whereby response times to detect happy targets are delayed more when the targets are surrounded by angry than by neutral distracter faces (e.g., Fox et al. 2000; Fox et al. 2001; Fox et al. 2002; Damjanovic et al. 2014). However, the inferential analyses do not indicate that such a mechanism was involved in the current visual search task, as participants' response times for happy target detection did not differ significantly between the angry and neutral distracter contexts. Whilst the stability of the happiness superiority effect in the control condition would have been compatible with such perceptually based accounts of emotion detection, the time course analysis in the odor groups indicates that affective factors play a much more important role (e.g., Becker et al. 2011). Our early socialization experiences (Bond and Siddle 1996; Kotsoni et al. 2001) may help to reinforce an expectancy bias towards positive stimuli (Cacioppo and Gardner 1999; Cacioppo et al. 1999; Chipchase and Chapman 2007), yet the phasic characteristics of participants' visual search times for happy facial expression targets reveal how easily this bias can be reset. We propose that the emotional state of the participant plays an important role in the perception of facial expressions of happiness (e.g., Niedenthal et al. 2000; Leppänen and Hietanen 2003; Becker et al. 2011), supporting the cross-modal interaction of affective cues in a time-dependent manner (Walla 2008). Phasic analyses such as those performed in this study not only serve to highlight the complexity of such cross-modal interactions, but may also help pave the way for a better understanding of how affective versus perceptual accounts of emotion detection can be disentangled. Furthermore, such phasic analyses prove particularly helpful in reconciling differences between studies that would otherwise be masked if the analysis and interpretation of olfactory-visual processing focused exclusively on overall reaction time performance (Moessnang et al. 2011). Future efforts to validate affective accounts of the happiness superiority effect might attempt to increase the arousal value of unpleasant odorants, for example by using human chemosignals (e.g., Lundström et al. 2008; Zhou and Chen 2009), to establish whether these can open up the prioritization of threat-specific cues. Thus, when environmental contexts differ not only in their pleasantness value but also in their heightened arousal levels, participants may revert to searching for angry facial expressions instead (Becker et al. 2011; Chipchase and Chapman 2013; Lundqvist et al. 2013).
Such systematic manipulations of an odor's pleasantness and arousal values would also carry important implications for our understanding of anxiety-based models of attention (e.g., Eysenck 1992; Eysenck et al. 2007), and for how these may differ functionally from a general expectancy bias favoring positive information (e.g., Diener and Diener 1996; Fox 2013). Indeed, it is worth noting that the mean STAI (S) values for all 3 groups fell within the low-anxiety range reported in visual search studies of this kind. For instance, some studies using a split-groups design pre-select individuals scoring high in trait anxiety (>48) and have identified this anxiety component as playing a more important role than high levels of state anxiety in facilitating the detection of angry facial expressions (e.g., Byrne and Eysenck 1995). Other studies have pre-screened participants on the basis of their state anxiety scores, allocating scores above 40 on the STAI to the high-anxious group and scores below 35 to the low-anxious group, and have found this component of anxiety to play a stronger role in disrupting threat disengagement attentional processes (e.g., Fox et al. 2001). Thus, future studies should systematically consider both intrinsic and extrinsic facets of anxiety and their emotional weighting in the attentional capture of angry and happy facial expressions. Our work indicates that the anxiety-modulating capacities of pleasant and unpleasant odors may serve as a useful tool towards achieving this aim in sub-clinical populations (Krippl 2003). Beyond the differences in attentional processing demands between the current study and the work by Leppänen and Hietanen, other aspects of our methodology may explain why the facilitative effects of the pleasant odor condition on the detection of happiness did not materialize in the initial analysis of overall reaction time performance. For example, the correspondence between the pleasantness value of the olfactory cues and the facial stimulus may have been more strongly primed in Leppänen and Hietanen's study than in the current one: our participants were only required to make "same" versus "different" emotion judgments on each experimental trial, whereas Leppänen and Hietanen's instructions required participants to categorize each trial using emotion-labelled response keys. Indeed, providing participants with emotion labels during a categorization task can also influence the percentage of intrusion errors, that is, the false categorization of emotional expressions that were not presented in the target face, such as perceiving sadness in an ambiguous neutral-disgust face (Leleu et al. 2015). Thus, the presence of verbal cues may have primed the cognitive system to search for a restricted set of emotion categories, amplifying the perception of salient congruent expressions such as happiness whilst blurring the boundaries between less salient expressions such as disgust (Leleu et al. 2015). Furthermore, unlike Leppänen and Hietanen's categorization task, the participants in the current study performed over 200 visual search trials without a break. Whilst this could in principle have resulted in severe fatigue effects, no phasic effects were observed in the control condition, indicating that fatigue was not a particular cause for concern in the current study.
Nevertheless, the inclusion of a break in Leppänen and Hietanen's study may have helped to increase the saliency of the pleasant odorant prime and its association with the happy facial expression in the emotion categorization task, thus producing the overall category-specific facilitative effects found. In interpreting these results, some limitations must be considered. First, rather than relying exclusively on self-report measures to determine functional olfactory sense, such measures should be combined with a screening test such as the Sniffin' Sticks battery (Hummel et al. 2001) to ensure that all participants could perceive the applied odors at normative levels. Furthermore, tighter control of adaptation mechanisms would help to provide a more informative picture of odorant-specific reductions in sensitivity and perceived intensity during odor exposure over the course of the experiment and their impact on cognitive performance (Ekman et al. 1967; Dalton and Wysocki 1996). The addition of post-experiment interviews to assess each participant's awareness of the odors in the testing environment would also provide fruitful insight into the perceived intensity of each odorant in future research designs. For example, adding questions about the explicit ability to perceive the odor in the room would help determine whether the contextual effects of odors on facial expression detection require explicit awareness or can occur implicitly. Second, alternative self-report measures such as the Positive and Negative Affect Schedule (PANAS; Watson et al. 1988) would provide a more detailed account of the types of emotions experienced by participants in response to pleasant and unpleasant odor contexts, above and beyond their anxiety-modulating capacities, and a more comprehensive way to assess the category-specific basis of odor-emotion perception interactions. Notwithstanding these limitations, our study shows that affective factors, in the form of changes in the emotional state of the participant, play a more significant role in facilitating the detection of target facial expressions of emotion than the perceptual salience of the face's features (Juth et al. 2005; Calvo and Nummenmaa 2008; Calvo and Marrero 2009; Becker et al. 2011). Indeed, contextual factors are part of the multisensory nature of our emotional interaction with others, and the dynamic nature of the participant's emotional state needs to play a more active role in future research on attentional modulation, rather than such investigations being limited to an individual differences framework (Frischen et al. 2008). Along with music induction experiments (e.g., Garon et al. 2012; Rowe et al. 2007), our study shows how odors may provide another useful tool for researchers to examine the role of affective factors in visual search tasks with emotionally salient stimuli.

Funding

This work was supported by the Departmental Seed Corn Fund and the Research Development Fund awarded to the first and second authors from the Research and Knowledge Transfer Office at the University of Chester. The research of L Damjanovic has been partly supported by the British Academy, funded under the Skills Acquisition Award scheme SQ130002.

Acknowledgements

We are grateful to an anonymous reviewer for recommending the mediation analysis during an earlier review of this manuscript.
The authors wish to thank Panos Athanasopoulos for helpful discussions on earlier versions of the manuscript and Sam Roberts for advice on mediation analyses. We wish to acknowledge the assistance of Bryan Hiller and Rhian McHugh in facilitating data collection.

References

Adolphs R. 2002. Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav Cogn Neurosci Rev. 1:21–62.
Becker DV, Anderson US, Mortensen CR, Neufeld SL, Neel R. 2011. The face in the crowd effect unconfounded: happy faces, not angry faces, are more efficiently detected in single- and multiple-target visual search tasks. J Exp Psychol Gen. 140:637–659.
Becker DV, Neel R, Srinivasan N, Neufeld S, Kumar D, Fouse S. 2012. The vividness of happiness in dynamic facial displays of emotion. PLoS One. 7:e26551.
Bensafi M, Rouby C, Farget V, Bertrand B, Vigouroux M, Holley A. 2002. Autonomic nervous system responses to odours: the role of pleasantness and arousal. Chem Senses. 27:703–709.
Black SL. 2001. Does smelling granny relieve depressive mood? Commentary on 'Rapid mood change and human odors'. Biol Psychol. 5:215–218.
Boesveldt S, Frasnelli J, Gordon AR, Lundström JN. 2010. The fish is bad: negative food odors elicit faster and more accurate reactions than other odors. Biol Psychol. 84:313–317.
Bond NW, Siddle DAT. 1996. The preparedness account of social phobia: some data and alternative explanations. In: Rapee RM, editor. Current controversies in the anxiety disorders. London: Guilford Press. p. 291–316.
Botvinick M, Nystrom LE, Fissell K, Carter CS, Cohen JD. 1999. Conflict monitoring versus selection-for-action in anterior cingulate cortex. Nature. 402:179–181.
Byrne A, Eysenck MW. 1995. Trait anxiety, anxious mood, and threat detection. Cognition Emotion. 9:549–562.
Cacioppo JT, Gardner WL. 1999. Emotion. Annu Rev Psychol. 50:191–214.
Cacioppo JT, Gardner WL, Berntson GG. 1999. The affect system has parallel and integrative processing components: form follows function. J Pers Soc Psychol. 76:839–855.
Calvo MG, Marrero H. 2009. Visual search of emotional faces: the role of affective content and featural distinctiveness. Cognition Emotion. 22:1–25.
Calvo MG, Nummenmaa L. 2008. Detection of emotional faces: salient physical features guide effective visual search. J Exp Psychol Gen. 137:471–494.
Chipchase S, Chapman P. 2007. The mere exposure effect with emotionally valenced stimuli: analytic and non-analytic processing. P Exptl Psych Soc. 60:1697–1715.
Chipchase S, Chapman P. 2013. Trade-offs in visual attention and the enhancement of memory specificity for positive and negative emotional stimuli. Q J Exp Psychol. 66:77–298.
Chu S, Downes JJ. 2000. Odour-evoked autobiographical memories: psychological investigations of proustian phenomena. Chem Senses. 25:111–116.
Dalton P. 2000. Psychophysical and behavioral characteristics of olfactory adaptation. Chem Senses. 25:487–492.
Dalton P. 1996. Odor perception and beliefs about risk. Chem Senses. 21:447–458.
Dalton P, Wysocki CJ. 1996. The nature and duration of adaptation following long-term odor exposure. Percept Psychophys. 58:781–792.
Damjanovic L, Meyer M, Sepulveda F. 2017. Raising the alarm: individual differences in the perceptual awareness of masked facial expressions. Brain Cogn. 114:1–10.
Damjanovic L, Pinkham AE, Clarke P, Phillips J. 2014. Enhanced threat detection in experienced riot police officers: cognitive evidence from the face-in-the-crowd effect. Q J Exp Psychol (Hove). 67:1004–1018.
Damjanovic L, Roberson D, Athanasopoulos P, Kasai C, Dyson M. 2010. Searching for happiness across cultures. J Cogn Cult. 10:85–107.
Damjanovic L, Santiago J. 2016. Contrasting vertical and horizontal representations of affect in emotional visual search. Psychon Bull Rev. 23:62–73.
Diener E, Diener C. 1996. Most people are happy. Psychol Sci. 7:181–185.
Ekman G, Berglund B, Berglund U, Lindvall T. 1967. Perceived intensity of odor as a function of time of adaptation. Scand J Psychol. 8:177–186.
Ekman P, Friesen WV. 1976. Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Ekman P, Friesen WV, Hager JC. 2002. The facial action coding system. 2nd ed. Salt Lake City, UT: Research Nexus eBook.
Eysenck MW. 1992. Anxiety: the cognitive perspective. Hillsdale, NJ: Lawrence Erlbaum Associates.
Eysenck MW, Derakshan N, Santos R, Calvo MG. 2007. Anxiety and cognitive performance: attentional control theory. Emotion. 7:336–353.
Fannes S, Van Diest I, Meulders A, De Peuter S, Vansteenwegen D, Van den Bergh O. 2008. To inhale or not to inhale: conditioned avoidance in breathing behavior in an odor–20% CO2 paradigm. Biol Psychol. 78:87–92.
Fox E. 2013. Rainy brain, sunny brain: the new science of optimism and pessimism. London: Arrow Books.
Fox E, Lester V, Russo R, Bowles RJ, Pichler A, Dutton K. 2000. Facial expressions of emotion: are angry faces detected more efficiently? Cogn Emot. 14:61–92.
Fox E, Damjanovic L. 2006. The eyes are sufficient to produce a threat superiority effect. Emotion. 6:534–539.
Fox E, Russo R, Bowles R, Dutton K. 2001. Do threatening stimuli draw or hold visual attention in subclinical anxiety? J Exp Psychol Gen. 130:681–700.
Fox E, Russo R, Dutton K. 2002. Attentional bias for threat: evidence for delayed disengagement from emotional faces. Cogn Emot. 16:355–379.
Frischen A, Eastwood JD, Smilek D. 2008. Visual search for faces with emotional expressions. Psychol Bull. 134:662–676.
Garon M, Sirois S, Blanchette I. 2012. L'humeur induite par la musique influence le traitement visuel de la menace [Music-induced mood influences the visual processing of threat]. Congrès annuel de la Société Québécoise pour la Recherche en Psychologie. Sherbrooke, QC.
Hager JC, Ekman P. 1979. Long-distance transmission of facial affect signals. Ethol Sociobiol. 1:77–82.
Hansen CH, Hansen RD. 1988. Finding the face in the crowd: an anger superiority effect. J Pers Soc Psychol. 54:917–924.
Horst P. 1941. The role of predictor variables which are independent of the criterion. Soc Sci Res Bull. 48:431–436.
Horstmann G, Lipp OV, Becker SI. 2012. Of toothy grins and angry snarls–open mouth displays contribute to efficiency gains in search for emotional faces. J Vis. 12:7.
Hummel T, Konnerth CG, Rosenheim K, Kobal G. 2001. Screening of olfactory function with a four-minute odor identification test: reliability, normative data, and investigations in patients with olfactory loss. Ann Otol Rhinol Laryngol. 110:976–981.
Juth P, Lundqvist D, Karlsson A, Öhman A. 2005. Looking for foes and friends: perceptual and emotional factors when finding a face in the crowd. Emotion. 5:379–395.
Kotsoni E, de Haan M, Johnson MH. 2001. Categorical perception of facial expressions by 7-month-old infants. Perception. 30:1115–1125.
Krippl M. 2003. Induction of emotions and motivations by odours. Chem Senses. 28:71–77.
Leleu A, Demily C, Franck N, Durand K, Schaal B, Baudouin JY. 2015. The odor context facilitates the perception of low-intensity facial expressions of emotion. PLoS One. 10:e0138656.
Leppänen JM, Hietanen JK. 2003. Affect and face perception: odors modulate the recognition advantage of happy faces. Emotion. 3:315–326.
Li W, Moallem I, Paller KA, Gottfried JA. 2007. Subliminal smells can guide social preferences. Psychol Sci. 18:1044–1049.
Lipp OV, Price SM, Tellegen CL. 2009. No effect of inversion on attentional and affective processing of facial expressions. Emotion. 9:248–259.
Lundqvist D, Juth P, Öhman A. 2013. Using facial emotional stimuli in visual search experiments: the arousal factor explains contradictory results. Cogn Emot. 28:1012–1029.
Lundström JN, Boyle JA, Zatorre RJ, Jones-Gotman M. 2008. Functional neuronal processing of body odors differs from that of similar common odors. Cereb Cortex. 18:1466–1474.
MacKinnon DP, Fairchild AJ, Fritz MS. 2007. Mediation analysis. Annu Rev Psychol. 58:593–614.
Maddock RJ. 1999. The retrosplenial cortex and emotion: new insights from functional neuroimaging of the human brain. Trends Neurosci. 22:310–316.
Matsumoto D. 2002. Methodological requirements to test a possible in-group advantage in judging emotions across cultures: comment on Elfenbein and Ambady (2002) and evidence. Psychol Bull. 128:236–242.
Matsumoto D, Ekman P. 1988. Japanese and Caucasian facial expressions of emotion (JACFEE) and (JACNeuf). San Francisco, CA: San Francisco State University.
McFatter RM. 1979. The use of structural equation models in interpreting regression equations including suppressor and enhancer variables. Appl Psych Meas. 3:123–135.
Moessnang C, Finkelmeyer A, Vossen A, Schneider F, Habel U. 2011. Assessing implicit odor localization in humans using a cross-modal spatial cueing paradigm. PLoS One. 6:e29614.
Moss M, Hewitt S, Moss L, Wesnes K. 2008. Modulation of cognitive performance and mood by aromas of peppermint and ylang-ylang. Int J Neurosci. 118:59–77.
Moss M, Oliver L. 2012. Plasma 1,8-cineole correlates with cognitive performance following exposure to rosemary essential oil aroma. Ther Adv Psychopharmacol. 2:103–113.
Moss L, Rouse M, Wesnes KA, Moss M. 2010. Differential effects of the aromas of Salvia species on memory and mood. Hum Psychopharmacol. 25:388–396.
Niedenthal PM, Halberstadt JB, Margolin J, Innes-Ker ÅH. 2000. Emotional state and the detection of change in facial expression of emotion. Eur J Soc Psychol. 30:211–222.
Öhman A, Lundqvist D, Esteves F. 2001. The face in the crowd revisited: a threat advantage with schematic stimuli. J Pers Soc Psychol. 80:381–396.
Öhman A, Mineka S. 2001. Fears, phobias, and preparedness: toward an evolved module of fear and fear learning. Psychol Rev. 108:483–522.
Paulhus DL, Robins RW, Trzesniewski KH, Tracy JL. 2004. Two replicable suppressor situations in personality research. Multivariate Behav Res. 39:303–328.
Phillips ML, Heining M. 2002. Neural correlates of emotion perception: from faces to taste. In: Rouby C, Schaal B, Dubois D, Gervais R, Holley A, editors. Olfaction, taste, and cognition. Cambridge, UK: Cambridge University Press. p. 196–208.
Pinkham AE, Griffin M, Baron R, Sasson NJ, Gur RC. 2010. The face in the crowd effect: anger superiority when using real faces and multiple identities. Emotion. 10:141–146.
Preacher KJ, Hayes AF. 2008. Asymptotic and resampling strategies for assessing and comparing indirect effects in multiple mediator models. Behav Res Methods. 40:879–891.
Rowe G, Hirsh JB, Anderson AK. 2007. Positive affect increases the breadth of attentional selection. Proc Natl Acad Sci USA. 105:383–388.
Savage RA, Lipp OV, Craig BM, Becker SI, Horstmann G. 2013. In search of the emotional face: anger versus happiness superiority in visual search. Emotion. 13:758–768.
Seubert J, Gregory KM, Chamberland J, Dessirier JM, Lundström JN. 2014. Odor valence linearly modulates attractiveness, but not age assessment, of invariant facial features in a memory-based rating task. PLoS One. 9:e98347.
Seubert J, Kellermann T, Loughead J, Boers F, Brensinger C, Schneider F, Habel U. 2010a. Processing of disgusted faces is facilitated by odor primes: a functional MRI study. Neuroimage. 53:746–756.
Seubert J, Loughead J, Kellermann T, Boers F, Brensinger CM, Habel U. 2010b. Multisensory integration of emotionally valenced olfactory-visual information in patients with schizophrenia and healthy controls. J Psychiatry Neurosci. 35:185–194.
Shrout PE, Bolger N. 2002. Mediation in experimental and nonexperimental studies: new procedures and recommendations. Psychol Methods. 7:422–445.
Sookoian S, Burgueño A, Gianotti TF, Marillet G, Pirola CJ. 2011. Odor perception between heterosexual partners: its association with depression, anxiety, and genetic variation in odorant receptor OR7D4. Biol Psychol. 86:153–157.
Spielberger CD, Gorsuch RL, Lushene R, Vagg P, Jacobs G. 1983. Manual for the State-Trait Anxiety Inventory. Palo Alto, CA: Consulting Psychologists Press.
Treisman AM, Gelade G. 1980. A feature-integration theory of attention. Cogn Psychol. 12:97–136.
Walla P. 2008. Olfaction and its dynamic influence on word and face processing: cross-modal integration. Prog Neurobiol. 84:192–209.
Watson D, Clark LA, Tellegen A. 1988. Development and validation of brief measures of positive and negative affect: the PANAS scales. J Pers Soc Psychol. 54:1063–1070.
Whalen PJ. 1998. Fear, vigilance, and ambiguity: initial neuroimaging studies of the human amygdala. Curr Dir Psychol Sci. 7:177–188.
Zhou W, Chen D. 2009. Fear-related chemosignals modulate recognition of fear in ambiguous facial expressions. Psychol Sci. 20:177–183.

© The Author(s) 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
