Abstract

In a double-blind experiment, participants were exposed to facial images of anger, disgust, fear, and neutral expressions under 2 body odor conditions: fear and neutral sweat. They had to indicate the valence of the gradually emerging facial image. Two alternative hypotheses were tested, namely a "general negative evaluative state" hypothesis and a "discrete emotion" hypothesis. These hypotheses predict 2 distinctive data patterns for muscle activation and classification speed of facial expressions. A pattern of results supporting a "discrete emotions perspective" would reveal significantly increased activity in the medial frontalis (eyebrow raiser) and corrugator supercilii (frown) muscles associated with fear, and significantly decreased reaction times (RTs) to fear faces only in the fear odor condition. Conversely, a pattern characterized by only a significantly increased corrugator supercilii activity together with decreased RTs for fear, disgust, and anger faces alike in the fear odor condition would support an interpretation in line with a general negative evaluative state perspective. The data support the discrete emotion account for facial affect perception primed with fear odor. This study provides a first demonstration of the perception of discrete negative facial expressions using olfactory priming.

Keywords: emotion categorization, face processing, fear odor, olfactory priming

Introduction

We use multiple senses to navigate the social world. Among these, our sense of smell has an important social communicative function (Semin and de Groot 2013), given that it can pick up a rich range of information contained in human odors produced by our apocrine, and to some extent, eccrine sweat glands and areola (Doucet et al. 2012). Human body odors (chemosignals) transmit information about gender (Penn et al. 2007), age (Mitro et al. 2012), health status (Olsson et al. 2014), and transient emotional states (e.g., de Groot et al.
2012), among others. Despite the remarkable sensitivity of human olfaction and the wealth of information transmitted by human chemosignals, the field remains understudied. As Pause (2012) pointed out, human body odors (chemosignals) constitute a special class of stimuli: they can be effective in darkness and across diverse barriers, and can potentially transmit information over longer distances as a function of their volatility, with low-volatile molecules remaining at the position of their release over longer periods of time and retaining information about events in the past (Gaby and Zayas 2017). Recent research has highlighted the significance of human chemosignals in conveying emotions to recipients (e.g., de Groot et al. 2012).

It is well known that emotional facial expressions play a crucial role in human communication. Although auditory-visual integration of emotional stimuli has been studied widely, emotional chemosensory modulation of the perception of facial emotions is not even acknowledged, as the most recent review of emotion perception shows (Schirmer and Adolphs 2017). Effective interpersonal communication requires, among other things, the identification of others' facial expressions. However, perception of facial expressions is based not only on visual cues but also on contextual information from different channels (Stein and Stanford 2008). One important subliminal source of contextual information is common odors (not human body odors), which may influence the way we perceive emotions in faces (Leppanen and Hietanen 2003; Seubert et al. 2010; Zernecke et al. 2011; Leleu et al. 2015a, 2015b). In a study with human body odors, participants were primed with a happy face and were asked to evaluate neutral faces in the context of anxiety versus non-anxiety chemosignals (Pause et al. 2004).
The results showed that the influence of priming with happy faces declined in the context of anxiety chemosignals in the case of females. Another study demonstrated that exposure to underarm odor from fearful individuals led to ambiguous faces being evaluated as fearful rather than happy (Zhou and Chen 2009). Thus, fear odor influenced receivers to perceive ambiguous faces as fearful. However, the nature of individuals' affective processing when influenced by body odors produced under emotional conditions has not been investigated.

This research aims to shed light on which of 2 current models best accounts for emotional processing, namely: 1) a discrete emotion approach or 2) a dimensional emotion approach (e.g., Barrett 1998). The outcome of this research will also shed light on the chemical properties of the body odor. If the former approach is supported, then fear chemosignals would appear to have a distinctive fingerprint. If, however, the second model is supported, then fear chemosignals would appear to induce a general state rather than to have chemical properties distinctive to fear.

The first model suggests that individuals experience emotions such as fear as distinct states with discrete boundaries. Closely related to the evolutionary view, each discrete emotion is regarded as the result of a unique individual–environment interaction that evolved with its own adaptive significance to deal with related life problems such as competition (anger), danger (fear), or cooperation (happiness) (Ekman 1992; Izard 1992; Darwin 1872/1998). Evidence for the discrete emotion approach comes from emotional communication research suggesting that facial expressions are universally expressed and recognized (Ekman 1992, 1994). Each emotion is also thought to have a distinct physiological pattern created on the basis of environmental demands.
Evidence for physiological differences between discrete emotions derives from specific brain substrates associated with discrete emotions (e.g., Phan et al. 2002; Murphy et al. 2003) as well as autonomic discriminability of some discrete emotions (Levenson et al. 1990; Levenson 1992; Christie and Friedman 2004).

The second, dimensional, approach focuses on subjective feeling states and a small number of underlying dimensions that may account for the differences between emotional states. The prominent model is a 2D model that proposes activation and valence as the dimensions (e.g., Russell 1980; Larsen and Diener 1992). Whereas the activation dimension ranges from sleep to excitement and refers to a subjective sense of energy (e.g., Russell and Barrett 1999; Barrett 2006a, 2006b), the valence dimension refers to general feelings of valence (an undifferentiated evaluative process) ranging between 2 poles, namely negative to positive or unpleasant to pleasant (Barrett 2006a). In this approach, functionally, negative and positive emotions are assumed to transmit information about whether an engaged behavior is going well or poorly (e.g., Carver 2001). We aim to contrast these 2 models, namely the one predicting a "discrete" emotional state and the other predicting a general valence-based "evaluation" process, in a novel emotional face processing paradigm by inducing a simulacrum of an emotional state with olfactory stimuli.

The present study

This study used olfactory priming to determine whether emotional face perception is driven by discrete emotional states (fear) or by general evaluation (negative), using a novel facial recognition task. By relying on chemosignals as a prime rather than any linguistically composed instruction or manipulation (e.g., categorization; Barrett 2006b), we avoided the potential interference of language-driven processes.
The experiment thus constitutes a unique paradigm that bypasses a significant handicap introduced by procedures that involve language. It was designed to test between the 2 alternative hypotheses by using sweat samples collected from donors subjected to fear and neutral induction conditions. In the recipient phase of the experiment, participants were exposed to facial images of anger, disgust, fear, and a neutral face while they were exposed to fear or neutral sweat collected from the donors. Each facial stimulus was presented such that it changed gradually from a blurred image to a clear one, and participants' task was to indicate whether the facial image was a neutral one or a negative one, as soon as they could identify it. The first prediction was that, if fear odor induces a "general negative evaluative state," then all negative facial images (anger, disgust, and fear) should be processed equally fast in the fear odor condition. Moreover, the negative facial images were hypothesized to be processed faster in the fear odor condition than in the neutral body odor condition. The alternative prediction was that, if the fear odor induces a "discrete fear state," then only the facial images of fear should be processed faster in the fear odor condition. Furthermore, according to the discrete fear state approach, one would expect the fear odor to induce a distinctive activation pattern of facial muscles (i.e., medial frontalis) that is specific to the emotion of fear (see Susskind et al. 2008; de Groot et al. 2012). Facial expressions are known to facilitate emotion recognition and to promote a discrete and adaptive physical form of sensory gating (Farb et al. 2013). Specifically, lifting the eyebrows (i.e., medial frontalis activity) in fear states increases visual field size and serves the adaptive scanning of the environment for threat (Susskind et al. 2008).
Conversely, according to the general negative evaluative state approach, fear odor would activate a general negative affect represented by only a frown (i.e., corrugator supercilii activity) (e.g., Hess and Fischer 2013). These considerations led to the second objective of the reported study: do participants display the distinctive facial expression of fear when exposed to fear odor, as the discrete fear state approach (medial frontalis, plus the usually coactivated corrugator supercilii) would predict? In multiple studies, fear odor not only induced a medial frontalis response but also coactivated the corrugator supercilii muscle (de Groot et al. 2014a, 2014b, 2015b).

To summarize, the 2 hypotheses, the "discrete emotion" versus the general negative evaluative state approach, suggest 2 distinctive data patterns for muscle activation and classification speed of facial expressions. A pattern of results supporting a discrete emotions perspective would reveal significantly increased activity in the medial frontalis and corrugator supercilii muscles together with significantly decreased reaction times (RTs) to fear faces exclusively in the fear odor condition. Conversely, a pattern characterized by only a significantly increased corrugator supercilii activity, together with no difference in RTs among fear, disgust, and anger faces in the fear odor condition and a faster classification of the negative faces in the fear condition than in the neutral condition, would support an interpretation in line with a general negative evaluative state perspective. Any results in between these 2 patterns would be inconclusive. The resulting experiment had a within-participants 2 Odor Type (fear and neutral) × 4 Emotional Face Image (fear, anger, disgust, and neutral) design and was preregistered (https://osf.io/b932h/).
Materials and methods

Ethics statement

The study was approved by the Institutional Review Board of Utrecht University, Utrecht, the Netherlands. All participants provided written informed consent before participation and were free to stop at any point in time during the experiment.

Participants

Twenty-four right-handed Caucasian females (Mage = 22.21, SDage = 2.57, range = 19–28 years old) participated in the experiment as odor receivers. Sample size (N = 24) was predetermined by a power analysis (G*Power 3.1; Faul et al. 2007) for ANOVA, given f = 0.25, power = 0.8, α = 0.05. The effect size (f = 0.25) was converted from η² = 0.06, obtained from similar research on the fear-related olfactory modulation of visual perception (Zhou and Chen 2009; de Groot et al. 2015b; cf. a recent meta-analysis corroborating the chosen effect size, de Groot and Smeets 2017). As in previous research (e.g., Zhou and Chen 2009; de Groot et al. 2015b), only females were recruited because they are more sensitive to emotional signals (Brody and Hall 2000) and have a better sense of smell than men (Brand and Millot 2001). After filling out a detailed screening questionnaire, individuals were included if they were right-handed, heterosexual, nonsmoking, nonpregnant, and healthy, with no history of psychological or neurological disorder. They were excluded if they had a respiratory disease or any other condition (such as illness, cold, or allergy) that could affect olfactory functioning. To establish normal olfactory function (normosmia), participants had to identify 3 odors: cinnamon, fish odor, and banana (Lötsch et al. 2016). All participants were able to name at least 1 of the 3 items. Cinnamon, the best assessor of normosmia (Lötsch et al. 2016), was identified correctly by 18 participants, whereas the remaining 6 provided similar names (e.g., herb, plant). Ten participants were on hormone contraceptives.
The remainder were on average 12.09 days (standard deviation [SD] = 8.40) from day 1 of their menstrual cycle. Hormone contraceptives, which might have a potential effect on the odor perception of women (Kollndorfer et al. 2016), did not influence participants' ratings of the intensity (t(22) = 0.15, P = 0.882) or pleasantness (t(22) = 0.39, P = 0.700) of sweat pads collected in the fear condition, nor of the intensity (t(22) = −0.353, P = 0.728) or pleasantness (t(11.832) = 1.627, P = 0.130) of sweat pads collected under the neutral condition. Participants were paid €8 for their participation.

Materials and procedure

Selection of emotional facial pictures

Grayscale versions of 6 female and 6 male actors' facial images from the Radboud Faces Database (Langner et al. 2010) were selected as visual stimuli for the fear, disgust, anger, and neutral emotion conditions. To avoid a confound of emotional intensity of the facial images (i.e., level of negativity depicted in negative facial images), in a pilot study, each participant (N = 15) rated the negativity of the facial images on a 7-point scale ranging from "not at all negative" to "extremely negative" (see Supplementary material). Emotional facial images were presented in a randomized order. Three female and 3 male facial images with the highest negativity ratings for fear, disgust, and anger were selected, respectively (see Supplementary material available online).

Presentation of visual stimuli

The selected facial images were blurred with a Gaussian filter using MATLAB, Release 14 (MathWorks). A continuum of 30 images was generated for each actor's emotional facial image using a blurring technique with 2 SD increments from 0 SD to 58 SD. Each continuum was presented over a period of 5000 ms, so that participants viewed the facial image changing from completely noisy to clear (Figure 1). Visual stimulus presentation and data collection were controlled by a computer running PsychoPy (Peirce 2007).
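The 30-step continuum described above (blur SDs decreasing in 2-SD increments from 58 to 0) can be sketched with SciPy in place of the MATLAB routine the authors used; the function and array names below are illustrative, not the authors' code.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_continuum(image, n_steps=30, sd_step=2.0):
    """Return a list of progressively less blurred frames.

    Frame 0 is blurred with sigma = (n_steps - 1) * sd_step (here 58),
    and sigma decreases in 2-SD increments down to 0 (the clear image),
    mirroring the 0-58 SD continuum described in the text.
    """
    frames = []
    for i in range(n_steps):
        sigma = (n_steps - 1 - i) * sd_step  # 58, 56, ..., 2, 0
        frames.append(gaussian_filter(image.astype(float), sigma=sigma))
    return frames

# Toy 64x64 grayscale "face" as a stand-in for a stimulus image
img = np.random.default_rng(1).random((64, 64))
frames = blur_continuum(img)
```

At 5000 ms for 30 frames, each frame would be on screen for roughly 167 ms; the final frame (sigma = 0) is the unblurred original.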
Emotional facial images were centered on the screen. Colored keyboard buttons were placed in front of the participants to collect their behavioral responses. Response keys were "g" and "h" on the computer keyboard, colored with green and blue stickers.

Figure 1. Facial images were constructed as a continuum of 30 images that gradually became clearer over time. The stimuli were displayed for a duration of 5000 ms following a fixation point shown for 5000 ms.

Preparation of olfactory stimuli

Sweat pads from individuals induced to be in a fearful or neutral state served as olfactory stimuli. The sweat pads came from the 6 (of a total sample of 12) donors for whom the fear induction procedure was most effective, as evidenced by the highest fear ratings and the greatest sweat production (for details, see Supplementary material). The same procedure was followed for the selection of neutral sweat pads. Each "original" sweat pad (100 cm2) was cut into 8 pieces (12.5 cm2), and each receiver was presented with a polypropylene jar containing 4 pad pieces from 4 individuals to decrease individual variability in sweat production. The 4 pad parts were from the left (2 parts) and right (2 parts) armpits.

Experimental procedure

The experiment was conducted by a female experimenter alone, to avoid changes in female participants' mood in the presence of a male experimenter (Jacob et al. 2001). Each participant received a new vial, a 100 mL (60 × 71 mm) polypropylene jar (Greiner Bio-One), containing the olfactory stimuli, which were defrosted 30 min before odor exposure. Each vial was marked with a code conceived by a researcher who was not involved in the experiment.
The experiment was a double-blind study: neither participants nor experimenters knew which vial contained which odor. On arrival at the laboratory, participants were informed about the general experimental procedure, namely that they would be performing tasks on a computer and that some physiological measurements of facial activity would be recorded. Electrode placement for the facial electromyographic (EMG) measurements took place in a preparation room. The experimenter cleaned the skin on the left side of the participant's face (i.e., the side involved most strongly in spontaneous affective reactions in right-handed individuals; see Dimberg and Petterson 2000) with alcohol and abrasive lotion (LemonPrep; Mavidon) and applied electrodes over the medial frontalis and corrugator supercilii muscles, known to be associated with the experience of fear, to measure EMG activity (e.g., Ekman and Friesen 1978; van Boxtel 2010; Kret et al. 2013; Du et al. 2014). The electrode placement followed the guidelines provided by Fridlund and Cacioppo (1986). Each participant was led to a quiet individual cubicle and was asked to take a seat on an adjustable chair. Before starting, she was asked to place her head on a chin rest. The dual purpose of the chin rest was to stabilize participants' heads and to ensure that their noses were at a constant distance (2 cm) from the odor, which was achieved by placing the odor-containing vial into an adjustable clamp attached to the right arm of the chin rest. The procedure included 2 sessions, each consisting of practice and test trials; the sessions were identical except for the order of the vials containing the fear or neutral odor, which was counterbalanced. The experimenter first placed the appropriate vial, without opening the lid, in the chin rest. The first session started with the practice trials, during which the lid of the vial was still closed. Each test trial started with a fixation cross appearing in the middle of the screen for 5000 ms.
After the practice trials, participants wore a noseclip to prevent preliminary sniffs. The lid of the vial was opened and the noseclip was removed at the start of the test trials. That exact time point also served as a marker for the start of the test trials in the online registration of the EMG data. In each session, the emotion recognition task consisted of 12 practice trials and 72 test trials. During the test trials, 36 facial images expressing negative emotions (12 for each emotion category) and 36 neutral faces were presented. The chance of each facial image being negative versus neutral was therefore 50%, which minimized response habituation. The participants had 2 colored buttons on the keyboard in front of them. They were instructed to press the assigned buttons on the keyboard to indicate whether the facial image was negative or neutral as soon as they could clearly identify the image, and to do so as accurately and quickly as possible. The order of the button labels (negative or neutral), the starting order of the responding hand (left or right), and the order of the odor manipulation (fear or neutral) were counterbalanced between participants. After the first session was completed, the second vial was presented after a washout period of at least 5 min. The experimenter changed the vial and the above procedure was followed for the next odor exposure session. After the experiment, participants first completed a 10-item handedness scale examining which hand they use for a range of activities in everyday life (Van Strien 1992) to control for a possible effect of handedness on the EMG data. All the participants were right-handed (M = 17.54, SD = 3.06). They were also asked to evaluate the hedonic value (pleasantness) and the intensity of the sweat samples they were exposed to during the experiment on 7-point Likert scales anchored with "not at all" and "very much," as an odor assessment.
Then, on 4 trials, they performed a 2-alternative forced choice reminder task (de Groot et al. 2014a) to evaluate their ability to discriminate between sweat collected from donors in the fearful and neutral conditions. On each trial, a reminder odor stimulus was presented first, and participants then indicated which of 2 subsequent odor stimuli (presented 2nd or 3rd) corresponded to the reminder odor. Finally, participants wrote down, in an open-ended manner, what they smelled during the experiment and what they thought the purpose of the experiment was, as an awareness check. Five participants identified the olfactory stimulus as sweat; yet none of the participants correctly guessed the purpose of the study. The pleasantness ratings of fear sweat pads did not differ between participants who identified the sweat and those who did not (t(22) = 0.279, P = 0.783). There was also no difference between identifiers and nonidentifiers in the case of neutral sweat pads (t(22) = 1.057, P = 0.302).

Statistical analysis

Before the main data analysis, EMG and RT data were checked for outliers, identified as values exceeding 2.5 median absolute deviation (MAD) units (Leys et al. 2013). When outliers were identified (e.g., 2.8% of the RT data), these values were altered to be 1 unit above the next extreme score on that variable, following Field (2009). EMG data were removed in the case of 5 participants, because for specific individuals and specific muscles there were too many missing data (>75%, per participant) on the medial frontalis (participants 2 and 9) and the corrugator supercilii (participants 9, 12, and 19). RT data analysis was based on correct responses only. Responses were scored as correct if a participant pressed the negative button for an angry, fearful, or disgusted face or the neutral button for a neutral face. The percentage of correct responses for each emotional facial expression and odor condition is given in Table 1.
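The outlier procedure just described (flag values beyond 2.5 MAD units, then replace them 1 unit above the next extreme score) could be sketched in Python as follows. Whether the normality-scaled or raw MAD was used, and how low outliers were treated, is not stated in the text, so those choices are assumptions.

```python
import numpy as np

def mad_outlier_screen(x, cutoff=2.5):
    """Flag values deviating more than `cutoff` MAD units from the
    median (Leys et al. 2013) and replace each high outlier with
    1 unit above the most extreme remaining score (the Field 2009
    convention described in the text; low outliers are handled
    symmetrically here, an assumption)."""
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    # 1.4826 rescales the MAD to the SD under normality (Leys et al. 2013);
    # whether the authors used the scaled or raw MAD is not stated.
    mad = 1.4826 * np.median(np.abs(x - med))
    outlier = np.abs(x - med) > cutoff * mad
    cleaned = x.copy()
    if outlier.any():
        high, low = outlier & (x > med), outlier & (x < med)
        if high.any():
            cleaned[high] = cleaned[~outlier].max() + 1.0
        if low.any():
            cleaned[low] = cleaned[~outlier].min() - 1.0
    return cleaned, outlier

# Example: one implausibly slow RT among otherwise similar values
rts = [4.1, 4.3, 4.2, 4.4, 4.2, 9.9]
cleaned, flags = mad_outlier_screen(rts)
```

Here the 9.9 is flagged and replaced by 5.4, that is, 1 unit above the largest non-outlying score (4.4).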
EMG data analysis relied on the baseline-corrected facial muscle activity. The baseline was selected as 0–600 ms after the removal of the noseclip, as a typical first sniff starts after approximately 400 ms (Sela and Sobel 2010). Baseline-corrected medial frontalis and corrugator supercilii activity was calculated by subtracting the baseline (0–600 ms) from the mean activity of the corresponding muscle. EMG activity was measured in 200-ms intervals, but to ease the interpretation of the expected odor × time interaction, five 200-ms intervals were averaged (4 times) to create 1-s post-baseline averages of EMG activity.

Table 1. Percentage of correct responses per condition

              Fearful faces  Angry faces  Disgusted faces  Neutral faces
Fear odor     93.40          89.93        98.61            95.37
Neutral odor  93.06          89.58        98.96            96.18

Results

Facial emotion categorization task

Means and SDs of RTs for each emotional facial expression and odor condition are given in Table 2. A 2 × 4 odor condition (fear and neutral) × emotional face image (anger, fear, disgust, and neutral) repeated measures ANOVA was conducted to compare the effects of odor condition and emotional face image on the RT data. Although the main effect of odor condition was not significant (F(1, 23) = 0.03, P = 0.860, ηp² = 0.00), a main effect of emotional face image (F(3, 69) = 182.88, P < 0.001, ηp² = 0.89) on RT was found.
The results revealed that angry faces (M = 4.53, SD = 0.19) were responded to significantly more slowly than fearful (M = 4.18, SD = 0.26; P < 0.001) and disgusted faces (M = 4.13, SD = 0.22; P < 0.001). Fearful and disgusted faces also differed significantly from each other (P < 0.05), and both were responded to significantly faster than neutral faces (M = 4.50, SD = 0.20; both P < 0.001).

Table 2. Mean RTs (s) and SDs for the fearful, angry, disgusted, and neutral facial expressions in the fear and neutral odor conditions

              Fearful faces  Angry faces  Disgusted faces  Neutral faces
              M     SD       M     SD     M     SD         M     SD
Fear odor     4.16  0.30     4.53  0.24   4.13  0.25       4.51  0.22
Neutral odor  4.20  0.26     4.52  0.18   4.11  0.22       4.48  0.21

Notably, the main effects were qualified by a significant interaction between odor condition and emotional face image (F(3, 69) = 3.54, P = 0.019, ηp² = 0.13). We tested 2 specific interaction hypotheses using planned contrasts. According to the first hypothesis, only fearful faces would be responded to faster in the fear odor condition (vs. neutral odor) compared to angry and disgusted faces.
For this specific hypothesis, first, the RT differences were calculated for each of the negative facial image conditions (angry, disgusted, or fearful) by subtracting RT scores under the neutral odor condition from those under the fear odor condition. Then the fear difference scores were compared with the angry and disgusted faces' difference scores. Planned contrasts revealed that participants were faster in classifying fearful facial images in the fear odor versus neutral odor condition compared to angry and disgusted facial images (F(1, 23) = 7.73, P = 0.011; Figure 2). In fact, specific contrast analyses showed that fear faces were processed significantly faster than disgust faces (F(1,23) = 8.14, P = 0.009), anger faces (F(1,23) = 4.43, P = 0.046), and neutral faces (F(1,23) = 6.98, P = 0.015). There were no differences in the processing time required by disgust, anger, and neutral faces. Hence, the first hypothesis was supported.

Figure 2. RT difference scores (i.e., RT in the fear odor condition minus RT in the neutral odor condition) while participants classified negative and neutral facial images. Negative values mean faster responding to facial images when primed with fear odor. Error bars: ±1 SE.

The second hypothesis proposed that all the negative facial images would be responded to faster in the fear odor condition (vs. neutral odor) compared to neutral facial images. RT difference scores for neutral facial images across the fear and neutral odor conditions were calculated in addition to the difference scores for each of the 3 negative facial images calculated previously. Then, the difference scores of the negative facial images were compared with the neutral faces' difference scores.
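Both planned contrasts rest on the same difference-score logic, which can be sketched on simulated data; the arrays below are random placeholders, not the study's RTs, and the dictionary layout is purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 24  # receivers, as in the experiment

# Simulated per-participant mean RTs in seconds (illustrative only;
# the observed cell means are those reported in Table 2)
rt = {odor: {emo: rng.normal(4.3, 0.2, n)
             for emo in ("fear", "anger", "disgust", "neutral")}
      for odor in ("fear_odor", "neutral_odor")}

# Step 1: per-emotion difference scores, fear odor minus neutral odor
diff = {emo: rt["fear_odor"][emo] - rt["neutral_odor"][emo]
        for emo in ("fear", "anger", "disgust", "neutral")}

# Step 2, hypothesis 1: contrast fear's difference score against the
# mean of anger's and disgust's (weights +1, -1/2, -1/2)
c1 = diff["fear"] - (diff["anger"] + diff["disgust"]) / 2

# Step 2, hypothesis 2: contrast the mean negative difference score
# against the neutral difference score
c2 = (diff["fear"] + diff["anger"] + diff["disgust"]) / 3 - diff["neutral"]

# A one-sample t-test on the contrast scores corresponds to the
# F(1, 23) planned contrasts reported in the text (F = t^2)
t1, p1 = stats.ttest_1samp(c1, 0.0)
t2, p2 = stats.ttest_1samp(c2, 0.0)
```

With the real data, a significant c1 test and a nonsignificant c2 test would reproduce the pattern reported for hypotheses 1 and 2.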
Planned contrasts showed that participants did not differ in classifying negative facial images in the fear odor versus neutral odor condition compared to neutral images (F(1, 23) = 0.03, P = 0.860). Thus, the second hypothesis was not supported.

Facial EMG

Right before the start of the emotion categorization task, we examined facial EMG activity to test whether receivers displayed an embodied simulacrum of the fearful experience of the sender. In line with previous research showing emotion correspondence shortly after a first (exploratory) sniff (de Groot et al. 2012, 2014a, 2014b, 2015b), we expected a fearful expression (medial frontalis and corrugator supercilii activity) to emerge within a few seconds after first exposure. Concerning medial frontalis activity (eyebrow lifting muscle), a 2 × 4 repeated measures ANOVA with odor (2 levels: fear and neutral) and time (4 levels: 0–1, 1–2, 2–3, and 3–4 s) as within-subjects factors yielded a significant main effect of odor (F(1, 21) = 12.28, P = 0.002, Hedges' g = 0.53, 95% CI [0.17–0.88]) and time (F(3, 63) = 4.02, P = 0.011). These main effects were qualified by a significant interaction between odor and time (F(3, 63) = 5.72, P = 0.002; Figure 3A). Planned contrasts revealed no significant differences between fear odor (M = 0.22 μV, SD = 0.17 μV) and neutral odor (M = 0.23 μV, SD = 0.27 μV) in the first second post-baseline (F < 1, P = 0.827); yet, fear odor evoked greater medial frontalis activity than neutral odor on second two, F(1, 21) = 14.14, P = 0.001, g = 0.64, 95% CI [0.25–1.02] (fear: M = 0.48, SD = 0.42; neutral: M = 0.23, SD = 0.32); second three, F(1, 21) = 12.38, P = 0.002, g = 0.63, 95% CI [0.23–1.04] (fear: M = 0.47, SD = 0.42; neutral: M = 0.22, SD = 0.31); and second four post-baseline, F(1, 21) = 6.91, P = 0.016, g = 0.51, 95% CI [0.09–0.93] (fear: M = 0.34, SD = 0.36; neutral: M = 0.16, SD = 0.32). Figure 3.
Baseline-subtracted facial EMG activity indicative of fear as a function of body odor condition (fear and neutral) and time (0–4 s). (A) Mean increase in medial frontalis activity (eyebrow lifting muscle). (B) Mean increase in corrugator supercilii activity (frowning muscle). Error bars = ±1 SE.

Another 2 × 4 repeated measures ANOVA on mean corrugator supercilii activity (frowning muscle) revealed a significant main effect of time, F(3, 60) = 4.88, P = 0.004, but not of odor, F(1, 20) = 2.02, P = 0.170, g = 0.26, 95% CI [−0.11 to 0.63]. Again, the main effects were qualified by a significant interaction between odor and time, F(3, 60) = 4.67, P = 0.008 (Huynh–Feldt correction of degrees of freedom) (Figure 3B). Planned contrasts revealed a significant difference in corrugator supercilii activity in the fear versus neutral condition on second two, F(1,20) = 6.20, P = 0.022, g = 0.57, 95% CI [0.07–1.06] (fear: M = 1.12 μV, SD = 1.05 μV; neutral: M = 0.58 μV, SD = 0.74 μV), whereas no such differences were encountered on second one, F < 1, P = 0.602 (fear: M = 0.48, SD = 0.58; neutral: M = 0.57, SD = 0.79); second three, F(1, 20) = 3.60, P = 0.072 (fear: M = 1.15, SD = 0.95; neutral: M = 0.81, SD = 0.83); and second four, F < 1, P = 0.959 (fear: M = 0.82, SD = 0.87; neutral: M = 0.81, SD = 1.05). Hence, these results are consistent with a line of research (de Groot et al. 2012, 2014a, 2014b, 2015a, 2015b) documenting a gradual increase in medial frontalis muscle activity following exposure to fear odor.
The pattern of results for the corrugator muscle was less clear, as during fear odor exposure a frown appeared only briefly (second two).

Odor assessment and discrimination

In addition, a repeated measures ANOVA was conducted to investigate whether the fear and neutral odors were perceived differently in intensity and pleasantness. Results showed that fear odor (M = 2.96, SD = 1.57) was perceived as significantly more intense than neutral odor (M = 2.38, SD = 1.44), F(1, 23) = 4.97, P = 0.036, whereas the neutral odor (M = 3.63, SD = 1.02) and fear odor (M = 3.29, SD = 0.95) did not differ in pleasantness, F(1, 23) = 3.54, P = 0.073. To examine whether the difference in the intensity ratings of fear and neutral odor affected the results, we performed analyses with intensity and pleasantness as covariates. We calculated intensity and pleasantness difference scores (participants' ratings of fear odor minus neutral odor) and performed repeated measures ANCOVAs on the RT data and the facial muscle effects. When both pleasantness and intensity were added as covariates to the RT analysis, the odor × time interaction remained significant (F(3, 63) = 3.18, P = 0.030). The interaction also remained significant when the covariates were entered individually: intensity (F(3, 66) = 3.47, P = 0.021) and pleasantness (F(3, 66) = 3.09, P = 0.033). The discrete emotion contrasts for RT reported earlier likewise remained significant when both covariates were entered (F(1, 21) = 5.06, P = 0.035), when intensity alone was entered (F(1, 22) = 6.02, P = 0.023), and when pleasantness alone was entered (F(1, 22) = 5.79, P = 0.025). The same analyses were conducted for the facial muscle effects, as can be seen in Table 3.
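A repeated measures ANCOVA with a per-participant difference-score covariate can be approximated in several ways; one common alternative to the classic GLM approach is a mixed model with a random intercept per subject, in which the difference score enters as an ordinary fixed effect. The sketch below shows the general shape on simulated data; the column names, effect sizes, and the mixed-model formulation itself are our own illustration, not the authors' analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for s in range(24):
    subj_shift = rng.normal(0, 0.15)   # stable individual RT level
    idiff = rng.normal(0.6, 0.5)       # intensity difference score (fear - neutral rating)
    for odor in ("fear", "neutral"):
        for t in range(1, 5):
            base = 2.0 + subj_shift - (0.15 if odor == "fear" else 0.0)
            rows.append({"subject": s, "odor": odor, "time": t,
                         "intensity_diff": idiff,
                         "rt": base + rng.normal(0, 0.2)})
df = pd.DataFrame(rows)

# Random intercept per subject stands in for the repeated-measures structure;
# the difference-score covariate is constant within subject, i.e. between-subjects.
model = smf.mixedlm("rt ~ odor * time + intensity_diff",
                    df, groups=df["subject"]).fit()
print(model.summary())
```

The mixed-model and classic ANCOVA formulations are not numerically identical, but both answer the same question: does the odor × time effect survive once the covariate is partialled out?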
As can be seen from the values presented in Table 3, the pattern of results remained the same after the potential effects of pleasantness and intensity were entered individually or jointly.

Table 3. Covariance analyses for the facial muscle effects

Covariate(s)                 Effect type    Frontalis statistics          Corrugator statistics
Intensity and pleasantness   Odor           F(1,19) = 9.61, P = 0.006     F(1,18) < 1
                             Odor × time    F(3,57) = 5.10, P = 0.003     F(3,54HF) = 3.50, P = 0.026
Intensity                    Odor           F(1,20) = 7.24, P = 0.014     F(1,19) < 1
                             Odor × time    F(3,60) = 3.86, P = 0.014     F(3,57HF) = 4.33, P = 0.012
Pleasantness                 Odor           F(1,20) = 15.23, P < 0.001    F(1,19) = 1.80, P = 0.196
                             Odor × time    F(3,60) = 7.43, P < 0.001     F(3,57HF) = 3.43, P = 0.026

For comparison, the P values of the covariate-free analyses were, for the main effect of odor: P = 0.002 (frontalis) and P = 0.170 (corrugator); and for the odor × time interaction: P = 0.002 (frontalis) and P = 0.008 (corrugator). The superscript HF indicates that a Huynh–Feldt correction of degrees of freedom was applied. Evidently, adding these covariates did not change the interpretation of the EMG findings.
Finally, a binomial test showed that participants could not discriminate the fear and neutral sweat pads (proportion correct under the null hypothesis = 0.125, P = 0.816).

General discussion

We examined whether human body odors produced under fear and neutral conditions induce a general negative evaluative state or a discrete fear state in a novel facial emotion recognition paradigm.
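The discrimination check can be run as an exact binomial test against the chance level. A minimal sketch follows; the hit count is hypothetical, since the paper reports only the chance level (0.125) and the P value, not the raw number of correct choices.

```python
from scipy.stats import binomtest

# hypothetical example: 4 of 24 participants correct at a chance level of 0.125
result = binomtest(k=4, n=24, p=0.125, alternative="two-sided")
print(result.pvalue)  # well above 0.05: no evidence of discrimination
```

An exact test is preferable to a normal approximation here because the expected count under the null (24 × 0.125 = 3) is small.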
In the experiment, participants were exposed to facial images of anger, disgust, fear, and neutral expressions that changed gradually from completely noisy images to fully clear images; the participants' task was to stop the gradually emerging image as soon as they thought they could identify whether the facial image was negative or not. The facial images were presented under 2 odor conditions, namely fear or neutral. It was hypothesized that if the fear odor induces a general negative evaluative state, then all negative facial images would be identified faster compared to the neutral odor condition. Alternatively, if the fear odor induces a discrete fear state, then only fear facial images would be responded to earlier. Our results support "categorical" effects in the perception of negative facial images under fear odor. First, EMG measurements taken during initial odor exposure showed that the fear odor, compared to neutral odor, activated the facial muscle specific to fear, the medial frontalis. Second, participants recognized fearful faces faster in the fear odor condition than in the neutral odor condition. Notably, there was no such difference for the other negative facial images. These findings are consistent with the discrete emotion approach. The results suggest that a valence-based dimensional approach does not do justice to the way facial images are processed when fear chemosignals are present. Our data suggest that priming with a "fear" chemosignal in an emotional face identification task facilitates processing of the facial expression consistent with the chemosignal rather than "any" negative expression. The significance of the experimental paradigm is that it relies on chemosignals, thus inducing a process that bypasses conscious access and does not involve any "explicit" categorization. This experimental procedure therefore blocks the semantic factors that usually enter the transduction of experiential processes into linguistic representations.
The discrete emotion approach to emotional expressions suggests that categorical perception has evolved to enable rapid categorization of emotional states in others that can motivate behavior (e.g., Etcoff and Magee 1992). Our results show discrete emotion activation: fear odor sped up the categorization of only fearful faces, not angry or disgusted faces. Although several studies have reported evidence of discrete emotion perception of facial expressions (e.g., Calder et al. 1996; de Gelder et al. 1997; Campanella et al. 2002), to the best of our knowledge no study has yet relied on olfactory priming. Methodologically, this study presents a unique combination of olfactory priming and a novel facial recognition task. The novel facial recognition task has the advantage of using the same facial image over a continuum rather than relying on the more common image-morphing method (see Young et al. (1997) for a detailed description), which usually blends original images with a neutral facial image in a 50:50 proportion (e.g., 50% fear and 50% neutral). This novel method, in which full facial images were presented with "identically" reduced clarity across faces, eliminates a potential problem of morphing tasks, namely the uneven revelation of faces across the 5-s period. The second novelty of the experimental paradigm is the combination of olfactory priming with the facial recognition task and the absence of any "specific" emotion-referent linguistic category in the task instructions. A remarkable feature of this study is that the findings suggest the presence of something like an "emotional fingerprint," namely something that can be regarded as "essentialist," insofar as the odor is responsible for the activation of facial muscles and is also related to the corresponding speeded classification of the facial expression.
It would seem that emotion chemosignals contain distinctive biochemical configurations that induce specific facial expressions and behavioral responses consistent with the emotional state involved in the production of the volatile in question.

Context paragraph

This research flows from two converging research fields that we have been pursuing. The first concerns the communicative function of human chemosignals, on which we (G.R.S., M.S., J.H.B.d.G.) have been working for more than 7 years. The second concerns the interspecies transmission of emotions via chemosignals, on which we (G.R.S., D'A.B.) have been working for the last 2 years. In our recently published paper (D'Aniello et al. 2018), we demonstrated that human chemosignals of happiness and fear transfer these emotional states to Canis lupus familiaris (pet dogs), suggesting an interspecies-transferable biochemical fingerprint of human chemosignals. These converging lines of research invited the current research: examining the processing of stimuli by priming a specific emotional state with chemosignals lends itself readily as an optimal research strategy to examine whether a general negative evaluative state or a discrete emotion drives emotion processing. Thus, this study constitutes a continuation and deepening of our general inquiry indicating that chemosignals contain and convey information that supersedes individual species.

Supplementary material

Supplementary material can be found at http://www.chemse.oxfordjournals.org/

Funding

This work was predominantly supported by G.R.S.'s research funds and partially by M.A.M.S.'s funds. Additional support for this article came from the Fundação para a Ciência e Tecnologia (Portuguese Science Foundation) Grant [IF/00085/2013 to G.R.S.]; both a Talent grant from the Netherlands Organisation for Scientific Research [406-11-078/MaGW] and a Niels Stensen Fellowship were awarded to J.H.B. de Groot.
Conflicts of interest

The authors declared that they had no conflict of interest with respect to their authorship or the publication of this article.

Acknowledgements

We thank Valentina Parma for her helpful comments on an earlier draft of this article, and Lisanne van Houtum for her help with the preparation of olfactory stimuli. As of 1 September 2017, R.G.K. works at the University of Amsterdam.

References

Barrett LF. 1998. Discrete emotions or dimensions? The role of valence focus and arousal focus. Cogn Emot. 12:579–599.
Barrett LF. 2006a. Valence is a basic building block of emotional life. J Res Pers. 40:35–55.
Barrett LF. 2006b. Solving the emotion paradox: categorization and the experience of emotion. Pers Soc Psychol Rev. 10:20–46.
Brand G, Millot JL. 2001. Sex differences in human olfaction: between evidence and enigma. Q J Exp Psychol B. 54:259–270.
Brody LR, Hall JA. 2000. Gender, emotion, and expression. In: Lewis M, Haviland-Jones J, editors. Handbook of emotions. New York: Guilford Press. p. 338–349.
Calder AJ, Young AW, Perrett DI, Etcoff NL, Rowland D. 1996. Categorical perception of morphed facial expressions. Vis Cogn. 3:81–117.
Campanella S, Quinet P, Bruyer R, Crommelinck M, Guerit JM. 2002. Categorical perception of happiness and fear facial expressions: an ERP study. J Cogn Neurosci. 14:210–227.
Carver CS. 2001. Affect and the functional bases of behavior: on the dimensional structure of affective experience. Pers Soc Psychol Rev. 5:345–356.
Christie IC, Friedman BH. 2004. Autonomic specificity of discrete emotion and dimensions of affective space: a multivariate approach. Int J Psychophysiol. 51:143–153.
D'Aniello B, Semin GR, Alterisio A, Aria M, Scandurra A. 2018. Interspecies transmission of emotional information via chemosignals: from humans to dogs (Canis lupus familiaris). Animal Cognition. 21:67–78. doi: 10.1007/s10071-017-1139-x
Darwin C. 1872/1998. The expression of the emotions in man and animals (with introduction, afterword, and commentaries by P. Ekman). New York: Oxford University Press.
de Gelder B, Teunisse JP, Benson PJ. 1997. Categorical perception of facial expressions and their internal structure. Cogn Emot. 11:1–23.
de Groot JH, Semin GR, Smeets MA. 2014a. Chemical communication of fear: a case of male-female asymmetry. J Exp Psychol Gen. 143:1515–1525.
de Groot JH, Semin GR, Smeets MA. 2014b. I can see, hear, and smell your fear: comparing olfactory and audiovisual media in fear communication. J Exp Psychol Gen. 143:825–834.
de Groot JHB, Smeets MAM. 2017. Human fear chemosignaling: evidence from a meta-analysis. Chem Senses. 42:663–673.
de Groot JH, Smeets MA, Kaldewaij A, Duijndam MJ, Semin GR. 2012. Chemosignals communicate human emotions. Psychol Sci. 23:1417–1424.
de Groot JHB, Smeets MAM, Rowson MJ, Bulsing PJ, Blonk CG, Wilkinson JE, Semin GR. 2015a. A sniff of happiness. Psychol Sci. 26:684–700.
de Groot JHB, Smeets MAM, Semin GR. 2015b. Rapid stress system drives chemical transfer of fear from sender to receiver. PLoS One. 10:e0118211.
Dimberg U, Petterson M. 2000. Facial reactions to happy and angry facial expressions: evidence for right hemisphere dominance. Psychophysiology. 37:693–696.
Doucet S, Soussignan R, Sagot P, Schaal B. 2012. An overlooked aspect of the human breast: areolar glands in relation with breastfeeding pattern, neonatal weight gain, and the dynamics of lactation. Early Hum Dev. 88:119–128.
Du S, Tao Y, Martinez AM. 2014. Compound facial expressions of emotion. Proc Natl Acad Sci USA. 111:E1454–E1462.
Ekman P. 1992. An argument for basic emotion. Cogn Emot. 6:169–200.
Ekman P. 1994. Strong evidence for universals in facial expressions: a reply to Russell's mistaken critique. Psychol Bull. 115:268–287.
Ekman P, Friesen W. 1978. Facial action coding system: a technique for the measurement of facial movement. Palo Alto (CA): Consulting Psychologists Press.
Etcoff NL, Magee JJ. 1992. Categorical perception of facial expressions. Cognition. 44:227–240.
Farb NA, Chapman HA, Anderson AK. 2013. Emotions: form follows function. Curr Opin Neurobiol. 23:393–398.
Faul F, Erdfelder E, Lang AG, Buchner A. 2007. G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav Res Methods. 39:175–191.
Field A. 2009. Discovering statistics using SPSS. Thousand Oaks (CA): Sage.
Fridlund AJ, Cacioppo JT. 1986. Guidelines for human electromyographic research. Psychophysiology. 23:567–589.
Gaby JM, Zayas V. 2017. Smelling is telling: human olfactory cues influence social judgments in semi-realistic interactions. Chemical Senses. 42:405–418. doi: 10.1093/chemse/bjx012
Hess U, Fischer A. 2013. Emotional mimicry as social regulation. Pers Soc Psychol Rev. 17:142–157.
Izard CE. 1992. Basic emotions, relations among emotions, and emotion-cognition relations. Psychol Rev. 99:561–565.
Jacob S, Hayreh DJ, McClintock MK. 2001. Context-dependent effects of steroid chemosignals on human physiology and mood. Physiol Behav. 74:15–27.
Kollndorfer K, Ohrenberger I, Schöpf V. 2016. Contraceptive use affects overall olfactory performance: investigation of estradiol dosage and duration of intake. PLoS One. 11:e0167520.
Kret ME, Stekelenburg JJ, Roelofs K, de Gelder B. 2013. Perception of face and body expressions using electromyography, pupillometry and gaze measures. Front Psychol. 4:28.
Langner O, Dotsch R, Bijlstra G, Wigboldus DH, Hawk ST, Van Knippenberg AD. 2010. Presentation and validation of the Radboud Faces Database. Cogn Emot. 24:1377–1388.
Larsen RJ, Diener E. 1992. Promises and problems with the circumplex model of emotion. In: Clark MS, editor. Review of personality and social psychology. Vol. 13. Newbury Park (CA): Sage. p. 25–59.
Leleu A, Godard O, Dollion N, Durand K, Schaal B, Baudouin JY. 2015a. Contextual odors modulate the visual processing of emotional facial expressions: an ERP study. Neuropsychologia. 77:366–379.
Leleu A, Demily C, Franck N, Durand K, Schaal B, Baudouin JY. 2015b. The odor context facilitates the perception of low-intensity facial expressions of emotion. PLoS One. 10:e0138656.
Leppanen JM, Hietanen JK. 2003. Affect and face perception: odors modulate the recognition advantage of happy faces. Emotion. 3:315–326.
Levenson RW. 1992. Autonomic nervous system differences among emotions. Psychol Sci. 3:23–27.
Levenson RW, Ekman P, Friesen WV. 1990. Voluntary facial action generates emotion-specific autonomic nervous system activity. Psychophysiology. 27:363–384.
Leys C, Ley C, Klein O, Bernard P, Licata L. 2013. Detecting outliers: do not use standard deviation around the mean, use absolute deviation around the median. J Exp Soc Psychol. 49:764–766.
Lötsch J, Ultsch A, Hummel T. 2016. How many and which odor identification items are needed to establish normal olfactory function? Chem Senses. 41:339–344.
Mitro S, Gordon AR, Olsson MJ, Lundström JN. 2012. The smell of age: perception and discrimination of body odors of different ages. PLoS One. 7:e38110.
Murphy FC, Nimmo-Smith I, Lawrence AD. 2003. Functional neuroanatomy of emotions: a meta-analysis. Cogn Affect Behav Neurosci. 3:207–233.
Olsson MJ, Lundström JN, Kimball BA, Gordon AR, Karshikoff B, Hosseini N, Sorjonen K, Olgart Höglund C, Solares C, Soop A, et al. 2014. The scent of disease: human body odor contains an early chemosensory cue of sickness. Psychol Sci. 25:817–823.
Pause BM. 2012. Processing of body odor signals by the human brain. Chemosensory Perception. 5:55–63. doi: 10.1007/s12078-011-9108-2
Pause BM, Ohrt A, Prehn A, Ferstl R. 2004. Positive emotional priming of facial affect perception in females is diminished by chemosensory anxiety signals. Chem Senses. 29:797–805.
Peirce JW. 2007. PsychoPy—psychophysics software in Python. J Neurosci Methods. 162:8–13.
Penn DJ, Oberzaucher E, Grammer K, Fischer G, Soini HA, Wiesler D, Novotny MV, Dixon SJ, Xu Y, Brereton RG. 2007. Individual and gender fingerprints in human body odour. J R Soc Interface. 4:331–340.
Phan KL, Wager T, Taylor SF, Liberzon I. 2002. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. Neuroimage. 16:331–348.
Russell JA. 1980. A circumplex model of affect. J Pers Soc Psychol. 39:1161–1178.
Russell JA, Barrett LF. 1999. Core affect, prototypical emotional episodes, and other things called emotion: dissecting the elephant. J Pers Soc Psychol. 76:805–819.
Schirmer A, Adolphs R. 2017. Emotion perception from face, voice, and touch: comparisons and convergence. Trends Cogn Sci. 21:216–228.
Sela L, Sobel N. 2010. Human olfaction: a constant state of change-blindness. Exp Brain Res. 205:13–29.
Semin GR, de Groot JH. 2013. The chemical bases of human sociality. Trends Cogn Sci. 17:427–429.
Seubert J, Kellermann T, Loughead J, Boers F, Brensinger C, Schneider F, Habel U. 2010. Processing of disgusted faces is facilitated by odor primes: a functional MRI study. Neuroimage. 53:746–756.
Stein BE, Stanford TR. 2008. Multisensory integration: current issues from the perspective of the single neuron. Nat Rev Neurosci. 9:255–266.
Susskind JM, Lee DH, Cusi A, Feiman R, Grabski W, Anderson AK. 2008. Expressing fear enhances sensory acquisition. Nat Neurosci. 11:843–850.
van Boxtel A. 2010. Facial EMG as a tool for inferring affective states. In: Spink AJ, Grieco F, Krips O, Loijens L, Noldus L, Zimmerman P, editors. Proceedings of measuring behavior 2010. Wageningen, the Netherlands: Noldus Information Technology. p. 104–108.
van Strien JW. 1992. Classificatie van links- en rechtshandige proefpersonen [Classification of left- and right-handed participants]. Nederlands Tijdschrift voor de Psychologie. 47:88–92.
Young AW, Rowland D, Calder AJ, Etcoff NL, Seth A, Perrett DI. 1997. Facial expression megamix: tests of dimensional and category accounts of emotion recognition. Cognition. 63:271–313.
Zernecke R, Haegler K, Kleemann AM, Albrecht J, Frank T, Linn J, Wiesmann M. 2011. Effects of male anxiety chemosignals on the evaluation of happy facial expressions. J Psychophysiol. 25:116–123.
Zhou W, Chen D. 2009. Sociochemosensory and emotional functions: behavioral evidence for shared mechanisms. Psychol Sci. 20:1118–1124.

© The Author(s) 2018. Published by Oxford University Press. All rights reserved.
Chemical Senses – Oxford University Press
Published: May 22, 2018