Get 20M+ Full-Text Papers For Less Than $1.50/day. Start a 14-Day Trial for You or Your Team.

Learn More →

Distributed affective space represents multiple emotion categories across the human brain

Distributed affective space represents multiple emotion categories across the human brain The functional organization of human emotion systems as well as their neuroanatomical basis and segregation in the brain remains unresolved. Here, we used pattern classification and hierarchical clustering to characterize the organization of a wide array of emotion categories in the human brain. We induced 14 emotions (6 ‘basic’, e.g. fear and anger; and 8 ‘non- basic’, e.g. shame and gratitude) and a neutral state using guided mental imagery while participants’ brain activity was measured with functional magnetic resonance imaging (fMRI). Twelve out of 14 emotions could be reliably classified from the haemodynamic signals. All emotions engaged a multitude of brain areas, primarily in midline cortices including anterior and posterior cingulate gyri and precuneus, in subcortical regions, and in motor regions including cerebellum and premotor cortex. Similarity of subjective emotional experiences was associated with similarity of the corresponding neural activation patterns. We conclude that different basic and non-basic emotions have distinguishable neural bases characterized by specific, distributed activation patterns in widespread cortical and subcortical circuits. Regionally differentiated engagement of these circuits defines the unique neural activity pattern and the corresponding subjective feeling associated with each emotion. Key words: emotion; fMRI; MVPA; pattern classification Introduction Adolphs, 2017; Barrett, 2017). This discussion revolves around the The organization of human emotion systems is currently a topic number of distinct emotion systems and the organization of of lively debate (Hamann, 2012; Lindquist et al., 2012; Kragel and human emotion circuits in the brain. Most research on specific LaBar, 2014; Meaux and Vuilleumier, 2015; Saarima ¨ki et al., 2016; emotion categories has focused on ‘primary’ or ‘basic’ emotions Received: 8 June 2017; Revised: 25 February 2018; Accepted: 28 March 2018 V C The Author(s) (2018). Published by Oxford University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/ licenses/by-nc/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited. For commercial re-use, please contact journals.permissions@oup.com Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 472 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 (usually anger, fear, disgust, happiness, sadness and surprise). unclear whether similar, distinct circuits as previously observed According to the basic emotion theories, emotions have been for basic emotions would also support these types of emotions. shaped during the evolution to serve distinct survival functions Here, we investigated the neural underpinnings of multiple via distinct neural circuits and physiological systems (Panksepp, basic and non-basic emotion categories. We induced 14 emo- 1982; Ekman, 1992, 1999; Damasio, 1999). In turn, constructivist tions in participants by guided mental imagery while their brain theories consider specific emotions emerging from the interaction activity was measured with fMRI. First, to examine whether dif- of fundamental processes shared across psychological domains ferent emotions have distinguishable brain bases, we employed and, thus, not unique to emotions alone (see, e.g. Russell, 2003; pattern classification. Second, using hierarchical clustering, we Lewis and Liu, 2011; Hamann, 2012; Cunningham, 2013). Basic examined the similarity of neural substrates of different emo- emotion theories emphasize similarities in emotion mechanisms tions and tested how this similarity was related to how simi- across individuals, while constructivist theories emphasize inter- larly these emotions are experienced. Third, we mapped and within-individual variability. Several human neuroimaging representation of different emotions in the core emotion pro- studies support the view that at least the canonical basic emo- cessing regions of the brain, by characterizing the emotion- tions have distinguishable neural bases, as they are associated dependent neural response profiles using univariate analyses with discernible neural activity patterns as measured by BOLD- and cumulative activation mapping. functional magnetic resonance imaging (fMRI) (e.g. Kragel and LaBar 2014; Saarima ¨ki et al., 2016; for a review, see Kragel and LaBar, 2016; Nummenmaa & Saarima ¨ ki, 2017). Materials and methods Yet, a wide array of other emotions, including ‘secondary’ or Participants ‘social’ emotions (see reviews and proposed taxonomies in Damasio, 1999; Adolphs, 2002a), also serve adaptive survival Twenty-five female volunteers (aged 19–38, mean age functions and are characterized by distinctive facial expressions 23.6 years) participated in the experiment. All were right- (Baron-Cohen et al., 2001; Shaw et al., 2005), bodily sensations handed, neurologically healthy and with normal or corrected- (Nummenmaa et al., 2014a), and neural activity patterns (Kassam to-normal vision, and gave written informed consent according et al. 2013; Kragel and LaBar, 2015). Nevertheless, the psycho- to the Declaration of Helsinki. The Institutional Review Board of logical and neural mechanisms of these non-basic emotions, as Aalto University approved the experimental protocol. Female well as their commonalities or differences relative to basic emo- participants were chosen to maximize the power of the experi- tions, remain largely unresolved (Ekman, 1999; Ekman and ment, as when compared to males, females typically experience Cordaro, 2011; Adolphs, 2002b). These emotions may involve and portray more intensive emotional reactions (see review in more elaborate cognitive representations acquired through ex- Fischer and LaFrance, 2015), and show greater brain responses perience, education, and social norms (Panksepp and Watt, 2011), during emotions (e.g., Hofer et al., 2006) and stronger facial mim- and hence, recruit brain systems partly distinct from those impli- icry as indexed by EMG (Grossman and Wood, 1993). cated in more ‘primitive’ and possibly innate basic emotions. It is thus possible that also non-basic emotions may have distinguish- Stimuli able neural bases which would, however, be discernible from that of basic emotions. Accordingly, the specific number of basic emo- The stimuli for the guided affective imagery were sixty 5–20-s tions has been debated and the distinction between basic and long narratives describing an antecedent event triggering prom- non-basic emotions has been questioned (e.g. Russell, 2003). For inently one emotional state. Narrative-based guided imagery is example, Adolphs (2002b) proposed that affective processes could known to be an efficient emotion induction technique that en- be understood at various taxonomies depending on the level of gages reliably the brain’s emotion circuits (Costa et al., 2010; analysis. The basic emotion conceptualization would be particu- Nummenmaa et al., 2014b) and results in strong subjective emo- larly relevant for primary-process core affects governed by phylo- tional feelings. Each narrative elicited primarily one out of pos- genetically old emotion circuits especially within the subcortical sible 14 emotions, or a neutral emotional state. Targeted structures (Panksepp and Watt, 2011). emotions included six basic or primary emotions (anger, fear, A set of core emotion processing regions is consistently disgust, happiness, sadness and surprise) and eight social or engaged during multiple emotions. These include cortical mid- secondary emotions (shame, pride, longing, guilt, love, con- line regions (Peelen et al., 2010; Chikazoe et al., 2014; Trost et al., tempt, gratitude and despair). The narratives included a short 2012), somatomotor regions (Adolphs et al., 2000; de Gelder et al., description of a situation that participants were instructed to 2004; Nummenmaa et al., 2008, 2012; Pichon et al., 2008), as well imagine would happen to them, for instance, ‘It is a late night on as amygdala, brainstem and thalamus (Adolphs, 2010; Damasio a dimly-lit parking lot. Your car stands alone in a dark corner. and Carvalho, 2013; Kragel and LaBar, 2014). These regions serve Suddenly you hear a gun shot behind you.’ (fear), or ‘Your lover is as candidate areas containing distinct neural signatures for dif- lying next to you on a bed. You look into his eyes when he gently ferent basic emotions (Peelen et al., 2010; Saarima ¨ki et al., 2016). touches your hair and bends to kiss your lips.’ (love). Based on an Yet, it is currently unclear whether these regions also code for online pilot experiment (see Supplementary Material), we se- other, non-basic emotions. Prior studies using univariate ana- lected four narratives per category (total of 60 narratives; see lyses have quantified neural responses to emotions such as re- Supplementary Table S1 for all narrative stimuli) to be included gret (Coricelli et al., 2005; Eryilmaz et al., 2011), guilt (Wagner in the fMRI experiment. et al., 2011), pride (Takahashi et al., 2008; Zahn et al., 2009; The selected narratives were spoken by a female speaker Simon-Thomas et al., 2012), rejoice (Chandrasekhar et al., 2008) using neutral prosody without cues for the affective content of and maternal love (Bartels and Zeki, 2004), as well as aesthetic the narrative. The background noise in the recording room was feelings such as wonder or nostalgia (Vuilleumier and Trost, recorded and equalized (bandpass filter 50–10 000 Hz) with 2015). Yet, these studies (except Kassam et al., 2013; Kragel and Apple Logic Pro 9 (Apple Inc.), and gate and compressor were LaBar 2014) have usually compared brain activation differences used to attenuate the background noise during moments of si- between two emotions in a univariate fashion. Thus, it remains lence and slightly compress the voice dynamic to limit the Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 H. Saarima ¨ki et al. | 473 variance of the sound power. The loudness of each narrative et al., 2002) and non-brain matter was removed using BET was normalized according to the peak value. (Smith, 2002). High-pass temporal filtering was applied using The recorded narratives were divided into 4 runs of 15 narra- Gaussian-weighted least-squares straight line fitting with sigma tives, with one narrative per category in each run. The runs of 100 s. Participant-wise gray matter masks were generated by contained the same narratives for all participants but the pres- segmenting the T1-weighted images into gray and white matter entation order was randomized within each run and for each and cerebrospinal fluid using the FAST segmentation tool participant. During fMRI, the four sets of recorded narratives (Zhang et al., 2001) and transforming the masks to the native were all presented twice thus resulting in altogether eight runs. space using FLIRT (Jenkinson et al., 2002) with nine degrees of Each run lasted 7–8 min and consisted of 15 trials. A trial started freedom. The gray matter maps were subsequently thresholded with a fixation cross shown for 0.5 s, followed by a 2-s presenta- using intensity threshold> 0.5 to create participant-specific tion of a word describing the target emotion (anger, fear, disgust masks. This threshold was chosen to include those voxels with and so forth) to prepare participants for the forthcoming emo- a higher probability of belonging to the gray matter and, subse- tional imagery task and thus to make the induction more quently, the masks were visually inspected to make sure most powerful. Next, the narrative was spoken out, followed by a 10-s gray matter was included. On average, the gray matter mask imagery period. The trial ended with a 10-s wash-out period to included 16 000 voxels. counter for possible carryover effects. Participants were in- For univariate general linear model (GLM) analysis, the pre- structed to try to get involved in the narratives by imagining the processed functional data were registered to 2-mm Montreal described events as happening to themselves and to experience Neurological Institute (MNI) 152 standard space template using the corresponding feeling as vividly as possible. FLIRT (Jenkinson et al. 2002) and 9 degrees of freedom, and Auditory stimuli were delivered through Sensimetrics S14 smoothed using a Gaussian kernel with FMWH 8. insert earphones (Sensimetrics Corporation, Malden, MA, USA). Sound was adjusted for each participant to be loud enough to be Multivariate pattern classification within participants heard over the scanner noise. The visual stimuli were delivered using Presentation software (Neurobehavioral Systems Inc., The classification of emotion categories was performed using Albany, CA, USA), and they were back-projected on a semitrans- the Princeton multi-voxel pattern analysis (MVPA) toolbox parent screen using a 3-micromirror data projector (Christie X3, (https://pni.princeton.edu/pni-software-tools/mvpa-toolbox) Christie Digital Systems Ltd., Mo ¨ nchengladbach, Germany), and in Matlab 2012b using each participant’s data in native space. from there via a mirror to the participant. A separate classifier was trained for each participant and, after After the scanning, participants were presented with the nar- all steps, the classification results were averaged across the par- ratives again. For each narrative, the participants rated the felt ticipants. After preprocessing, voxels outside gray matter were intensity of each of the 14 emotions (plus neutral state) using a masked out using the participant-specific gray matter masks continuos scale arbitrarily ranging from 0 to 1. (for details, see and the functional data from each run were standardized to Supplementary Material). We ran k-means clustering on the re- have a mean of zero and variance of one. Next, each partici- sulting intensity profiles (i.e. intensity ratings per category for pant’s data were divided into training (N1 runs) and testing each narrative) to test whether the 15 target categories could be sets (the remaining run). Feature selection was performed using identified from the emotion-wise intensity ratings, thus reveal- one-way ANOVA (testing for the main effect of emotion cat- ing whether or not the narratives elicited distinct categorical egory) for the training set to select the voxels with a significant emotions. Also, to test the similarity of narratives belonging to (P< 0.05) main effect for emotion, i.e. to select the voxels whose the same category, we calculated the Euclidean distance be- mean activation differed between at least some of the 15 pos- tween the intensity profiles of each pair of narratives. sible emotion conditions. The feature selection preserved on Finally, participants rated the similarity of all possible pairs of average (across cross-validation folds and participants) 41% of the emotions induced during the experiment (a total of 105 pairs) the voxels. Hemodynamic lag was corrected for by convolving using a direct comparison method. The participants were shown the boxcar category regressors with the canonical double one pair of emotion words at the time and asked to rate how sub- gamma HRF function and thresholding the convolved regres- jectively similar they experience the emotions (ranging from no sors using a sigmoid function to return the regressors to the bin- similarity [0] to full similarity [1]). Based on the ratings, we ex- ary form. The classification was performed on all the tracted the recollected experiential similarity matrix for each par- standardized, HRF-convolved fMRI volumes from the 10 s im- ticipant individually and averaged these across all participants. agery period following the narrative (treating all single time points per category as samples for that category; median 6 vol- umes per one emotion category in one run) to extract only emo- MRI data acquisition and preprocessing tion-related brain activity, and to minimize activity related to MRI data were collected on a 3-T Siemens Magnetom Skyra scan- the acoustic and semantic features of the stimuli. Thus, each of ner at the Advanced Magnetic Imaging center, Aalto University, the eight runs included on average 90 TRs (5–6 TRs per category) using a 20-channel Siemens volume coil. Whole-brain functional that were used in the classification. A linear neural network scans were collected using a whole brain T2*-weighted EPI se- classifier without hidden layers was trained to recognize the quence with the following parameters: 33 axial slices, TR¼ 1.7 s, correct emotion category out of 15 possible ones (multiclass TE¼ 24 ms, flip angle¼ 70 , voxel size¼ 3.13.14.0 mm , matrix classification, see Polyn et al. 2005 for details). Naı ¨ve chance size¼ 646433, FOV¼ 256256 mm. A custom-modified bipolar level, derived as a ratio of 1 over the number of categories, was water excitation radio frequency (RF) pulse was used to avoid 6.7%. A leave-one-run-out cross-validation was performed, signal from fat. High-resolution anatomical images with iso- thus, dividing the data into all possible N1 run combinations tropic 111mm voxel size were collected using a T1-weighted and repeating the classification pipeline for each such cross- MP-RAGE sequence. validation fold, and the participant-wise classification accuracy Data were preprocessed using FSL 5.0 (Jenkinson et al., 2012). was calculated as an average percentage of correct guesses Motion correction was performed using MCFLIRT (Jenkinson across all the cross-validation folds. To test whether Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 474 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 classification accuracy exceeded chance level, we used permu- each of the 14 emotions against the neutral baseline. First level tation tests to simulate the probability distribution of the classi- analysis was performed in SPM 12 (wwwl.fil.ion.ucl.ac.uk/spm/) fication by shuffling the category labels of the training set to obtain individual contrast maps and second level analysis (across training set runs) and re-running the whole classifica- was then performed with FSL randomize with the threshold free tion pipeline, repeated 1000 times for each participant. FDR cor- cluster enhancement option as it implements the least biased rection at P < 0.05 was used for multiple comparisons. way to control for multiple comparisons (Eklund et al. 2016; N¼ 5000 permutations). Emotion-wise t maps were then quali- tatively summarized across emotions as a cumulative map Hierarchical clustering where each voxel shows the number of statistically significant emotions, at the cluster corrected level of P< 0.05. We next investigated the similarities between emotions using the category confusions from the whole-brain classification. Clustering was performed for exploratory purposes to charac- Visualization of emotion clusters in the brain terize the similarities in neural and recollected experiential Finally, to summarize and visualize where emotion-wise activa- data, this however does not provide statistical evidence for the tion patterns were located, we mapped the three principal clusters similarity structures. Nevertheless, similarity between neural obtained with hierarchical clustering on the cortical and subcor- and experiential data would suggest similar representational tical maps using R, G, B color space. The last cluster containing sur- segregation in the neural code and subjective feelings. From the prise and neutral was plotted separately (see Supplementary group-averaged confusion matrix, we calculated a distance ma- Figure S6) given it contained neutral-like states only. For each emo- trix by taking the category confusion vectors for each pair of tion, we took the unthresholded second level t maps obtained emotions and by calculating the Euclidean distance between from the GLM analysis, summed them for emotions belonging to these vectors (see Reyes-Vargas et al., 2013). We then employed the same cluster, and assigned the summed values to the corres- hierarchical cluster analysis in Matlab to investigate how differ- ponding R, G, B channels. The color channels were subsequently ent emotions cluster together based on their neural similarities. visualized in MNI space. Consequently, the RGB color at each voxel The agglomerative hierarchical cluster tree was calculated on reflects the cluster distribution of that voxel, and can be used for the distance matrix using linkage function with ‘complete’ option localizing brain regions contributing to different emotions. (i.e. the furthest distance method). Finally, we constructed the clusters from the cluster tree (cluster function) and chose the solution that minimized the number of categories while keep- Results ing at least two emotions per category. Note that the clustering Behavioral results was selected solely for data visualization rather than for statis- tical inference. To visualize the similarities in subjective and Behavioral ratings showed that the narratives successfully eli- neural organization of emotions, we extracted the clusters in cited reliable and strong target emotions (intensity profiles in both neural and recollected experiential data, and subsequently Figure 1A; mean intensities per target category: pride 0.80, long- plotted the cluster solutions using alluvial diagrams (Rosvall ing for 0.77, happiness 0.84, gratitude 0.84, love 0.90, surprise 0.76, and Bergstrom, 2010; www.mapequation.org). neutral 0.88, disgust 0.85, sadness 0.96, fear 0.89, shame 0.75, We then investigated to which extent the neural similarities anger 0.68, guilt 0.91, contempt 0.66, despair 0.87). In k-means between different emotional states correspond to their recol- clustering, the accuracy to assign a narrative to the correct target lected experiential (subjectively felt) differences. Experiential category based on its intensity profile was 97% (against the similarity matrices were calculated based on pairwise similarity chance level 6.7%). Also, the narratives within each category had ratings of emotions and averaged over the participants. highly similar intensity profiles (Figure 1B); i.e. narratives belong- Subsequently, the mean neural and experiential similarity ing to the same category elicited similar emotions. matrices were correlated using Spearman’s rank correlation co- efficient. The P level for the Spearman test was obtained with a Classification of basic and non-basic emotions permutation test by shuffling the neural matrix and re- calculating the correlation for 5000 times. Mean classification accuracy across the 14 emotions and the Finally, we tested whether basic and non-basic emotions neutral state was 17% (against naı ¨ve chance level of 6.7%; 95th generally differ in how categorical their experience is recog- percentile of the permuted classification accuracy distribution nized. To do so, we calculated prototypical experience scores for was 8.4%). After correcting for multiple comparisons, the classi- each emotion, defined as the sum of off-diagonal elements in fication performance was above a permutation-based signifi- the experiential and neural similarity matrices separately. This cance level for all emotions except shame and longing (P< 0.05, analysis was based on the assumption that the off-diagonal Figure 2; see Supplementary Table S2 for effect sizes and elements represent confusions across emotions, thus indicating Supplementary Figure S1 for a confusion matrix). On average, a lack of sharp categorical representation within a group of basic emotions (anger, disgust, fear, happiness, sadness, sur- emotions. In other words, we tested whether emotion confu- prise) could be classified more accurately than the non-basic sions were systematically different for basic and non-basic emotions (26% vs 15%, respectively, t(24) ¼ 7.39, P< 0.0001, types, in both subjective and neural data. The resulting ‘prototy- Cohen’s h¼ 0.16), while we found no significant differences in picality’ scores (i.e. off-diagonal scores) for basic and non-basic classifier accuracies between positive and negative emotions emotions were then compared using Mann–Whitney U test. (17% vs 14%, respectively, t(24) ¼ 1.52, P¼ 0.075). We also trained separate classifiers for each a priori selected region-of-interest (see Supplementary Methods) which showed that classification Regional effects in GLM accuracies were above chance level in frontal ROIs, especially in To investigate the overall effect of any emotion on brain activ- frontal pole, and in somatomotor ROIs, especially for pre- and ity, we first ran GLM to compare all emotions together against post-central gyri, yet did not exceed that of the whole-brain the neutral baseline, and then ran separate GLMs to compare classification (Supplementary Figure S2). Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 H. Saarima ¨ki et al. | 475 We also ran separate classifiers across positively and nega- tively valenced emotion categories to test whether classification within a similarly valenced superordinate category is compar- able to that of classification with all categories. Average classi- fier accuracy for positive emotions was 30.5% (naı ¨ve chance level 20%) and all emotions could be classified with above- chance level accuracy: love (36.9%), happiness (31.5%), gratitude (30.8%), pride (28.5%), and longing (24.9%). Average classifier ac- curacy for negative emotions was 22.4% (naı ¨ve chance level 12.5%) and above chance level accuracy was observed for each emotion: sadness (28.8%), disgust (27.8%), despair (25.7%), fear (25.1%), anger (20.5%), contempt (19.0%), and shame (15.9%). Similarity in neural basis corresponds to experienced similarity Organization of recollected experiential similarity matrices derived from behavioral ratings was significantly associated with the neural similarity matrices derived from confusion matrices from whole-brain classification (r¼ 0.37, P¼ 0.0048; Figure 3A). Clustering of confusion matrices divided the emo- tions into four clusters (Figure 3A): (1) happiness, pride, grati- tude, love, and longing; (2) surprise and neutral; (3) disgust, sadness, fear, and shame; and (4) anger, contempt, guilt, and despair, which mostly corresponded to the four main clusters of experiential similarities (Figure 3B): (1) happiness, pride, grati- tude, and love; (2) surprise and neutral; (3) longing and sadness; and (4) disgust, anger, contempt, shame, guilt, fear, and despair. To test whether basic and non-basic emotions generally dif- fer in how prototypical their experience is, we calculated a prototypical experience score for each emotion (i.e. rate of con- fusions with other emotions) and then compared the average prototypicality scores for basic and non-basic emotions. There were no differences between basic and non-basic in either the neural or experiential data. Fig. 1. The stimuli consisted of 60 brief (5–20 s) narratives that induced 14 emo- tional states and a neutral state. (A) Participants rated on a scale from 0 to 1 how strongly each emotion was elicited by the narrative. The coloring indexes the Affective space in the brain mean intensity for experiencing each emotion for each narrative. (B) Based on the emotion intensity ratings, we calculated the similarity of emotional experi- To investigate the brain regions generally activated and deacti- ence between narratives by using Euclidean distances. vated by our emotion stimuli, we contrasted the brain activity Fig. 2. Means and standard errors for emotion-wise whole-brain classification accuracy. Dashed line represents chance level (6.7%). Colors reflect the clusters formed on the basis of experienced similarity of emotions (see Figure 3). Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 476 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 Fig. 3. (A) Left: Neural similarity matrix extracted from the classifier confusion matrix. The similarity matrix was created by calculating the Euclidean distance between each pair of emotions based on their category confusion vectors. Right: Experiential similarity matrix based on pairwise similarity ratings for emotions elicited by the narratives. (B) Alluvial diagram showing the similarity of hierarchical cluster structure of the experiential and neural similarities. Coloring of the emotion categories is based on the clusters in the neural similarity matrix. related to all emotions with the neutral condition (Figure 4 out- cumulative deactivation map (Figure 4B) showed that most lines; see also Supplementary Figure S4). The areas activated by emotions involved deactivation of auditory cortex, frontal areas emotion in general included pre-motor cortex, thalamus, insula including superior frontal gyri, right middle frontal gyrus, and and putamen. Deactivations were observed in visual and audi- left inferior frontal gyrus and parietal areas including supramar- tory cortices, precuneus, PCC, right anterior PFC and right lateral ginal gyrus. parietal areas. Finally, to visualize where specific emotions are encoded in To reveal the brain regions contributing most consistently to the brain, we mapped the clusters resulting from the hierarch- different emotions, we constructed cumulative activation and ical clustering on cortical and subcortical surfaces (Figure 5). deactivation maps of emotion-driven hemodynamic responses All emotions activated areas in the visual cortex, ACC, right (Figure 4; corresponding effect size maps in Supplementary temporal pole, supplementary motor area and subcortical re- Figure S4; the emotion-wise statistical t maps are available in gions. In addition, positive emotions belonging to Cluster 1 http://neurovault.org/collections/TWZSVODU). These maps re- (happiness, pride, gratitude, love, longing) were more promin- veal how many of the 14 possible emotional states activated ent in anterior frontal areas including vmPFC. Negative basic each voxel. The cumulative activation map (Figure 4A) showed emotions from Cluster 2 (disgust, sadness, fear, shame) acti- that most emotions involved activation of midline regions vated especially insula, supplementary motor area and specific including anterior cingulate cortex (ACC) and precuneus, as well parts of subcortical structures. Negative social emotions belong- as subcortical regions including brain stem and hippocampus, ing to Cluster 3 (anger, contempt, guilt, despair; see motor areas including cerebellum, and visual cortex. The Supplementary Table S1 for the story stimuli targeting different Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 H. Saarima ¨ki et al. | 477 Fig. 4. (A) Cumulative activation map showing the cumulative sum of binarized t maps (P< 0.05, cluster-corrected) across each emotion vs neutral condition. Outline shows the GLM results for all emotions contrasted against the neutral condition (P< 0.05, cluster-corrected). (B) Cumulative deactivation map showing the cumulative sum of binarized t maps (P< 0.05, cluster-corrected) across neutral vs each emotion. Outline shows the GLM results for the neutral condition contrasted against all emotions (P< 0.05, cluster-corrected). emotions) were most prominent in left insula and the adjacent evidenced by the correspondence between neural and recol- frontal areas. Finally, surprise (Cluster 4, Supplementary Figure lected experiential similarity between emotions. S6) activated especially parts of auditory cortex, supplementary motor areas, and left insula. Different emotions are characterized by distinct neural signatures Discussion Altogether 12 emotions (excluding longing for and shame) out Our results reveal that multiple emotion states have distinct of the 14 included in the study could be reliably classified from and distributed neural bases, as evidenced by the above chance- the fMRI signals. Our results extend previous studies, which level classifier performance for all emotions, except longing and have shown classification of specific emotional states usually shame. Together with the spatial location of emotion- focusing on the basic emotions only or a subset of these dependent brain activation, this suggests that a multitude of (Ethofer et al., 2009; Peelen et al., 2010; Said et al., 2010; Kotz et al., different emotions are represented in the brain in a distinguish- 2013; for a review, see Kragel and LaBar, 2014). While the ‘clas- able manner, yet in partly overlapping regions: each emotion sic’ basic emotions have attracted most attention in psycho- state likely modulates different functional systems of the brain logical and neurophysiological studies, they constitute only a differently, as shown by distinct patterns measured with BOLD– small portion of the emotions humans universally experience fMRI, and the overall configuration of the regional activation (Edelstein and Shaver, 2007). Furthermore, accumulating behav- patterns defines the resulting emotion. While a set of ‘core’ ioral evidence suggests that other, non-basic emotions are also emotion processing areas in cortical midline regions, motor characterized by distinctive features in facial expressions areas, sensory areas, and subcortical regions are engaged during (Baron-Cohen et al., 2001; Shaw et al., 2005), bodily changes practically all emotions, the relative engagement of these areas (Nummenmaa et al., 2014a), and physiology (Kreibig, 2010; varies between emotions. This unique neural signature of each Kragel and LaBar, 2013). The present data corroborate these emotion might relate to the corresponding subjective feeling, as findings by showing that also emotions not considered as ‘basic’ Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 478 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 Fig. 5. Activation maps showing the summed uncorrected t maps for each cluster obtained from the hierarchical clustering analysis in (A) cortical regions and (B) sub- cortical regions. Colors represent the three clusters: positive (red), negative basic (green), and negative social (blue) emotions. may each have distinctive brain activation patterns. The local readily reveal the actual neural organization of each emotion brain activity patterns underlying different emotions are most system, as the pattern classification only tells us that, on aver- probably to some extend variable across participants and reflect age and at the level measurable with BOLD–fMRI, the cortical/ individual responses, as brains are always intrinsically shaped neuronal pattern underlying each category differ enough to be by individual development and experiences. separated, whereas localizing the actual source of differences is If we consider discrete emotion systems as wide-spread, dis- more difficult. Therefore, we have complemented the pattern tinct neural activation distinct to each emotion state, successful classification analysis with visualization of different emotion pattern classification of brain states across emotions would pro- categories using GLM and clustering. Furthermore, the pattern vide support for separate neural systems for each emotion. In recognition techniques employed in this study cannot provide turn, constructivist emotion theories suggest that all emotions causal evidence for the existence of basic emotion systems: are generated by a shared set of fundamental functional sys- even if we find emotion-specific activation patterns for specific tems which are however not specific to emotional processing emotions, it does not prove that these patterns are strictly ne- per se (Kober et al., 2008; Lindquist et al., 2012). The present data cessary for the corresponding emotional state (Nummenmaa show that different emotions are associated with granular acti- and Saarima ¨ ki, 2017). The causality issues can be resolved, for vation changes across multiple functional systems, and their instance, in lesion studies or with brain stimulation techniques. spatially distributed configuration ultimately defines the spe- Moreover, the current analyses do not directly answer whether cific emotion at both psychological and behavioral levels the data are better described by discrete versus dimensional (Meaux and Vuilleumier, 2015). For instance, two emotions models of emotion. might share their somatosensory representations, but underly- If basic emotions were somehow ‘special’ or different from ing interoceptive representations could be different. Thus, the non-basic emotions at the neural level, we should observe (1) general configuration of the central and peripheral nervous sys- distinct neural activation patterns for basic emotions but not tem leads to distinct emotion states. for non-basic emotions, or (2) different (or perhaps additional) However, we stress that the current data cannot resolve neural systems underlying basic and non-basic emotions. Our whether these functional signatures of distinct emotions would classification results and cumulative maps show that both basic be necessary for each emotion, or whether the presently and non-basic emotions could be classified accurately, and they observed structure of the emotions is optimal, as the classifica- elicited activation in largely overlapping brain areas. On aver- tion solution is contingent on the employed a priori category age, classification accuracies were higher for basic emotions labels. Further, it must be noted classification analysis does not than for non-basic emotions. This suggests that while there Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 H. Saarima ¨ki et al. | 479 exists partially exclusive neural codes for a multitude of emo- et al., 2014), participate in self-relevant and introspective pro- tion states, as those investigated here, the canonical basic emo- cessing (Northoff and Bermpohl, 2004; Northoff et al., 2006), and tions are more discrete than the complex social emotions integrate information of internal, mental, and bodily states included in this study, suggesting that there are both universal (Northoff et al., 2006; Buckner and Carroll, 2007). We also found and experience-dependent components to emotions. Also, clas- consistent emotion-dependent activity in the brainstem, sification accuracies within single regions showed that only including periaqueductal grey, pons and medulla, for almost all basic emotions could be distinguished in some areas, including emotions. This activation might reflect the control of autonomic somatomotor regions (insula and supplementary motor area), nervous system’s reactions to different emotions (Critchley midline areas (PCC and ACC), and inferior frontal gyrus et al., 2005; Linnman et al., 2012) and/or covert activation of par- (Supplementary Figure S3). Another potential explanation for ticular motor programs (Blakemore et al., 2016). Also, subcortical the differences in classification accuracies between basic and regions including amygdala and thalamus showed distinct ac- non-basic emotions is that maybe there is a clearer cultural tivity patterns that differed between clusters. Both of these re- understanding or prototypical experience related with basic gions are related to salience processing and emotional arousal emotions, manifested as clearer patterns underlying these emo- (Anders et al., 2004; Adolphs, 2010; Damasio and Carvalho, 2013; tions. To test this hypothesis, we calculated prototypical experi- Kragel and LaBar, 2014) and show specific activation patterns ence scores for each emotion, defined as the sum of off- for basic emotions (Wang et al., 2014), findings that we now ex- diagonal elements in recollected experiential and neural tent also to non-basic emotions. similarity matrices separately, and compared the average proto- Second, somatomotor areas including premotor cortex, cere- typicality scores for basic and non-basic emotions. This allowed bellum (including vermis and the anterior lobe), globus pallidus, us to quantify the specificity vs confusion of recognition for dif- caudate nucleus, and posterior insula were activated during ferent emotion categories. There were no differences between most emotions, but according to the cluster visualizations espe- basic and non-basic in either the neural or recollected experien- cially during the processing of emotions that have a strong im- tial data, suggesting that both basic and non-basic emotions pact on action tendencies and avoidance-oriented behaviors have equally distinct neural and experiential underpinnings. (fear, disgust, sadness, shame, surprise; Frijda et al. 1989). These areas are engaged during emotion perception (Nummenmaa et al., 2008, 2012; Pichon et al., 2008), emotion regulation Correspondence between neural and phenomenological (Schutter and van Honk, 2009), and somatomotor processing organization of emotions and action preparation related to emotional processing (Kohler et al., 2002; Wicker et al., 2003; Mazzola et al., 2013). An explorative hierarchical clustering of emotion-specific neu- Third, anterior prefrontal cortex was activated especially ral patterns identified four clusters in the neural data that cor- during positive emotions (happiness, love, pride, gratitude, respond to positive emotions (pride, longing, happiness, longing) according with previous research linking anterior pre- gratitude, love), negative basic emotions (disgust, sadness, fear, frontal cortex with positive affect (Bartels and Zeki, 2004; Zahn shame) and negative social emotions (anger, guilt, contempt, et al., 2009; Vytal and Hamann, 2010). Fourth, negative emotions despair) and surprise (Figure 3). Clustering is an exploratory and such as guilt, contempt, anger, and despair clustered together, descriptive technique that does not allow strong inferences potentially reflecting their social dimension and interpersonal about underlying causal structure. Rather, clustering of neural aspects or their self-conscious nature. Especially, left hemi- similarities was used to reduce the dimensionality of the data sphere activation in orbitofrontal cortex connected to rewards for visualization purposes. Comparison of subjectively experi- and punishments (Kringelbach and Rolls, 2004), as well as in in- enced similarity of emotions and the similarity of the neural ferior frontal cortex and dorsolateral prefrontal cortex, which patterns suggested a direct link between the whole-brain neural subserve language and executive control (Poldrack et al., 1999; signatures of emotions and the corresponding subjective feel- Kane and Engle, 2002), and in anterior insula linked to process- ings: the more similar neural signatures two emotions had, the ing of social emotions (Lamm and Singer, 2010) was activated more similar they were experienced. This accords with prior during these emotions. Fifth, surprise did not resemble any of work suggesting that emotion-specific neural activation pat- the other emotions included in this study, but was instead clos- terns might explain why each emotion feels subjectively differ- est to the neutral state. This is in line with previous research ent (Damasio et al., 2000; Saarima ¨ki et al., 2016). Emotions might showing that surprise tends to separate from other emotions in constitute distinct activity patterns in regions processing differ- subjective ratings (Toivonen et al., 2012). ent emotion-related information, such as somatosensory (bod- Finally, we also found decreased activation in auditory areas ily sensations), motor (actions), as well as brainstem and and increased activation in visual areas during the imagery of thalamocortical loops (physiological arousal). Activation from all emotion categories, likely reflecting the offset of the auditory these areas is then integrated in the cortical midline, such inte- stimulation followed by mental imagery of the emotion- gration then giving rise to the interpretation of the subjective evoking situation (Ganis et al., 2004) and emotion-related modu- feeling (Northoff and Bermpohl, 2004; Northoff et al., 2006). lation of this activity (Holmes and Mathews, 2005; Thus, a subjective feeling of a specific emotion stems from the Nummenmaa et al., 2012; Kassam et al., 2013). net activation of different sub-processes, rather than solely on the basis of any single component of emotion. Limitations Organization of the affective space in the brain Despite the careful selection of the emotional stimuli and k- Our hierarchical clustering analysis and cumulative mapping means clustering suggesting clear categorical structure in the reveal how different patterns of activity may give rise to differ- evoked affect (see Figure 1), it is possible that the narratives did ent emotions. First, midline regions including ACC, PCC, and not fully capture the target emotion only, and might have eli- precuneus were activated during most emotions. These regions cited also a mixture of emotions. Yet these may (1) arise in dif- might code emotional valence (Colibazzi et al., 2010; Chikazoe ferent time points during the narratives and (2) be not as strong Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 480 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 as the main target emotions (see Figure 1), thus average trial- References wise activations most likely pertains to the target emotion. Adolphs, R. (2002a). Recognizing emotion from facial expres- Despite this, the observed MVPA pattern may reflect whether sions: psychological and neurological mechanisms. Behavioral each narrative is dominated by one emotion, or at least show and Cognitive Neuroscience Reviews, 1, 21–62. the weighted influence of each emotion on a voxel activity. Adolphs, R. (2002b). Neural systems for recognizing emotion. Therefore, the successful classification per se shows that at least Current Opinion in Neurobiology, 12(2),169–77. the target emotions were successfully elicited, yet, better classi- Adolphs, R. (2010). What does the amygdala contribute to social fication accuracies could potentially be reached if stimuli could cognition?. Annals of the New York Academy of Sciences, 1191, target more selectively one category at the time. 42–61. Emotions were elicited using narrative-guided emotional Adolphs, R. (2017). How should neuroscience study emotions? by imagery. The narrative stimuli across different categories of distinguishing emotion states, concepts, and experiences. emotion differed along dimensions that are not of interest (for Social Cognitive and Affective Neuroscience, 12, 24–31. instance, different words or different kinds of imagery Adolphs, R., Damasio, H., Tranel, D., Cooper, G., Damasio, A.R. prompted), thus creating variation both within and between (2000). A role for somato-sensory cortices in the visual recogni- emotion categories, though the latter is unlikely to systematic- tion of emotion as revealed by three-dimensional lesion map- ally correlate with particular emotions. Moreover, any within- ping. Journal of Neuroscience, 20, 2683–90. category variation would probably work against the classifier by Anders, S., Lotze, M., Erb, M., Grodd, W., Birbaumer, N. (2004). lowering the classification accuracy. Despite this, we observed Brain activity underlying emotional valence and arousal: a re- above chance level accuracy for all emotion categories except sponse-related fMRI study. Human Brain Mapping, 23(4), 200–9. longing and shame. However, it is possible that there were dif- Baron-Cohen, S., Wheelwright, S., Hill, J., Raste, Y., Plumb, I. ferences between categories in, for instance, evoked emotional (2001). The “Reading the Mind in the Eyes” test revised version: imagery, that vary across emotion categories in some uncon- a study with normal adults, and adults with Asperger syn- trolled manner, thus potentially in part affecting the classifica- drome or high-functioning autism. Journal of Child Psychology tion accuracy. This may constitute a limitation of our study but and Psychiatry, 42(2), 241–51. is also inherently related to variations in scenarios and ap- Barrett, L.F. (2017). The theory of constructed emotion: an active praisal dimensions that constitute distinct emotion types. inference account of interoception and categorization. Social Cognitive and Affective Neuroscience, 12(11), 1833–23. Bartels, A., Zeki, S. (2004). The neural correlates of maternal and romantic love. Neuroimage, 21(3), 1155–66. Conclusions Blakemore, R.L., Rieger, S.W., Vuilleumier, P. (2016). Negative Our results characterize the distinct and distributed neural sig- emotions facilitate isometric force through activation of pre- natures of multiple emotional states. Different emotions result frontal cortex and periaqueductal gray. Neuroimage, 124(Pt A), from differential activation patterns within a shared neural cir- 627–40. cuitry, mostly consisting of midline regions, motor areas and Buckner, R.L., Carroll, D.C. (2007). Self-projection and the brain. subcortical regions. The more similar the neural underpinnings Trends in Cognitive Science, 11(2), 49–57. of these emotions, the more similarly they also are experienced. Chandrasekhar, P.V., Capra, C.M., Moore, S., Noussair, C., Berns, We suggest that the relative engagement of different parts of G.S. (2008). Neurobiological regret and rejoice functions for this system defines the current emotional state. aversive outcomes. Neuroimage, 39(3), 1472–84. Chikazoe, J., Lee, D.H., Kriegeskorte, N., Anderson, A.K. (2014). Population coding of affect across stimuli, modalities and indi- Supplementary data viduals. Nature Neuroscience, 17(8), 1114–22. Colibazzi, T., Posner, J., Wang, Z., et al. (2010). Neural systems Supplementary data are available at SCAN online. subserving valence and arousal during the experience of induced emotions. Emotion, 10(3), 377–89. Coricelli, G., Critchley, H.D., Joffily, M., O’Doherty, J.P., Sirigu, A., Acknowledgements Dolan, R.J. (2005). Regret and its avoidance: a neuroimaging study of choice behavior. Nature Neuroscience, 8(9), 1255–62. We acknowledge the computational resources provided by Costa, V.D., Lang, P.J., Sabatinelli, D., Versace, F., Bradley, M.M. the Aalto Science-IT project. We thank Marita Kattelus for (2010). Emotional imagery: assessing pleasure and arousal in her help in fMRI data collection and Matthew Hudson for a the brain’s reward circuitry. Human Brain Mapping, 31(9), language check of the manuscript. 1446–57. Critchley, H.D., Rotshtein, P., Nagai, Y., O’Doherty, J., Mathias, C.J., Dolan, R.J. (2005). Activity in the human brain predicting Funding differential heart rate responses to emotional facial expres- sions. Neuroimage, 24(3), 751–62. This study was supported by the aivoAALTO project of the Cunningham, W.A. (2013). Introduction to special section: psy- Aalto University, Academy of Finland (#265917 and #294897 chological constructivism. Emotion Review, 5(4), 333–4. to L.N., and #138145 and #276643 to I.P.J.), ERC Starting Grant Damasio, A.R. (1999). The Feeling of What Happens: Body and (#313000 to L.N.); Finnish Cultural Foundation (#00140220 to Emotion in the Making of Consciousness. San Diego: Houghton H.S.); and the Swiss National Science Foundation National Mifflin Harcourt. Center of Competence in Research for Affective Sciences Damasio, A.R., Grabowski, T.J., Bechara, A., et al. (2000). (#51NF40-104897 to P.V.). Subcortical and cortical brain activity during the feeling of Conflict of interest. None declared. self-generated emotions. Nature Neuroscience, 3(10), 1049–56. Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 H. Saarima ¨ki et al. | 481 Kober, H., Barrett, L.F., Joseph, J., Bliss-Moreau, E., Lindquist, K., Damasio, A.R., Carvalho, G.B. (2013). The nature of feelings: evo- lutionary and neurobiological origins. Nature Reviews Wager, T.D. (2008). Functional grouping and Neuroscience, 14(2), 143–52. cortical-subcortical interactions in emotion: a meta-analysis de Gelder, B., Snyder, J., Greve, D., Gerard, G., Hadjikhani, N. of neuroimaging studies. Neuroimage, 42(2), 998–1031. (2004). Fear fosters flight: a mechanism for fear contagion Kohler, E., Keysers, C., Umilta, M.A., Fogassi, L., Gallese, V., when perceiving emotion expressed by a whole body. Rizzolatti, G. (2002). Hearing sounds, understanding actions: Proceedings of the National Academy of Sciences of the USA, 101(47), action representation in mirror neurons. Science, 297(5582), 16701–6. 846–8. Edelstein, R.S., Shaver, P.R. (2007). A cross-cultural examination Kotz, S.A., Kalberlah, C., Bahlmann, J., Friederici, A.D., Haynes, of lexical studies of self-conscious emotions. In: Tracy, J.L., J.D. (2013). Predicting vocal emotion expressions from the Robins, R.W., Tangney, J.P., editors. The Self-Conscious Emotions: human brain. Human Brain Mapping, 34(8), 1971–81. Theory and Research. New York: Guilford Press. 194–208. Kragel, P.A., LaBar, K.S. (2013). Multivariate pattern classification Eklund, A., Nichols, T.E., Knutsson, H. (2016). Cluster failure: why reveals autonomic and experiential representations of discrete fMRI inferences for spatial extent have inflated false-positive emotions. Emotion, 13(4), 681–90. Kragel, P.A., LaBar, K.S. (2014). Advancing emotion theory with rates. Proceedings of the National Academy of Sciences of the USA, 113(28), 7900–5. multivariate pattern classification. Emotion Review, 6(2), 160–74. Ekman, P. (1992). An argument for basic emotions. Cognition Kragel, P.A., LaBar, K.S. (2015). Multivariate neural biomarkers of Emotion, 6(3–4), 169–200. emotional states are categorically distinct. Social Cognitive and Ekman, P. (1999). Facial expressions. In: Dalgleish, T., Power. M., Affective Neuroscience, 10(11), 1437–48. editors. Handbook of Cognition and Emotion. New York: John Kragel, P.A., LaBar, K.S. (2016). Decoding the nature of emotion in Wiley & Sons Ltd. 301–20. the brain. Trends in Cognitive Science, 20(6), 444–55. Ekman, P., Cordaro, D. (2011). What is meant by calling emotions Kreibig, S.D. (2010). Autonomic nervous system activity in emo- basic. Emotion Review, 3(4), 364–70. tion: a review. Biological Psychology, 84(3), 394–421. Eryilmaz, H., Van De Ville, D., Schwartz, S., Vuilleumier, P. (2011). Kringelbach, M.L., Rolls, E.T. (2004). The functional neuroanat- Impact of transient emotions on functional connectivity dur- omy of the human orbitofrontal cortex: evidence from neuroi- ing subsequent resting state: a wavelet correlation approach. maging and neuropsychology. Progress in Neurobiology, 72(5), Neuroimage, 54(3), 2481–91. 341–72. Ethofer, T., Van De Ville, D., Scherer, K., Vuilleumier, P. (2009). Lamm, C., Singer, T. (2010). The role of anterior insular cortex in Decoding of emotional information in voice-sensitive cortices. social emotions. Brain Structure and Function, 214(5–6), 579–91. Current Biology, 19(12), 1028–33. Lewis, M.D., Liu, Z.X. (2011). Three time scales of neural Fischer, A., LaFrance, M. (2015). What drives the smile and the self-organization underlying basic and nonbasic emotions. tear: why women are more emotionally expressive than men. Emotion Review, 3(4), 416–23. Emotion Review, 7(1), 22–9. Lindquist, K.A., Wager, T.D., Kober, H., Bliss-Moreau, E., Barrett, Frijda, N.H., Kuipers, P., Ter Schure, E. (1989). Relations among L.F. (2012). The brain basis of emotion: a meta-analytic review. emotion, appraisal, and emotional action readiness. Journal of Behavioral and Brain Science, 35(3), 121–43. Personality and Social Psychology, 57(2), 212–28. Linnman, C., Moulton, E.A., Barmettler, G., Becerra, L., Borsook, Ganis, G., Thompson, W.L., Kosslyn, S.M. (2004). Brain areas D. (2012). Neuroimaging of the periaqueductal gray: state of underlying visual mental imagery and visual perception: an the field. Neuroimage, 60(1), 505–22. fMRI study. Cognitive Brain Research, 20(2), 226–41. Mazzola, V., Vuilleumier, P., Latorre, V., et al. (2013). Effects of Grossman, M., Wood, W. (1993). Sex differences in intensity of emotional contexts on cerebello-thalamo-cortical activity dur- emotional experience: a social role interpretation. Journal of ing action observation. PLoS ONE, 8(9), e75912. Personality and Social Psychology, 65(5), 1010–22. Meaux, E., Vuilleumier, P. 2015. Emotion perception and elicit- Hamann, S. (2012). Mapping discrete and dimensional emotions ation. In: Toga, A.W., editor. Brain Mapping: An Encyclopedic onto the brain: controversies and consensus. Trends in Reference. Oxford (UK), Elsevier. Cognitive Science, 16(9), 458–66. Northoff, G., Bermpohl, F. (2004). Cortical midline structures and Hofer, A., Siedentopf, C.M., Ischebeck, A., et al. (2006). Gender dif- the self. Trends in Cognitive Science, 8(3), 102–7. ferences in regional cerebral activity during the perception of Northoff, G., Heinzel, A., de Greck, M., Bermpohl, F., Dobrowolny, emotion: a functional MRI study. Neuroimage, 32(2), 854–62. H., Panksepp, J. (2006). Self-referential processing in our Holmes, E., Mathews, A. (2005). Mental imagery and emotion: a brain—a meta-analysis of imaging studies on the self. special relationship?. Emotion, 5(4), 489–97. Neuroimage, 31(1), 440–57. Jenkinson, M., Bannister, P.R., Brady, J.M., Smith, S.M. (2002). Nummenmaa, L., Hirvonen, J., Parkkola, R., Hietanen, J.K. (2008). Improved optimization for the robust and accurate linear Is emotional contagion special? An fMRI study on neural sys- registration and motion correction of brain images. tems for affective and cognitive empathy. Neuroimage, 43(3), Neuroimage, 17(2), 825–41. 571–80. Jenkinson, M., Beckmann, C.F., Behrens, T.E.J., Woolrich, M.W., Nummenmaa, L., Glerean, E., Viinikainen, M., Jaaskelainen, I.P., Smith, S.M. (2012). FSL. Neuroimage, 62(2), 782–90. Hari, R., Sams, M. (2012). Emotions promote social interaction Kane, M.J., Engle, R.W. (2002). The role of prefrontal cortex in by synchronizing brain activity across individuals. Proceedings working-memory capacity, executive attention, and general of the National Academy of Sciences of USA, 109(24), 9599–604. fluid intelligence: an individual-differences perspective. Nummenmaa, L., Glerean, E., Hari, R., Hietanen, J.K. (2014a). Psychon B Review, 9(4), 637–71. Bodily maps of emotions. Proceedings of the National Academy of Kassam, K.S., Markey, A.R., Cherkassky, V.L., Loewenstein, G., Sciences of the USA, 111(2), 646–51. Just, M.A. (2013). Identifying emotions on the basis of neural Nummenmaa, L., Saarima ¨ ki, H., Glerean, E., Gotsopoulos, A., Hari, R., Sams, M. (2014b). Emotional speech synchronizes activation. PLoS One, 8(6), e66032. Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 482 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 and prefrontal cortex on recognizing facial expressions of brains across listeners and engages large-scale dynamic brain networks. Neuroimage, 102, 498–509. complex emotions. Journal of Cognitive Neuroscience, 17(9), Nummenmaa, L., Saarima ¨ ki, H. (2017). Emotions as discrete pat- 1410–9. terns of systemic activity. Neuroscience Letters, doi: 10.1016/ Simon-Thomas, E.R., Godzik, J., Castle, E., et al. (2012). An j.neulet.2017.07.012. fMRI study of caring vs self-focus during induced compassion Panksepp, J. (1982). Toward a general psychobiological theory of and pride. Social Cognitive and Affective Neuroscience, 7(6), emotions. Behavioral and Brain Science, 5(03), 407–22. 635–48. Panksepp, J., Watt, D. (2011). What is basic about basic emotions? Smith, S.M. (2002). Fast robust automated brain extraction. Lasting lessons from affective neuroscience. Emotion Review, Human Brain Mapping, 17(3), 143–55. 3(4), 387–96. Takahashi, H., Matsuura, M., Koeda, M., et al. (2008). Brain activa- Peelen, M.V., Atkinson, A.P., Vuilleumier, P. (2010). Supramodal tions during judgments of positive self-conscious emotion and representations of perceived emotions in the human brain. positive basic emotion: pride and joy. Cerebral Cortex, 18(4), Journal of Neuroscience, 30(30), 10127–34. 898–903. Pichon, S., de Gelder, B., Grezes, J. (2008). Emotional modulation Toivonen, R., Kivela ¨ , M., Sarama ¨ ki, J., Viinikainen, M., Vanhatalo, of visual and motor areas by dynamic body expressions of M., Sams, M. (2012). Networks of emotion concepts. PLoS One, anger. Society for Neuroscience, 3(3–4), 199–212. 7(1), e28883. Poldrack, R.A., Wagner, A.D., Prull, M.W., Desmond, J.E., Glover, Trost, W., Ethofer, T., Zentner, M., Vuilleumier, P. (2012). G.H., Gabrieli, J.D. (1999). Functional specialization for seman- Mapping aesthetic musical emotions in the brain. Cerebral tic and phonological processing in the left inferior prefrontal Cortex, 22(12), 2769–83. cortex. Neuroimage, 10(1), 15–35. Vuilleumier, P., Trost, W. (2015). Music and emotions: from en- Polyn, S.M., Natu, V.S., Cohen, J.D., Norman, K.A. (2005). chantment to entrainment. Annals of the New York Academy of Category-specific cortical activity precedes retrieval during Sciences, 1337, 212–22. memory search. Science, 310(5756), 1963–6. Vytal, K., Hamann, S. (2010). Neuroimaging support for discrete Reyes-Vargas, M., Sa ´ nchez-Gutie ´ rrez, M., Fuginer, L., et al. (2013). neural correlates of basic emotions: a voxel-based meta-ana- Hierarchical clustering and classification of emotions in lysis. Journal of Cognitive Neuroscience, 22(12), 2864–85. human speech using confusion matrices. In: International Wagner, U., N’Diaye, K., Ethofer, T., Vuilleumier, P. (2011). Conference on Speech and Computer. Springer International Guilt-specific processing in the prefrontal cortex. Cerebral Publishing. 162–9. Cortex, 21(11), 2461–70. Rosvall, M., Bergstrom, C.T. (2010). Mapping change in large net- Wang, S., Tudusciuc, O., Mamelak, A.N., Ross, I.B., Adolphs, R., works. PLoS One, 5(1), e8694. Rutishauser, U. (2014). Neurons in the human amygdala select- Russell, J.A. (2003). Core affect and the psychological construc- ive for perceived emotion. Proceedings of the National Academy of tion of emotions. Psychological Review, 110(1), 145–72. Sciences of the USA, 111(30), E3110–9. Saarima ¨ ki, H., Gotsopoulos, A., Ja ¨a ¨ skela ¨ inen, I.P., et al. (2016). Wicker, B., Keysers, C., Plailly, J., Royet, J.P., Gallese, V., Discrete neural signatures of basic emotions. Cerebral Cortex, Rizzolatti, G. (2003). Both of us disgusted in my insula: the 26(6), 2563–73. common neural basis of seeing and feeling disgust. Neuron, Said, C.P., Moore, C.D., Engell, A.D., Todorov, A., Haxby, J.V., 40(3), 655–64. (2010). Distributed representations of dynamic facial expres- Zahn, R., Moll, J., Paiva, M., et al. (2009). The neural basis of sions in the superior temporal sulcus. Journal of Vision, 1, 11. human social values: evidence from functional MRI. Cerebral Schutter, D.J.L.G., van Honk, J. (2009). The cerebellum in emotion Cortex, 19(2), 276–83. regulation: a repetitive transcranial magnetic stimulation Zhang, Y., Brady, M., Smith, S. (2001). Segmentation of brain MR study. Cerebellum, 8(1), 28–34. images through a hidden Markov random field model and the Shaw, P., Bramham, J., Lawrence, E.J., Morris, R., Baron-Cohen, S., expectation-maximization algorithm. IEEE Transactions of the David, S. (2005). Differential effects of lesions of the amygdala Medical Imaging, 20(1), 45–57. Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Social Cognitive and Affective Neuroscience Oxford University Press

Distributed affective space represents multiple emotion categories across the human brain

Loading next page...
 
/lp/ou_press/distributed-affective-space-represents-multiple-emotion-categories-m3Q3Iwd3Yh

References (103)

Publisher
Oxford University Press
Copyright
© The Author(s) (2018). Published by Oxford University Press.
ISSN
1749-5016
eISSN
1749-5024
DOI
10.1093/scan/nsy018
Publisher site
See Article on Publisher Site

Abstract

The functional organization of human emotion systems as well as their neuroanatomical basis and segregation in the brain remains unresolved. Here, we used pattern classification and hierarchical clustering to characterize the organization of a wide array of emotion categories in the human brain. We induced 14 emotions (6 ‘basic’, e.g. fear and anger; and 8 ‘non- basic’, e.g. shame and gratitude) and a neutral state using guided mental imagery while participants’ brain activity was measured with functional magnetic resonance imaging (fMRI). Twelve out of 14 emotions could be reliably classified from the haemodynamic signals. All emotions engaged a multitude of brain areas, primarily in midline cortices including anterior and posterior cingulate gyri and precuneus, in subcortical regions, and in motor regions including cerebellum and premotor cortex. Similarity of subjective emotional experiences was associated with similarity of the corresponding neural activation patterns. We conclude that different basic and non-basic emotions have distinguishable neural bases characterized by specific, distributed activation patterns in widespread cortical and subcortical circuits. Regionally differentiated engagement of these circuits defines the unique neural activity pattern and the corresponding subjective feeling associated with each emotion. Key words: emotion; fMRI; MVPA; pattern classification Introduction Adolphs, 2017; Barrett, 2017). This discussion revolves around the The organization of human emotion systems is currently a topic number of distinct emotion systems and the organization of of lively debate (Hamann, 2012; Lindquist et al., 2012; Kragel and human emotion circuits in the brain. Most research on specific LaBar, 2014; Meaux and Vuilleumier, 2015; Saarima ¨ki et al., 2016; emotion categories has focused on ‘primary’ or ‘basic’ emotions Received: 8 June 2017; Revised: 25 February 2018; Accepted: 28 March 2018 V C The Author(s) (2018). Published by Oxford University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/ licenses/by-nc/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited. For commercial re-use, please contact journals.permissions@oup.com Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 472 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 (usually anger, fear, disgust, happiness, sadness and surprise). unclear whether similar, distinct circuits as previously observed According to the basic emotion theories, emotions have been for basic emotions would also support these types of emotions. shaped during the evolution to serve distinct survival functions Here, we investigated the neural underpinnings of multiple via distinct neural circuits and physiological systems (Panksepp, basic and non-basic emotion categories. We induced 14 emo- 1982; Ekman, 1992, 1999; Damasio, 1999). In turn, constructivist tions in participants by guided mental imagery while their brain theories consider specific emotions emerging from the interaction activity was measured with fMRI. First, to examine whether dif- of fundamental processes shared across psychological domains ferent emotions have distinguishable brain bases, we employed and, thus, not unique to emotions alone (see, e.g. Russell, 2003; pattern classification. Second, using hierarchical clustering, we Lewis and Liu, 2011; Hamann, 2012; Cunningham, 2013). Basic examined the similarity of neural substrates of different emo- emotion theories emphasize similarities in emotion mechanisms tions and tested how this similarity was related to how simi- across individuals, while constructivist theories emphasize inter- larly these emotions are experienced. Third, we mapped and within-individual variability. Several human neuroimaging representation of different emotions in the core emotion pro- studies support the view that at least the canonical basic emo- cessing regions of the brain, by characterizing the emotion- tions have distinguishable neural bases, as they are associated dependent neural response profiles using univariate analyses with discernible neural activity patterns as measured by BOLD- and cumulative activation mapping. functional magnetic resonance imaging (fMRI) (e.g. Kragel and LaBar 2014; Saarima ¨ki et al., 2016; for a review, see Kragel and LaBar, 2016; Nummenmaa & Saarima ¨ ki, 2017). Materials and methods Yet, a wide array of other emotions, including ‘secondary’ or Participants ‘social’ emotions (see reviews and proposed taxonomies in Damasio, 1999; Adolphs, 2002a), also serve adaptive survival Twenty-five female volunteers (aged 19–38, mean age functions and are characterized by distinctive facial expressions 23.6 years) participated in the experiment. All were right- (Baron-Cohen et al., 2001; Shaw et al., 2005), bodily sensations handed, neurologically healthy and with normal or corrected- (Nummenmaa et al., 2014a), and neural activity patterns (Kassam to-normal vision, and gave written informed consent according et al. 2013; Kragel and LaBar, 2015). Nevertheless, the psycho- to the Declaration of Helsinki. The Institutional Review Board of logical and neural mechanisms of these non-basic emotions, as Aalto University approved the experimental protocol. Female well as their commonalities or differences relative to basic emo- participants were chosen to maximize the power of the experi- tions, remain largely unresolved (Ekman, 1999; Ekman and ment, as when compared to males, females typically experience Cordaro, 2011; Adolphs, 2002b). These emotions may involve and portray more intensive emotional reactions (see review in more elaborate cognitive representations acquired through ex- Fischer and LaFrance, 2015), and show greater brain responses perience, education, and social norms (Panksepp and Watt, 2011), during emotions (e.g., Hofer et al., 2006) and stronger facial mim- and hence, recruit brain systems partly distinct from those impli- icry as indexed by EMG (Grossman and Wood, 1993). cated in more ‘primitive’ and possibly innate basic emotions. It is thus possible that also non-basic emotions may have distinguish- Stimuli able neural bases which would, however, be discernible from that of basic emotions. Accordingly, the specific number of basic emo- The stimuli for the guided affective imagery were sixty 5–20-s tions has been debated and the distinction between basic and long narratives describing an antecedent event triggering prom- non-basic emotions has been questioned (e.g. Russell, 2003). For inently one emotional state. Narrative-based guided imagery is example, Adolphs (2002b) proposed that affective processes could known to be an efficient emotion induction technique that en- be understood at various taxonomies depending on the level of gages reliably the brain’s emotion circuits (Costa et al., 2010; analysis. The basic emotion conceptualization would be particu- Nummenmaa et al., 2014b) and results in strong subjective emo- larly relevant for primary-process core affects governed by phylo- tional feelings. Each narrative elicited primarily one out of pos- genetically old emotion circuits especially within the subcortical sible 14 emotions, or a neutral emotional state. Targeted structures (Panksepp and Watt, 2011). emotions included six basic or primary emotions (anger, fear, A set of core emotion processing regions is consistently disgust, happiness, sadness and surprise) and eight social or engaged during multiple emotions. These include cortical mid- secondary emotions (shame, pride, longing, guilt, love, con- line regions (Peelen et al., 2010; Chikazoe et al., 2014; Trost et al., tempt, gratitude and despair). The narratives included a short 2012), somatomotor regions (Adolphs et al., 2000; de Gelder et al., description of a situation that participants were instructed to 2004; Nummenmaa et al., 2008, 2012; Pichon et al., 2008), as well imagine would happen to them, for instance, ‘It is a late night on as amygdala, brainstem and thalamus (Adolphs, 2010; Damasio a dimly-lit parking lot. Your car stands alone in a dark corner. and Carvalho, 2013; Kragel and LaBar, 2014). These regions serve Suddenly you hear a gun shot behind you.’ (fear), or ‘Your lover is as candidate areas containing distinct neural signatures for dif- lying next to you on a bed. You look into his eyes when he gently ferent basic emotions (Peelen et al., 2010; Saarima ¨ki et al., 2016). touches your hair and bends to kiss your lips.’ (love). Based on an Yet, it is currently unclear whether these regions also code for online pilot experiment (see Supplementary Material), we se- other, non-basic emotions. Prior studies using univariate ana- lected four narratives per category (total of 60 narratives; see lyses have quantified neural responses to emotions such as re- Supplementary Table S1 for all narrative stimuli) to be included gret (Coricelli et al., 2005; Eryilmaz et al., 2011), guilt (Wagner in the fMRI experiment. et al., 2011), pride (Takahashi et al., 2008; Zahn et al., 2009; The selected narratives were spoken by a female speaker Simon-Thomas et al., 2012), rejoice (Chandrasekhar et al., 2008) using neutral prosody without cues for the affective content of and maternal love (Bartels and Zeki, 2004), as well as aesthetic the narrative. The background noise in the recording room was feelings such as wonder or nostalgia (Vuilleumier and Trost, recorded and equalized (bandpass filter 50–10 000 Hz) with 2015). Yet, these studies (except Kassam et al., 2013; Kragel and Apple Logic Pro 9 (Apple Inc.), and gate and compressor were LaBar 2014) have usually compared brain activation differences used to attenuate the background noise during moments of si- between two emotions in a univariate fashion. Thus, it remains lence and slightly compress the voice dynamic to limit the Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 H. Saarima ¨ki et al. | 473 variance of the sound power. The loudness of each narrative et al., 2002) and non-brain matter was removed using BET was normalized according to the peak value. (Smith, 2002). High-pass temporal filtering was applied using The recorded narratives were divided into 4 runs of 15 narra- Gaussian-weighted least-squares straight line fitting with sigma tives, with one narrative per category in each run. The runs of 100 s. Participant-wise gray matter masks were generated by contained the same narratives for all participants but the pres- segmenting the T1-weighted images into gray and white matter entation order was randomized within each run and for each and cerebrospinal fluid using the FAST segmentation tool participant. During fMRI, the four sets of recorded narratives (Zhang et al., 2001) and transforming the masks to the native were all presented twice thus resulting in altogether eight runs. space using FLIRT (Jenkinson et al., 2002) with nine degrees of Each run lasted 7–8 min and consisted of 15 trials. A trial started freedom. The gray matter maps were subsequently thresholded with a fixation cross shown for 0.5 s, followed by a 2-s presenta- using intensity threshold> 0.5 to create participant-specific tion of a word describing the target emotion (anger, fear, disgust masks. This threshold was chosen to include those voxels with and so forth) to prepare participants for the forthcoming emo- a higher probability of belonging to the gray matter and, subse- tional imagery task and thus to make the induction more quently, the masks were visually inspected to make sure most powerful. Next, the narrative was spoken out, followed by a 10-s gray matter was included. On average, the gray matter mask imagery period. The trial ended with a 10-s wash-out period to included 16 000 voxels. counter for possible carryover effects. Participants were in- For univariate general linear model (GLM) analysis, the pre- structed to try to get involved in the narratives by imagining the processed functional data were registered to 2-mm Montreal described events as happening to themselves and to experience Neurological Institute (MNI) 152 standard space template using the corresponding feeling as vividly as possible. FLIRT (Jenkinson et al. 2002) and 9 degrees of freedom, and Auditory stimuli were delivered through Sensimetrics S14 smoothed using a Gaussian kernel with FMWH 8. insert earphones (Sensimetrics Corporation, Malden, MA, USA). Sound was adjusted for each participant to be loud enough to be Multivariate pattern classification within participants heard over the scanner noise. The visual stimuli were delivered using Presentation software (Neurobehavioral Systems Inc., The classification of emotion categories was performed using Albany, CA, USA), and they were back-projected on a semitrans- the Princeton multi-voxel pattern analysis (MVPA) toolbox parent screen using a 3-micromirror data projector (Christie X3, (https://pni.princeton.edu/pni-software-tools/mvpa-toolbox) Christie Digital Systems Ltd., Mo ¨ nchengladbach, Germany), and in Matlab 2012b using each participant’s data in native space. from there via a mirror to the participant. A separate classifier was trained for each participant and, after After the scanning, participants were presented with the nar- all steps, the classification results were averaged across the par- ratives again. For each narrative, the participants rated the felt ticipants. After preprocessing, voxels outside gray matter were intensity of each of the 14 emotions (plus neutral state) using a masked out using the participant-specific gray matter masks continuos scale arbitrarily ranging from 0 to 1. (for details, see and the functional data from each run were standardized to Supplementary Material). We ran k-means clustering on the re- have a mean of zero and variance of one. Next, each partici- sulting intensity profiles (i.e. intensity ratings per category for pant’s data were divided into training (N1 runs) and testing each narrative) to test whether the 15 target categories could be sets (the remaining run). Feature selection was performed using identified from the emotion-wise intensity ratings, thus reveal- one-way ANOVA (testing for the main effect of emotion cat- ing whether or not the narratives elicited distinct categorical egory) for the training set to select the voxels with a significant emotions. Also, to test the similarity of narratives belonging to (P< 0.05) main effect for emotion, i.e. to select the voxels whose the same category, we calculated the Euclidean distance be- mean activation differed between at least some of the 15 pos- tween the intensity profiles of each pair of narratives. sible emotion conditions. The feature selection preserved on Finally, participants rated the similarity of all possible pairs of average (across cross-validation folds and participants) 41% of the emotions induced during the experiment (a total of 105 pairs) the voxels. Hemodynamic lag was corrected for by convolving using a direct comparison method. The participants were shown the boxcar category regressors with the canonical double one pair of emotion words at the time and asked to rate how sub- gamma HRF function and thresholding the convolved regres- jectively similar they experience the emotions (ranging from no sors using a sigmoid function to return the regressors to the bin- similarity [0] to full similarity [1]). Based on the ratings, we ex- ary form. The classification was performed on all the tracted the recollected experiential similarity matrix for each par- standardized, HRF-convolved fMRI volumes from the 10 s im- ticipant individually and averaged these across all participants. agery period following the narrative (treating all single time points per category as samples for that category; median 6 vol- umes per one emotion category in one run) to extract only emo- MRI data acquisition and preprocessing tion-related brain activity, and to minimize activity related to MRI data were collected on a 3-T Siemens Magnetom Skyra scan- the acoustic and semantic features of the stimuli. Thus, each of ner at the Advanced Magnetic Imaging center, Aalto University, the eight runs included on average 90 TRs (5–6 TRs per category) using a 20-channel Siemens volume coil. Whole-brain functional that were used in the classification. A linear neural network scans were collected using a whole brain T2*-weighted EPI se- classifier without hidden layers was trained to recognize the quence with the following parameters: 33 axial slices, TR¼ 1.7 s, correct emotion category out of 15 possible ones (multiclass TE¼ 24 ms, flip angle¼ 70 , voxel size¼ 3.13.14.0 mm , matrix classification, see Polyn et al. 2005 for details). Naı ¨ve chance size¼ 646433, FOV¼ 256256 mm. A custom-modified bipolar level, derived as a ratio of 1 over the number of categories, was water excitation radio frequency (RF) pulse was used to avoid 6.7%. A leave-one-run-out cross-validation was performed, signal from fat. High-resolution anatomical images with iso- thus, dividing the data into all possible N1 run combinations tropic 111mm voxel size were collected using a T1-weighted and repeating the classification pipeline for each such cross- MP-RAGE sequence. validation fold, and the participant-wise classification accuracy Data were preprocessed using FSL 5.0 (Jenkinson et al., 2012). was calculated as an average percentage of correct guesses Motion correction was performed using MCFLIRT (Jenkinson across all the cross-validation folds. To test whether Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 474 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 classification accuracy exceeded chance level, we used permu- each of the 14 emotions against the neutral baseline. First level tation tests to simulate the probability distribution of the classi- analysis was performed in SPM 12 (wwwl.fil.ion.ucl.ac.uk/spm/) fication by shuffling the category labels of the training set to obtain individual contrast maps and second level analysis (across training set runs) and re-running the whole classifica- was then performed with FSL randomize with the threshold free tion pipeline, repeated 1000 times for each participant. FDR cor- cluster enhancement option as it implements the least biased rection at P < 0.05 was used for multiple comparisons. way to control for multiple comparisons (Eklund et al. 2016; N¼ 5000 permutations). Emotion-wise t maps were then quali- tatively summarized across emotions as a cumulative map Hierarchical clustering where each voxel shows the number of statistically significant emotions, at the cluster corrected level of P< 0.05. We next investigated the similarities between emotions using the category confusions from the whole-brain classification. Clustering was performed for exploratory purposes to charac- Visualization of emotion clusters in the brain terize the similarities in neural and recollected experiential Finally, to summarize and visualize where emotion-wise activa- data, this however does not provide statistical evidence for the tion patterns were located, we mapped the three principal clusters similarity structures. Nevertheless, similarity between neural obtained with hierarchical clustering on the cortical and subcor- and experiential data would suggest similar representational tical maps using R, G, B color space. The last cluster containing sur- segregation in the neural code and subjective feelings. From the prise and neutral was plotted separately (see Supplementary group-averaged confusion matrix, we calculated a distance ma- Figure S6) given it contained neutral-like states only. For each emo- trix by taking the category confusion vectors for each pair of tion, we took the unthresholded second level t maps obtained emotions and by calculating the Euclidean distance between from the GLM analysis, summed them for emotions belonging to these vectors (see Reyes-Vargas et al., 2013). We then employed the same cluster, and assigned the summed values to the corres- hierarchical cluster analysis in Matlab to investigate how differ- ponding R, G, B channels. The color channels were subsequently ent emotions cluster together based on their neural similarities. visualized in MNI space. Consequently, the RGB color at each voxel The agglomerative hierarchical cluster tree was calculated on reflects the cluster distribution of that voxel, and can be used for the distance matrix using linkage function with ‘complete’ option localizing brain regions contributing to different emotions. (i.e. the furthest distance method). Finally, we constructed the clusters from the cluster tree (cluster function) and chose the solution that minimized the number of categories while keep- Results ing at least two emotions per category. Note that the clustering Behavioral results was selected solely for data visualization rather than for statis- tical inference. To visualize the similarities in subjective and Behavioral ratings showed that the narratives successfully eli- neural organization of emotions, we extracted the clusters in cited reliable and strong target emotions (intensity profiles in both neural and recollected experiential data, and subsequently Figure 1A; mean intensities per target category: pride 0.80, long- plotted the cluster solutions using alluvial diagrams (Rosvall ing for 0.77, happiness 0.84, gratitude 0.84, love 0.90, surprise 0.76, and Bergstrom, 2010; www.mapequation.org). neutral 0.88, disgust 0.85, sadness 0.96, fear 0.89, shame 0.75, We then investigated to which extent the neural similarities anger 0.68, guilt 0.91, contempt 0.66, despair 0.87). In k-means between different emotional states correspond to their recol- clustering, the accuracy to assign a narrative to the correct target lected experiential (subjectively felt) differences. Experiential category based on its intensity profile was 97% (against the similarity matrices were calculated based on pairwise similarity chance level 6.7%). Also, the narratives within each category had ratings of emotions and averaged over the participants. highly similar intensity profiles (Figure 1B); i.e. narratives belong- Subsequently, the mean neural and experiential similarity ing to the same category elicited similar emotions. matrices were correlated using Spearman’s rank correlation co- efficient. The P level for the Spearman test was obtained with a Classification of basic and non-basic emotions permutation test by shuffling the neural matrix and re- calculating the correlation for 5000 times. Mean classification accuracy across the 14 emotions and the Finally, we tested whether basic and non-basic emotions neutral state was 17% (against naı ¨ve chance level of 6.7%; 95th generally differ in how categorical their experience is recog- percentile of the permuted classification accuracy distribution nized. To do so, we calculated prototypical experience scores for was 8.4%). After correcting for multiple comparisons, the classi- each emotion, defined as the sum of off-diagonal elements in fication performance was above a permutation-based signifi- the experiential and neural similarity matrices separately. This cance level for all emotions except shame and longing (P< 0.05, analysis was based on the assumption that the off-diagonal Figure 2; see Supplementary Table S2 for effect sizes and elements represent confusions across emotions, thus indicating Supplementary Figure S1 for a confusion matrix). On average, a lack of sharp categorical representation within a group of basic emotions (anger, disgust, fear, happiness, sadness, sur- emotions. In other words, we tested whether emotion confu- prise) could be classified more accurately than the non-basic sions were systematically different for basic and non-basic emotions (26% vs 15%, respectively, t(24) ¼ 7.39, P< 0.0001, types, in both subjective and neural data. The resulting ‘prototy- Cohen’s h¼ 0.16), while we found no significant differences in picality’ scores (i.e. off-diagonal scores) for basic and non-basic classifier accuracies between positive and negative emotions emotions were then compared using Mann–Whitney U test. (17% vs 14%, respectively, t(24) ¼ 1.52, P¼ 0.075). We also trained separate classifiers for each a priori selected region-of-interest (see Supplementary Methods) which showed that classification Regional effects in GLM accuracies were above chance level in frontal ROIs, especially in To investigate the overall effect of any emotion on brain activ- frontal pole, and in somatomotor ROIs, especially for pre- and ity, we first ran GLM to compare all emotions together against post-central gyri, yet did not exceed that of the whole-brain the neutral baseline, and then ran separate GLMs to compare classification (Supplementary Figure S2). Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 H. Saarima ¨ki et al. | 475 We also ran separate classifiers across positively and nega- tively valenced emotion categories to test whether classification within a similarly valenced superordinate category is compar- able to that of classification with all categories. Average classi- fier accuracy for positive emotions was 30.5% (naı ¨ve chance level 20%) and all emotions could be classified with above- chance level accuracy: love (36.9%), happiness (31.5%), gratitude (30.8%), pride (28.5%), and longing (24.9%). Average classifier ac- curacy for negative emotions was 22.4% (naı ¨ve chance level 12.5%) and above chance level accuracy was observed for each emotion: sadness (28.8%), disgust (27.8%), despair (25.7%), fear (25.1%), anger (20.5%), contempt (19.0%), and shame (15.9%). Similarity in neural basis corresponds to experienced similarity Organization of recollected experiential similarity matrices derived from behavioral ratings was significantly associated with the neural similarity matrices derived from confusion matrices from whole-brain classification (r¼ 0.37, P¼ 0.0048; Figure 3A). Clustering of confusion matrices divided the emo- tions into four clusters (Figure 3A): (1) happiness, pride, grati- tude, love, and longing; (2) surprise and neutral; (3) disgust, sadness, fear, and shame; and (4) anger, contempt, guilt, and despair, which mostly corresponded to the four main clusters of experiential similarities (Figure 3B): (1) happiness, pride, grati- tude, and love; (2) surprise and neutral; (3) longing and sadness; and (4) disgust, anger, contempt, shame, guilt, fear, and despair. To test whether basic and non-basic emotions generally dif- fer in how prototypical their experience is, we calculated a prototypical experience score for each emotion (i.e. rate of con- fusions with other emotions) and then compared the average prototypicality scores for basic and non-basic emotions. There were no differences between basic and non-basic in either the neural or experiential data. Fig. 1. The stimuli consisted of 60 brief (5–20 s) narratives that induced 14 emo- tional states and a neutral state. (A) Participants rated on a scale from 0 to 1 how strongly each emotion was elicited by the narrative. The coloring indexes the Affective space in the brain mean intensity for experiencing each emotion for each narrative. (B) Based on the emotion intensity ratings, we calculated the similarity of emotional experi- To investigate the brain regions generally activated and deacti- ence between narratives by using Euclidean distances. vated by our emotion stimuli, we contrasted the brain activity Fig. 2. Means and standard errors for emotion-wise whole-brain classification accuracy. Dashed line represents chance level (6.7%). Colors reflect the clusters formed on the basis of experienced similarity of emotions (see Figure 3). Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 476 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 Fig. 3. (A) Left: Neural similarity matrix extracted from the classifier confusion matrix. The similarity matrix was created by calculating the Euclidean distance between each pair of emotions based on their category confusion vectors. Right: Experiential similarity matrix based on pairwise similarity ratings for emotions elicited by the narratives. (B) Alluvial diagram showing the similarity of hierarchical cluster structure of the experiential and neural similarities. Coloring of the emotion categories is based on the clusters in the neural similarity matrix. related to all emotions with the neutral condition (Figure 4 out- cumulative deactivation map (Figure 4B) showed that most lines; see also Supplementary Figure S4). The areas activated by emotions involved deactivation of auditory cortex, frontal areas emotion in general included pre-motor cortex, thalamus, insula including superior frontal gyri, right middle frontal gyrus, and and putamen. Deactivations were observed in visual and audi- left inferior frontal gyrus and parietal areas including supramar- tory cortices, precuneus, PCC, right anterior PFC and right lateral ginal gyrus. parietal areas. Finally, to visualize where specific emotions are encoded in To reveal the brain regions contributing most consistently to the brain, we mapped the clusters resulting from the hierarch- different emotions, we constructed cumulative activation and ical clustering on cortical and subcortical surfaces (Figure 5). deactivation maps of emotion-driven hemodynamic responses All emotions activated areas in the visual cortex, ACC, right (Figure 4; corresponding effect size maps in Supplementary temporal pole, supplementary motor area and subcortical re- Figure S4; the emotion-wise statistical t maps are available in gions. In addition, positive emotions belonging to Cluster 1 http://neurovault.org/collections/TWZSVODU). These maps re- (happiness, pride, gratitude, love, longing) were more promin- veal how many of the 14 possible emotional states activated ent in anterior frontal areas including vmPFC. Negative basic each voxel. The cumulative activation map (Figure 4A) showed emotions from Cluster 2 (disgust, sadness, fear, shame) acti- that most emotions involved activation of midline regions vated especially insula, supplementary motor area and specific including anterior cingulate cortex (ACC) and precuneus, as well parts of subcortical structures. Negative social emotions belong- as subcortical regions including brain stem and hippocampus, ing to Cluster 3 (anger, contempt, guilt, despair; see motor areas including cerebellum, and visual cortex. The Supplementary Table S1 for the story stimuli targeting different Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 H. Saarima ¨ki et al. | 477 Fig. 4. (A) Cumulative activation map showing the cumulative sum of binarized t maps (P< 0.05, cluster-corrected) across each emotion vs neutral condition. Outline shows the GLM results for all emotions contrasted against the neutral condition (P< 0.05, cluster-corrected). (B) Cumulative deactivation map showing the cumulative sum of binarized t maps (P< 0.05, cluster-corrected) across neutral vs each emotion. Outline shows the GLM results for the neutral condition contrasted against all emotions (P< 0.05, cluster-corrected). emotions) were most prominent in left insula and the adjacent evidenced by the correspondence between neural and recol- frontal areas. Finally, surprise (Cluster 4, Supplementary Figure lected experiential similarity between emotions. S6) activated especially parts of auditory cortex, supplementary motor areas, and left insula. Different emotions are characterized by distinct neural signatures Discussion Altogether 12 emotions (excluding longing for and shame) out Our results reveal that multiple emotion states have distinct of the 14 included in the study could be reliably classified from and distributed neural bases, as evidenced by the above chance- the fMRI signals. Our results extend previous studies, which level classifier performance for all emotions, except longing and have shown classification of specific emotional states usually shame. Together with the spatial location of emotion- focusing on the basic emotions only or a subset of these dependent brain activation, this suggests that a multitude of (Ethofer et al., 2009; Peelen et al., 2010; Said et al., 2010; Kotz et al., different emotions are represented in the brain in a distinguish- 2013; for a review, see Kragel and LaBar, 2014). While the ‘clas- able manner, yet in partly overlapping regions: each emotion sic’ basic emotions have attracted most attention in psycho- state likely modulates different functional systems of the brain logical and neurophysiological studies, they constitute only a differently, as shown by distinct patterns measured with BOLD– small portion of the emotions humans universally experience fMRI, and the overall configuration of the regional activation (Edelstein and Shaver, 2007). Furthermore, accumulating behav- patterns defines the resulting emotion. While a set of ‘core’ ioral evidence suggests that other, non-basic emotions are also emotion processing areas in cortical midline regions, motor characterized by distinctive features in facial expressions areas, sensory areas, and subcortical regions are engaged during (Baron-Cohen et al., 2001; Shaw et al., 2005), bodily changes practically all emotions, the relative engagement of these areas (Nummenmaa et al., 2014a), and physiology (Kreibig, 2010; varies between emotions. This unique neural signature of each Kragel and LaBar, 2013). The present data corroborate these emotion might relate to the corresponding subjective feeling, as findings by showing that also emotions not considered as ‘basic’ Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 478 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 Fig. 5. Activation maps showing the summed uncorrected t maps for each cluster obtained from the hierarchical clustering analysis in (A) cortical regions and (B) sub- cortical regions. Colors represent the three clusters: positive (red), negative basic (green), and negative social (blue) emotions. may each have distinctive brain activation patterns. The local readily reveal the actual neural organization of each emotion brain activity patterns underlying different emotions are most system, as the pattern classification only tells us that, on aver- probably to some extend variable across participants and reflect age and at the level measurable with BOLD–fMRI, the cortical/ individual responses, as brains are always intrinsically shaped neuronal pattern underlying each category differ enough to be by individual development and experiences. separated, whereas localizing the actual source of differences is If we consider discrete emotion systems as wide-spread, dis- more difficult. Therefore, we have complemented the pattern tinct neural activation distinct to each emotion state, successful classification analysis with visualization of different emotion pattern classification of brain states across emotions would pro- categories using GLM and clustering. Furthermore, the pattern vide support for separate neural systems for each emotion. In recognition techniques employed in this study cannot provide turn, constructivist emotion theories suggest that all emotions causal evidence for the existence of basic emotion systems: are generated by a shared set of fundamental functional sys- even if we find emotion-specific activation patterns for specific tems which are however not specific to emotional processing emotions, it does not prove that these patterns are strictly ne- per se (Kober et al., 2008; Lindquist et al., 2012). The present data cessary for the corresponding emotional state (Nummenmaa show that different emotions are associated with granular acti- and Saarima ¨ ki, 2017). The causality issues can be resolved, for vation changes across multiple functional systems, and their instance, in lesion studies or with brain stimulation techniques. spatially distributed configuration ultimately defines the spe- Moreover, the current analyses do not directly answer whether cific emotion at both psychological and behavioral levels the data are better described by discrete versus dimensional (Meaux and Vuilleumier, 2015). For instance, two emotions models of emotion. might share their somatosensory representations, but underly- If basic emotions were somehow ‘special’ or different from ing interoceptive representations could be different. Thus, the non-basic emotions at the neural level, we should observe (1) general configuration of the central and peripheral nervous sys- distinct neural activation patterns for basic emotions but not tem leads to distinct emotion states. for non-basic emotions, or (2) different (or perhaps additional) However, we stress that the current data cannot resolve neural systems underlying basic and non-basic emotions. Our whether these functional signatures of distinct emotions would classification results and cumulative maps show that both basic be necessary for each emotion, or whether the presently and non-basic emotions could be classified accurately, and they observed structure of the emotions is optimal, as the classifica- elicited activation in largely overlapping brain areas. On aver- tion solution is contingent on the employed a priori category age, classification accuracies were higher for basic emotions labels. Further, it must be noted classification analysis does not than for non-basic emotions. This suggests that while there Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 H. Saarima ¨ki et al. | 479 exists partially exclusive neural codes for a multitude of emo- et al., 2014), participate in self-relevant and introspective pro- tion states, as those investigated here, the canonical basic emo- cessing (Northoff and Bermpohl, 2004; Northoff et al., 2006), and tions are more discrete than the complex social emotions integrate information of internal, mental, and bodily states included in this study, suggesting that there are both universal (Northoff et al., 2006; Buckner and Carroll, 2007). We also found and experience-dependent components to emotions. Also, clas- consistent emotion-dependent activity in the brainstem, sification accuracies within single regions showed that only including periaqueductal grey, pons and medulla, for almost all basic emotions could be distinguished in some areas, including emotions. This activation might reflect the control of autonomic somatomotor regions (insula and supplementary motor area), nervous system’s reactions to different emotions (Critchley midline areas (PCC and ACC), and inferior frontal gyrus et al., 2005; Linnman et al., 2012) and/or covert activation of par- (Supplementary Figure S3). Another potential explanation for ticular motor programs (Blakemore et al., 2016). Also, subcortical the differences in classification accuracies between basic and regions including amygdala and thalamus showed distinct ac- non-basic emotions is that maybe there is a clearer cultural tivity patterns that differed between clusters. Both of these re- understanding or prototypical experience related with basic gions are related to salience processing and emotional arousal emotions, manifested as clearer patterns underlying these emo- (Anders et al., 2004; Adolphs, 2010; Damasio and Carvalho, 2013; tions. To test this hypothesis, we calculated prototypical experi- Kragel and LaBar, 2014) and show specific activation patterns ence scores for each emotion, defined as the sum of off- for basic emotions (Wang et al., 2014), findings that we now ex- diagonal elements in recollected experiential and neural tent also to non-basic emotions. similarity matrices separately, and compared the average proto- Second, somatomotor areas including premotor cortex, cere- typicality scores for basic and non-basic emotions. This allowed bellum (including vermis and the anterior lobe), globus pallidus, us to quantify the specificity vs confusion of recognition for dif- caudate nucleus, and posterior insula were activated during ferent emotion categories. There were no differences between most emotions, but according to the cluster visualizations espe- basic and non-basic in either the neural or recollected experien- cially during the processing of emotions that have a strong im- tial data, suggesting that both basic and non-basic emotions pact on action tendencies and avoidance-oriented behaviors have equally distinct neural and experiential underpinnings. (fear, disgust, sadness, shame, surprise; Frijda et al. 1989). These areas are engaged during emotion perception (Nummenmaa et al., 2008, 2012; Pichon et al., 2008), emotion regulation Correspondence between neural and phenomenological (Schutter and van Honk, 2009), and somatomotor processing organization of emotions and action preparation related to emotional processing (Kohler et al., 2002; Wicker et al., 2003; Mazzola et al., 2013). An explorative hierarchical clustering of emotion-specific neu- Third, anterior prefrontal cortex was activated especially ral patterns identified four clusters in the neural data that cor- during positive emotions (happiness, love, pride, gratitude, respond to positive emotions (pride, longing, happiness, longing) according with previous research linking anterior pre- gratitude, love), negative basic emotions (disgust, sadness, fear, frontal cortex with positive affect (Bartels and Zeki, 2004; Zahn shame) and negative social emotions (anger, guilt, contempt, et al., 2009; Vytal and Hamann, 2010). Fourth, negative emotions despair) and surprise (Figure 3). Clustering is an exploratory and such as guilt, contempt, anger, and despair clustered together, descriptive technique that does not allow strong inferences potentially reflecting their social dimension and interpersonal about underlying causal structure. Rather, clustering of neural aspects or their self-conscious nature. Especially, left hemi- similarities was used to reduce the dimensionality of the data sphere activation in orbitofrontal cortex connected to rewards for visualization purposes. Comparison of subjectively experi- and punishments (Kringelbach and Rolls, 2004), as well as in in- enced similarity of emotions and the similarity of the neural ferior frontal cortex and dorsolateral prefrontal cortex, which patterns suggested a direct link between the whole-brain neural subserve language and executive control (Poldrack et al., 1999; signatures of emotions and the corresponding subjective feel- Kane and Engle, 2002), and in anterior insula linked to process- ings: the more similar neural signatures two emotions had, the ing of social emotions (Lamm and Singer, 2010) was activated more similar they were experienced. This accords with prior during these emotions. Fifth, surprise did not resemble any of work suggesting that emotion-specific neural activation pat- the other emotions included in this study, but was instead clos- terns might explain why each emotion feels subjectively differ- est to the neutral state. This is in line with previous research ent (Damasio et al., 2000; Saarima ¨ki et al., 2016). Emotions might showing that surprise tends to separate from other emotions in constitute distinct activity patterns in regions processing differ- subjective ratings (Toivonen et al., 2012). ent emotion-related information, such as somatosensory (bod- Finally, we also found decreased activation in auditory areas ily sensations), motor (actions), as well as brainstem and and increased activation in visual areas during the imagery of thalamocortical loops (physiological arousal). Activation from all emotion categories, likely reflecting the offset of the auditory these areas is then integrated in the cortical midline, such inte- stimulation followed by mental imagery of the emotion- gration then giving rise to the interpretation of the subjective evoking situation (Ganis et al., 2004) and emotion-related modu- feeling (Northoff and Bermpohl, 2004; Northoff et al., 2006). lation of this activity (Holmes and Mathews, 2005; Thus, a subjective feeling of a specific emotion stems from the Nummenmaa et al., 2012; Kassam et al., 2013). net activation of different sub-processes, rather than solely on the basis of any single component of emotion. Limitations Organization of the affective space in the brain Despite the careful selection of the emotional stimuli and k- Our hierarchical clustering analysis and cumulative mapping means clustering suggesting clear categorical structure in the reveal how different patterns of activity may give rise to differ- evoked affect (see Figure 1), it is possible that the narratives did ent emotions. First, midline regions including ACC, PCC, and not fully capture the target emotion only, and might have eli- precuneus were activated during most emotions. These regions cited also a mixture of emotions. Yet these may (1) arise in dif- might code emotional valence (Colibazzi et al., 2010; Chikazoe ferent time points during the narratives and (2) be not as strong Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 480 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 as the main target emotions (see Figure 1), thus average trial- References wise activations most likely pertains to the target emotion. Adolphs, R. (2002a). Recognizing emotion from facial expres- Despite this, the observed MVPA pattern may reflect whether sions: psychological and neurological mechanisms. Behavioral each narrative is dominated by one emotion, or at least show and Cognitive Neuroscience Reviews, 1, 21–62. the weighted influence of each emotion on a voxel activity. Adolphs, R. (2002b). Neural systems for recognizing emotion. Therefore, the successful classification per se shows that at least Current Opinion in Neurobiology, 12(2),169–77. the target emotions were successfully elicited, yet, better classi- Adolphs, R. (2010). What does the amygdala contribute to social fication accuracies could potentially be reached if stimuli could cognition?. Annals of the New York Academy of Sciences, 1191, target more selectively one category at the time. 42–61. Emotions were elicited using narrative-guided emotional Adolphs, R. (2017). How should neuroscience study emotions? by imagery. The narrative stimuli across different categories of distinguishing emotion states, concepts, and experiences. emotion differed along dimensions that are not of interest (for Social Cognitive and Affective Neuroscience, 12, 24–31. instance, different words or different kinds of imagery Adolphs, R., Damasio, H., Tranel, D., Cooper, G., Damasio, A.R. prompted), thus creating variation both within and between (2000). A role for somato-sensory cortices in the visual recogni- emotion categories, though the latter is unlikely to systematic- tion of emotion as revealed by three-dimensional lesion map- ally correlate with particular emotions. Moreover, any within- ping. Journal of Neuroscience, 20, 2683–90. category variation would probably work against the classifier by Anders, S., Lotze, M., Erb, M., Grodd, W., Birbaumer, N. (2004). lowering the classification accuracy. Despite this, we observed Brain activity underlying emotional valence and arousal: a re- above chance level accuracy for all emotion categories except sponse-related fMRI study. Human Brain Mapping, 23(4), 200–9. longing and shame. However, it is possible that there were dif- Baron-Cohen, S., Wheelwright, S., Hill, J., Raste, Y., Plumb, I. ferences between categories in, for instance, evoked emotional (2001). The “Reading the Mind in the Eyes” test revised version: imagery, that vary across emotion categories in some uncon- a study with normal adults, and adults with Asperger syn- trolled manner, thus potentially in part affecting the classifica- drome or high-functioning autism. Journal of Child Psychology tion accuracy. This may constitute a limitation of our study but and Psychiatry, 42(2), 241–51. is also inherently related to variations in scenarios and ap- Barrett, L.F. (2017). The theory of constructed emotion: an active praisal dimensions that constitute distinct emotion types. inference account of interoception and categorization. Social Cognitive and Affective Neuroscience, 12(11), 1833–23. Bartels, A., Zeki, S. (2004). The neural correlates of maternal and romantic love. Neuroimage, 21(3), 1155–66. Conclusions Blakemore, R.L., Rieger, S.W., Vuilleumier, P. (2016). Negative Our results characterize the distinct and distributed neural sig- emotions facilitate isometric force through activation of pre- natures of multiple emotional states. Different emotions result frontal cortex and periaqueductal gray. Neuroimage, 124(Pt A), from differential activation patterns within a shared neural cir- 627–40. cuitry, mostly consisting of midline regions, motor areas and Buckner, R.L., Carroll, D.C. (2007). Self-projection and the brain. subcortical regions. The more similar the neural underpinnings Trends in Cognitive Science, 11(2), 49–57. of these emotions, the more similarly they also are experienced. Chandrasekhar, P.V., Capra, C.M., Moore, S., Noussair, C., Berns, We suggest that the relative engagement of different parts of G.S. (2008). Neurobiological regret and rejoice functions for this system defines the current emotional state. aversive outcomes. Neuroimage, 39(3), 1472–84. Chikazoe, J., Lee, D.H., Kriegeskorte, N., Anderson, A.K. (2014). Population coding of affect across stimuli, modalities and indi- Supplementary data viduals. Nature Neuroscience, 17(8), 1114–22. Colibazzi, T., Posner, J., Wang, Z., et al. (2010). Neural systems Supplementary data are available at SCAN online. subserving valence and arousal during the experience of induced emotions. Emotion, 10(3), 377–89. Coricelli, G., Critchley, H.D., Joffily, M., O’Doherty, J.P., Sirigu, A., Acknowledgements Dolan, R.J. (2005). Regret and its avoidance: a neuroimaging study of choice behavior. Nature Neuroscience, 8(9), 1255–62. We acknowledge the computational resources provided by Costa, V.D., Lang, P.J., Sabatinelli, D., Versace, F., Bradley, M.M. the Aalto Science-IT project. We thank Marita Kattelus for (2010). Emotional imagery: assessing pleasure and arousal in her help in fMRI data collection and Matthew Hudson for a the brain’s reward circuitry. Human Brain Mapping, 31(9), language check of the manuscript. 1446–57. Critchley, H.D., Rotshtein, P., Nagai, Y., O’Doherty, J., Mathias, C.J., Dolan, R.J. (2005). Activity in the human brain predicting Funding differential heart rate responses to emotional facial expres- sions. Neuroimage, 24(3), 751–62. This study was supported by the aivoAALTO project of the Cunningham, W.A. (2013). Introduction to special section: psy- Aalto University, Academy of Finland (#265917 and #294897 chological constructivism. Emotion Review, 5(4), 333–4. to L.N., and #138145 and #276643 to I.P.J.), ERC Starting Grant Damasio, A.R. (1999). The Feeling of What Happens: Body and (#313000 to L.N.); Finnish Cultural Foundation (#00140220 to Emotion in the Making of Consciousness. San Diego: Houghton H.S.); and the Swiss National Science Foundation National Mifflin Harcourt. Center of Competence in Research for Affective Sciences Damasio, A.R., Grabowski, T.J., Bechara, A., et al. (2000). (#51NF40-104897 to P.V.). Subcortical and cortical brain activity during the feeling of Conflict of interest. None declared. self-generated emotions. Nature Neuroscience, 3(10), 1049–56. Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 H. Saarima ¨ki et al. | 481 Kober, H., Barrett, L.F., Joseph, J., Bliss-Moreau, E., Lindquist, K., Damasio, A.R., Carvalho, G.B. (2013). The nature of feelings: evo- lutionary and neurobiological origins. Nature Reviews Wager, T.D. (2008). Functional grouping and Neuroscience, 14(2), 143–52. cortical-subcortical interactions in emotion: a meta-analysis de Gelder, B., Snyder, J., Greve, D., Gerard, G., Hadjikhani, N. of neuroimaging studies. Neuroimage, 42(2), 998–1031. (2004). Fear fosters flight: a mechanism for fear contagion Kohler, E., Keysers, C., Umilta, M.A., Fogassi, L., Gallese, V., when perceiving emotion expressed by a whole body. Rizzolatti, G. (2002). Hearing sounds, understanding actions: Proceedings of the National Academy of Sciences of the USA, 101(47), action representation in mirror neurons. Science, 297(5582), 16701–6. 846–8. Edelstein, R.S., Shaver, P.R. (2007). A cross-cultural examination Kotz, S.A., Kalberlah, C., Bahlmann, J., Friederici, A.D., Haynes, of lexical studies of self-conscious emotions. In: Tracy, J.L., J.D. (2013). Predicting vocal emotion expressions from the Robins, R.W., Tangney, J.P., editors. The Self-Conscious Emotions: human brain. Human Brain Mapping, 34(8), 1971–81. Theory and Research. New York: Guilford Press. 194–208. Kragel, P.A., LaBar, K.S. (2013). Multivariate pattern classification Eklund, A., Nichols, T.E., Knutsson, H. (2016). Cluster failure: why reveals autonomic and experiential representations of discrete fMRI inferences for spatial extent have inflated false-positive emotions. Emotion, 13(4), 681–90. Kragel, P.A., LaBar, K.S. (2014). Advancing emotion theory with rates. Proceedings of the National Academy of Sciences of the USA, 113(28), 7900–5. multivariate pattern classification. Emotion Review, 6(2), 160–74. Ekman, P. (1992). An argument for basic emotions. Cognition Kragel, P.A., LaBar, K.S. (2015). Multivariate neural biomarkers of Emotion, 6(3–4), 169–200. emotional states are categorically distinct. Social Cognitive and Ekman, P. (1999). Facial expressions. In: Dalgleish, T., Power. M., Affective Neuroscience, 10(11), 1437–48. editors. Handbook of Cognition and Emotion. New York: John Kragel, P.A., LaBar, K.S. (2016). Decoding the nature of emotion in Wiley & Sons Ltd. 301–20. the brain. Trends in Cognitive Science, 20(6), 444–55. Ekman, P., Cordaro, D. (2011). What is meant by calling emotions Kreibig, S.D. (2010). Autonomic nervous system activity in emo- basic. Emotion Review, 3(4), 364–70. tion: a review. Biological Psychology, 84(3), 394–421. Eryilmaz, H., Van De Ville, D., Schwartz, S., Vuilleumier, P. (2011). Kringelbach, M.L., Rolls, E.T. (2004). The functional neuroanat- Impact of transient emotions on functional connectivity dur- omy of the human orbitofrontal cortex: evidence from neuroi- ing subsequent resting state: a wavelet correlation approach. maging and neuropsychology. Progress in Neurobiology, 72(5), Neuroimage, 54(3), 2481–91. 341–72. Ethofer, T., Van De Ville, D., Scherer, K., Vuilleumier, P. (2009). Lamm, C., Singer, T. (2010). The role of anterior insular cortex in Decoding of emotional information in voice-sensitive cortices. social emotions. Brain Structure and Function, 214(5–6), 579–91. Current Biology, 19(12), 1028–33. Lewis, M.D., Liu, Z.X. (2011). Three time scales of neural Fischer, A., LaFrance, M. (2015). What drives the smile and the self-organization underlying basic and nonbasic emotions. tear: why women are more emotionally expressive than men. Emotion Review, 3(4), 416–23. Emotion Review, 7(1), 22–9. Lindquist, K.A., Wager, T.D., Kober, H., Bliss-Moreau, E., Barrett, Frijda, N.H., Kuipers, P., Ter Schure, E. (1989). Relations among L.F. (2012). The brain basis of emotion: a meta-analytic review. emotion, appraisal, and emotional action readiness. Journal of Behavioral and Brain Science, 35(3), 121–43. Personality and Social Psychology, 57(2), 212–28. Linnman, C., Moulton, E.A., Barmettler, G., Becerra, L., Borsook, Ganis, G., Thompson, W.L., Kosslyn, S.M. (2004). Brain areas D. (2012). Neuroimaging of the periaqueductal gray: state of underlying visual mental imagery and visual perception: an the field. Neuroimage, 60(1), 505–22. fMRI study. Cognitive Brain Research, 20(2), 226–41. Mazzola, V., Vuilleumier, P., Latorre, V., et al. (2013). Effects of Grossman, M., Wood, W. (1993). Sex differences in intensity of emotional contexts on cerebello-thalamo-cortical activity dur- emotional experience: a social role interpretation. Journal of ing action observation. PLoS ONE, 8(9), e75912. Personality and Social Psychology, 65(5), 1010–22. Meaux, E., Vuilleumier, P. 2015. Emotion perception and elicit- Hamann, S. (2012). Mapping discrete and dimensional emotions ation. In: Toga, A.W., editor. Brain Mapping: An Encyclopedic onto the brain: controversies and consensus. Trends in Reference. Oxford (UK), Elsevier. Cognitive Science, 16(9), 458–66. Northoff, G., Bermpohl, F. (2004). Cortical midline structures and Hofer, A., Siedentopf, C.M., Ischebeck, A., et al. (2006). Gender dif- the self. Trends in Cognitive Science, 8(3), 102–7. ferences in regional cerebral activity during the perception of Northoff, G., Heinzel, A., de Greck, M., Bermpohl, F., Dobrowolny, emotion: a functional MRI study. Neuroimage, 32(2), 854–62. H., Panksepp, J. (2006). Self-referential processing in our Holmes, E., Mathews, A. (2005). Mental imagery and emotion: a brain—a meta-analysis of imaging studies on the self. special relationship?. Emotion, 5(4), 489–97. Neuroimage, 31(1), 440–57. Jenkinson, M., Bannister, P.R., Brady, J.M., Smith, S.M. (2002). Nummenmaa, L., Hirvonen, J., Parkkola, R., Hietanen, J.K. (2008). Improved optimization for the robust and accurate linear Is emotional contagion special? An fMRI study on neural sys- registration and motion correction of brain images. tems for affective and cognitive empathy. Neuroimage, 43(3), Neuroimage, 17(2), 825–41. 571–80. Jenkinson, M., Beckmann, C.F., Behrens, T.E.J., Woolrich, M.W., Nummenmaa, L., Glerean, E., Viinikainen, M., Jaaskelainen, I.P., Smith, S.M. (2012). FSL. Neuroimage, 62(2), 782–90. Hari, R., Sams, M. (2012). Emotions promote social interaction Kane, M.J., Engle, R.W. (2002). The role of prefrontal cortex in by synchronizing brain activity across individuals. Proceedings working-memory capacity, executive attention, and general of the National Academy of Sciences of USA, 109(24), 9599–604. fluid intelligence: an individual-differences perspective. Nummenmaa, L., Glerean, E., Hari, R., Hietanen, J.K. (2014a). Psychon B Review, 9(4), 637–71. Bodily maps of emotions. Proceedings of the National Academy of Kassam, K.S., Markey, A.R., Cherkassky, V.L., Loewenstein, G., Sciences of the USA, 111(2), 646–51. Just, M.A. (2013). Identifying emotions on the basis of neural Nummenmaa, L., Saarima ¨ ki, H., Glerean, E., Gotsopoulos, A., Hari, R., Sams, M. (2014b). Emotional speech synchronizes activation. PLoS One, 8(6), e66032. Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018 482 | Social Cognitive and Affective Neuroscience, 2018, Vol. 13, No. 5 and prefrontal cortex on recognizing facial expressions of brains across listeners and engages large-scale dynamic brain networks. Neuroimage, 102, 498–509. complex emotions. Journal of Cognitive Neuroscience, 17(9), Nummenmaa, L., Saarima ¨ ki, H. (2017). Emotions as discrete pat- 1410–9. terns of systemic activity. Neuroscience Letters, doi: 10.1016/ Simon-Thomas, E.R., Godzik, J., Castle, E., et al. (2012). An j.neulet.2017.07.012. fMRI study of caring vs self-focus during induced compassion Panksepp, J. (1982). Toward a general psychobiological theory of and pride. Social Cognitive and Affective Neuroscience, 7(6), emotions. Behavioral and Brain Science, 5(03), 407–22. 635–48. Panksepp, J., Watt, D. (2011). What is basic about basic emotions? Smith, S.M. (2002). Fast robust automated brain extraction. Lasting lessons from affective neuroscience. Emotion Review, Human Brain Mapping, 17(3), 143–55. 3(4), 387–96. Takahashi, H., Matsuura, M., Koeda, M., et al. (2008). Brain activa- Peelen, M.V., Atkinson, A.P., Vuilleumier, P. (2010). Supramodal tions during judgments of positive self-conscious emotion and representations of perceived emotions in the human brain. positive basic emotion: pride and joy. Cerebral Cortex, 18(4), Journal of Neuroscience, 30(30), 10127–34. 898–903. Pichon, S., de Gelder, B., Grezes, J. (2008). Emotional modulation Toivonen, R., Kivela ¨ , M., Sarama ¨ ki, J., Viinikainen, M., Vanhatalo, of visual and motor areas by dynamic body expressions of M., Sams, M. (2012). Networks of emotion concepts. PLoS One, anger. Society for Neuroscience, 3(3–4), 199–212. 7(1), e28883. Poldrack, R.A., Wagner, A.D., Prull, M.W., Desmond, J.E., Glover, Trost, W., Ethofer, T., Zentner, M., Vuilleumier, P. (2012). G.H., Gabrieli, J.D. (1999). Functional specialization for seman- Mapping aesthetic musical emotions in the brain. Cerebral tic and phonological processing in the left inferior prefrontal Cortex, 22(12), 2769–83. cortex. Neuroimage, 10(1), 15–35. Vuilleumier, P., Trost, W. (2015). Music and emotions: from en- Polyn, S.M., Natu, V.S., Cohen, J.D., Norman, K.A. (2005). chantment to entrainment. Annals of the New York Academy of Category-specific cortical activity precedes retrieval during Sciences, 1337, 212–22. memory search. Science, 310(5756), 1963–6. Vytal, K., Hamann, S. (2010). Neuroimaging support for discrete Reyes-Vargas, M., Sa ´ nchez-Gutie ´ rrez, M., Fuginer, L., et al. (2013). neural correlates of basic emotions: a voxel-based meta-ana- Hierarchical clustering and classification of emotions in lysis. Journal of Cognitive Neuroscience, 22(12), 2864–85. human speech using confusion matrices. In: International Wagner, U., N’Diaye, K., Ethofer, T., Vuilleumier, P. (2011). Conference on Speech and Computer. Springer International Guilt-specific processing in the prefrontal cortex. Cerebral Publishing. 162–9. Cortex, 21(11), 2461–70. Rosvall, M., Bergstrom, C.T. (2010). Mapping change in large net- Wang, S., Tudusciuc, O., Mamelak, A.N., Ross, I.B., Adolphs, R., works. PLoS One, 5(1), e8694. Rutishauser, U. (2014). Neurons in the human amygdala select- Russell, J.A. (2003). Core affect and the psychological construc- ive for perceived emotion. Proceedings of the National Academy of tion of emotions. Psychological Review, 110(1), 145–72. Sciences of the USA, 111(30), E3110–9. Saarima ¨ ki, H., Gotsopoulos, A., Ja ¨a ¨ skela ¨ inen, I.P., et al. (2016). Wicker, B., Keysers, C., Plailly, J., Royet, J.P., Gallese, V., Discrete neural signatures of basic emotions. Cerebral Cortex, Rizzolatti, G. (2003). Both of us disgusted in my insula: the 26(6), 2563–73. common neural basis of seeing and feeling disgust. Neuron, Said, C.P., Moore, C.D., Engell, A.D., Todorov, A., Haxby, J.V., 40(3), 655–64. (2010). Distributed representations of dynamic facial expres- Zahn, R., Moll, J., Paiva, M., et al. (2009). The neural basis of sions in the superior temporal sulcus. Journal of Vision, 1, 11. human social values: evidence from functional MRI. Cerebral Schutter, D.J.L.G., van Honk, J. (2009). The cerebellum in emotion Cortex, 19(2), 276–83. regulation: a repetitive transcranial magnetic stimulation Zhang, Y., Brady, M., Smith, S. (2001). Segmentation of brain MR study. Cerebellum, 8(1), 28–34. images through a hidden Markov random field model and the Shaw, P., Bramham, J., Lawrence, E.J., Morris, R., Baron-Cohen, S., expectation-maximization algorithm. IEEE Transactions of the David, S. (2005). Differential effects of lesions of the amygdala Medical Imaging, 20(1), 45–57. Downloaded from https://academic.oup.com/scan/article-abstract/13/5/471/4956228 by Ed 'DeepDyve' Gillespie user on 21 June 2018

Journal

Social Cognitive and Affective NeuroscienceOxford University Press

Published: May 16, 2018

There are no references for this article.