Cognitive Brain Research 11 (2001) 213–226
www.elsevier.com/locate/bres

Research report

Neural substrates of facial emotion processing using fMRI

Marilyn L. Kesler/West (b,e), Anders H. Andersen (b,c), Charles D. Smith (b,d,e), Malcolm J. Avison (b,d,f), C. Ervin Davis (a,h), Richard J. Kryscio (e,g), Lee X. Blonder (a,d,e,*)

a Department of Behavioral Science, University of Kentucky, Lexington, KY 40536, USA
b Magnetic Resonance Imaging and Spectroscopy Center, University of Kentucky, Lexington, KY 40536, USA
c Department of Anatomy and Neurobiology, University of Kentucky, Lexington, KY 40536, USA
d Department of Neurology, University of Kentucky, Lexington, KY 40536, USA
e Sanders-Brown Center on Aging, University of Kentucky, Lexington, KY 40536, USA
f Department of Biochemistry, University of Kentucky, Lexington, KY 40536, USA
g Department of Statistics, University of Kentucky, Lexington, KY 40536, USA
h Department of Psychology, University of Kentucky, Lexington, KY 40536, USA

Accepted 24 October 2000

Abstract

We identified human brain regions involved in the perception of sad, frightened, happy, angry, and neutral facial expressions using functional magnetic resonance imaging (fMRI). Twenty-one healthy right-handed adult volunteers (11 men, 10 women; aged 18–45; mean age 21.6 years) participated in four separate runs, one for each of the four emotions. Participants viewed blocks of emotionally expressive faces alternating with blocks of neutral faces and scrambled images. In comparison with scrambled images, neutral faces activated the fusiform gyri, the right lateral occipital gyrus, the right superior temporal sulcus, the inferior frontal gyri, and the amygdala/entorhinal cortex. In comparisons of emotional and neutral faces, we found that (1) emotional faces elicit increased activation in a subset of cortical regions involved in neutral face processing and in areas not activated by neutral faces; (2) differences in activation as a function of emotion category were most evident in the frontal lobes; (3) men showed a differential neural response depending upon the emotion expressed but women did not. © 2001 Elsevier Science B.V. All rights reserved.

Theme: Neural basis of behavior

Topic: Motivation and emotion

Keywords: Functional imaging; Facial expression; Emotional processing

1. Introduction

The ability to decode emotional information on the face is critical to human communication. Clinical and experimental research in humans and in non-human primates suggests right hemisphere specialization [10–12,22,26,35,47,53,54]. Electrical stimulation, lesion, and microelectrode recording studies have identified several intrahemispheric brain regions involved in facial affect processing. These regions include the right temporal lobe [27,55,63], the right or left basal ganglia and anterior temporal lobe [16], the right mesial occipital and right inferior parietal regions [2], the right somatosensory cortex [1], and the amygdala [3,4,17,72].

Neuroimaging studies of facial emotion perception suggest that there may be two parallel pathways, one for conscious processing of stimuli and one for unconscious processing of simple and associative stimulus features [45,52]. Several imaging studies have implicated the amygdala in the processing of both seen and unseen fearful faces in humans [6,14,49–51,58,59,70]. Morris et al. [52] identified a subcortical pathway from the midbrain and thalamus to the right amygdala that is involved in the processing of unseen visual events (e.g. fear-conditioned faces). Their data suggest that fusiform and orbitofrontal regions may be components of a pathway subserving conscious identification.

*Corresponding author. 101 Sanders-Brown Center on Aging, University of Kentucky, Lexington, KY 40536-0230, USA. Tel.: +1-606-257-1412, ext. 271; fax: +1-606-323-2866. E-mail address: [email protected] (L.X. Blonder).

0926-6410/01/$ – see front matter © 2001 Elsevier Science B.V. All rights reserved.
PII: S0926-6410(00)00073-2


In a more recent functional magnetic resonance imaging (fMRI) study, Critchley et al. [20] found that explicit processing of facial expressions of emotion activated temporal cortex, in comparison to implicit processing, which evoked significantly greater activity in the amygdala.

While investigators are beginning to map out the pathways engaged in conscious and unconscious emotional processing, little is known about how these pathways compare across different emotions, especially the pathways involved in conscious processing. Many studies, such as those by Blair et al. [8], Morris et al. [49], Phillips et al. [58], and Sprengelmeyer et al. [67], required subjects to perform a gender discrimination task with a button-press response when emotional faces were presented. Thus attention is directed away from the emotions, and the regions activated are likely to differ from those engaged by fully conscious emotional face processing [20].

Only a few neuroimaging studies have examined the response to more than one facial emotion without using a backward masking paradigm or a task unrelated to emotional face perception. Using fMRI, Phillips et al. [57] found that happy face processing was associated with increased activity in the left anterior cingulate, bilateral medial frontal region, bilateral posterior cingulate, left supramarginal gyrus, and right putamen. They did not find any areas to be significantly more active in comparisons between sad and neutral faces. Also using fMRI, Breiter et al. [14] found bilateral activation of the amygdala and fusiform gyrus associated with viewing fearful versus neutral faces. They also found activation of the left amygdala and bilateral fusiform in comparisons of happy and neutral faces. In a 133Xe inhalation study, Gur et al. [33] found that happy and sad emotional face discrimination tasks activated the right parietal lobe more than did an age discrimination task. Furthermore, the happy discrimination task activated the left frontal region relative to the sad discrimination task. In sum, the literature identifying brain regions engaged in the conscious processing of different facial emotions is limited and inconclusive.

Our fMRI study was designed to investigate explicit processing of happy, sad, angry, frightened, and neutral faces by normal volunteers. Subjects were instructed to 'concentrate on each person's expression' when viewing a series of emotionally expressive faces. We did not require a behavioral response such as a button press because we did not wish to force subjects to engage in an artificial decision-making task [60]. For each subject, four separate runs corresponding to the four different emotions were presented. We predicted that activations observed during the emotion conditions, over and above those seen during the neutral condition, would occur in the left and right fusiform gyri and in the left and right amygdalae, and that these effects would be greater in the right hemisphere than in the left. We also sought to compare and contrast the brain regions involved in the perception of the four different emotions, although the existing literature did not permit us to make specific predictions in this regard.

2. Materials and methods

2.1. Subjects

Twenty-one healthy adult volunteers (11 men, 10 women; aged 18–45; mean age 21.6 years) gave informed consent under an institutionally approved protocol. All participants were right-handed and reported no first-degree left-handed biological relatives. Exclusions were current cigarette smoking, visual acuity poorer than 20/25, or medical conditions that could affect the central nervous system, as determined by a board-certified neurologist (CDS).

2.2. Visual stimuli

Stimuli consisted of black and white still photographs of the faces of approximately 125 individuals expressing happy, sad, angry, frightened, and neutral emotions. We used published sets [13,25,38] and additional photographs shot and standardized in our laboratory. The latter were included in the experiment if at least 75% correct classification of emotional expression was achieved in a group of independent volunteer raters who were not study subjects. Subjects were pre-exposed to the types of stimuli but not the specific faces that were used during the fMRI experiment. The face stimuli were grouped by expression type into blocks of nine, each block containing three to four men and five to six women, one of which was African-American, and one of which was Asian-American. Each block of expressive faces was matched to a block of neutral faces, with seven out of nine of the same individuals included in both blocks. Each expressive face was presented only once.

For the control condition, 'mosaic' or scrambled images were derived from the expressive and neutral face images using Matlab and Adobe Photoshop 4.0 software. The original image, which had a resolution of 72–75 pixels per inch, was parcellated into 14×14 pixel rectangles of uniform intensity with a low-pass filter. A linear histogram stretch was used to restore the contrast, and the rectangle locations within the image were then randomized. This choice of baseline was made to control for the early visual operations involved in the face-processing conditions, so that we might examine what is specific to the processing of faces. The scrambled images did not contain discernible facial features. The size and overall brightness of each scrambled image were equal to the size and brightness of the face image from which it was derived. We did not equate the local contrast or the spatial frequency distribution of the scrambled image to the original.

2.3. Presentation apparatus

Stimuli were displayed in sequence using MacStim software (David Darby) onto a rear-projection screen placed at the foot of the MR scanner bed.
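The mosaic construction described in Section 2.2 can be sketched as follows. This is a minimal numpy approximation, not the authors' Matlab/Photoshop pipeline: only the 14×14 rectangle size comes from the text, while the block-mean as the low-pass step, the min–max form of the linear stretch, and the demo image are assumptions.

```python
# Sketch of the 'mosaic' scrambling (an approximation; the exact filter
# and stretch used by the authors are not specified in the text).
import numpy as np

def mosaic_scramble(img, block=14, rng=None):
    """Parcellate a grayscale image into block x block rectangles of
    uniform intensity, restore contrast with a linear histogram
    stretch, and randomize the rectangle locations."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = img.shape
    h, w = h - h % block, w - w % block          # crop to a whole number of blocks
    tiles = (img[:h, :w]
             .reshape(h // block, block, w // block, block)
             .mean(axis=(1, 3)))                 # uniform intensity per rectangle (low-pass)
    lo, hi = tiles.min(), tiles.max()
    tiles = (tiles - lo) / (hi - lo + 1e-12)     # linear histogram stretch to restore contrast
    flat = tiles.reshape(-1)
    rng.shuffle(flat)                            # randomize rectangle locations
    scrambled = flat.reshape(tiles.shape)
    return np.kron(scrambled, np.ones((block, block)))  # re-expand blocks to full size

face = np.random.default_rng(0).random((75, 75))  # stand-in for a face photograph
mosaic = mosaic_scramble(face)                    # no facial features survive the shuffle
```

The shuffle preserves the set of block intensities, so overall brightness is retained up to the contrast stretch, matching the paper's stated goal of controlling early visual operations.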


Subjects viewed this screen from within the bore of the magnet by means of a mirror placed on the head coil approximately 4 inches from the subject's face. The images subtended a visual angle of 5° horizontal and 6° vertical.

2.4. Paradigm

When viewing a scrambled image, the subject was instructed to focus on the image and to try not to think about other things. When viewing a face, the subject was instructed to concentrate on each person's expression. Instructions were given just before the start of every run. Images were displayed continuously for 4 s each.

At the start of every run, seven scrambled images followed by seven neutral faces were presented as a 'sham' cycle; the corresponding MR scans were discarded. Subjects were then presented with alternating, 36-s epochs of nine scrambled images, nine neutral faces, and nine emotionally expressive faces. The order of the 36-s epochs was counterbalanced both within and across subjects. Twelve 36-s epochs partitioned into four cycles comprised a 7.2-min run (Fig. 1). Every subject viewed four of these runs, two during day one and two during day two. Each run contained a single emotion (happy, sad, angry, or frightened) along with neutral and scrambled conditions. The order of the emotion runs was counterbalanced across subjects. The two scanning sessions were approximately 1 week apart.

The third and last run of the second scanning session was a null condition in which a series of 122 consecutive neutral faces were presented for four seconds each. Subjects were instructed to concentrate on each person's expression. This condition was used to sample an fMRI time series that was influenced by physiological rhythms, e.g. respiration or heart rate, but not influenced by exposure to periodically alternating tasks. The null condition was used to evaluate the rate of false positive activations.

2.5. Subject debriefing

Immediately after the second scanning session, subjects were asked a series of open-ended questions about their experience in the scanner. All subjects then performed an emotion labeling task. Every face that had been shown during the four emotion runs was singly presented. Subjects were asked to categorize each face as neutral, happy, sad, angry, afraid, 'other,' or 'don't know' and to rate the intensity of each expression on a five-point scale. The percent correct and the mean intensity rating were tabulated for each emotion in each subject. The mean intensity was computed using only the correctly labeled faces.
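The debriefing tabulation (percent correct per emotion, and mean intensity computed over correctly labeled faces only) can be sketched as below. The record format and the demo responses are hypothetical, not the study data.

```python
# Sketch of the Section 2.5 scoring: per-emotion percent correct, and
# mean intensity computed only over correctly labeled faces.
from collections import defaultdict

def tabulate(responses):
    """responses: list of (true_emotion, chosen_label, intensity_1_to_5)."""
    correct = defaultdict(int)
    total = defaultdict(int)
    intensities = defaultdict(list)
    for true_emo, label, intensity in responses:
        total[true_emo] += 1
        if label == true_emo:
            correct[true_emo] += 1
            intensities[true_emo].append(intensity)  # counted only when correctly labeled
    return {
        emo: (100.0 * correct[emo] / total[emo],
              sum(intensities[emo]) / len(intensities[emo]) if intensities[emo] else None)
        for emo in total
    }

demo = [("happy", "happy", 4), ("happy", "happy", 3),
        ("sad", "angry", 2), ("sad", "sad", 3)]
print(tabulate(demo))  # {'happy': (100.0, 3.5), 'sad': (50.0, 3.0)}
```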

Fig. 1. Stimulus and presentation paradigm examples. Scrambled images and black and white photographs of faces were displayed for 4 s each. Part A of this figure shows a scrambled image (left), a neutral face (middle), and an angry face (right). The stimuli were grouped into 36-s blocks of nine scrambled images (S), nine neutral faces (N), and nine angry faces (A) during the angry emotion run. One example of the blocked presentation for an entire run is depicted in Part B. To counterbalance within-run order effects, other sequences were also used.
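The run arithmetic from Section 2.4 and Fig. 1 can be checked directly. The condition codes mirror the S/N/A labels in the caption; only one of the counterbalanced orders is shown.

```python
# One run from Section 2.4 / Fig. 1: a discarded sham cycle of seven
# scrambled images and seven neutral faces, then twelve 36-s epochs
# (four S-N-A cycles) of nine 4-s images each.
IMAGE_S = 4                                  # seconds per image
sham = ["S"] * 7 + ["N"] * 7                 # scans for these 14 images are discarded
cycle = ["S"] * 9 + ["N"] * 9 + ["A"] * 9    # one three-epoch cycle (angry run shown)
run = cycle * 4                              # twelve 36-s epochs in four cycles

assert len(cycle) * IMAGE_S == 3 * 36        # three 36-s epochs per cycle
assert len(run) * IMAGE_S == 12 * 36         # 432 s of analyzed data per run
print(len(run) * IMAGE_S / 60)               # 7.2 (minutes per run, as stated)
```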


2.6. Image acquisition

Functional magnetic resonance images were collected on a Siemens Magnetom VISION 1.5 Tesla imaging system using a circularly polarized transmit/receive head coil. Foam padding was used to stabilize head position. Blood oxygen level-dependent (BOLD) signal intensity data were collected from 16 axial, 8-mm-thick slices covering the entire cerebrum and upper cerebellum/brainstem. A T2*-weighted gradient echo EPI sequence with minimal inflow weighting was used with these acquisition parameters: TR/TE = 4000/45 ms, F.A. = 90°, matrix = 128×128, F.O.V. = 256×256 mm. The images were motion corrected with SPM'96 [28]. A 3D MPRAGE sequence (TR/TE/TI = 11.4 ms/4.4 ms/300 ms, F.A. = 8°, 1×1 mm in-plane resolution, slice thickness = 2 mm) was used to collect anatomical images for the localization of functional activity and for the registration of the fMRI data sets to the stereotactic space of Talairach and Tournoux [68]. An anatomical reference image consisting of the mean of the intensity-normalized MPRAGE images from all 21 subjects was used to display group mean activation maps.

2.7. Data analysis

The fMRI data were analyzed with AFNI software using the cross-correlation method [7,19]. For each subject and for each pairing of task states, correlation coefficients were computed voxel by voxel using a box-car reference waveform with no time lag. The first 4 s of each stimulus data block were omitted to allow for BOLD-response rise and fall times. The resulting functional parametric maps contained a value for the mean fractional signal change (ΔS/S) and an associated correlation coefficient (r value) at each voxel. These maps, referenced to each subject's own anatomical images, were then resampled at 2×2×2 mm using cubic spline interpolation and transformed to a Talairach coordinate frame. Spatially normalized and isotropically blurred (Gaussian kernel, σ = 1 mm) activation maps for each subject were then combined to generate maps of the intersubject mean fractional signal changes and associated mean r values (AFNI software; [19]). The use of the fractional signal change (ΔS/S) as the dependent measure of activation ensures that the data are internally referenced to a baseline condition within each run separately, thus making the activation maps insensitive to fluctuations in absolute MR image intensity across runs and scanning sessions [9].

Voxels with mean r values equivalent to t > 3.4 (P < 0.001, uncorrected) were deemed activated and were included in the subsequent cluster analysis. Active regions on the group average maps were identified by applying this threshold and subsequently defining a cluster to contain voxels connected by at least out-of-plane diagonal neighbors (26-connectivity) and to have a minimum volume of 64 mm³. Voxels were included in the cluster analysis only if the EPI image intensity for all 21 subjects at that location exceeded a level corresponding to half the approximate mean intensity of white matter. This criterion ensures adequate signal-to-noise characteristics for the detection of activation and enables us to exclude voxels in regions of low signal intensity due to susceptibility effects. Peak t value, peak location, and average fractional signal change were computed for each cluster.

The choice of threshold was based on a modification of the standard r-to-t conversion [40] that included adjustments to the number of degrees of freedom based on the number of subjects and the effect of spatial smoothing. In addition, the null condition was used to create a histogram showing the distribution of mean r values in the null group image.

An ROI analysis was also performed in order to compare responses between the four emotion conditions. For each subject, a composite image contrasting the mean of the four emotion blocks with the mean of all of the neutral blocks was made ((anger + frightened + happy + sad) with neutral). The composite images from all 21 subjects were then averaged to create an overall group image. Each subject's composite image was used to locate the maximum positive correlation peak for the fusiform gyrus, the superior temporal sulcus, the inferior frontal gyrus, the precentral sulcus, and the medial superior frontal gyrus in each hemisphere. To locate these peaks, a sphere with radius 10 mm centered on the maximum positive correlation peak for each region on the overall group image was used as the search area. Additional search area constraints were based on brain anatomy. For example, voxels in the cerebellum and voxels outside of the brain (or in regions of susceptibility artifact) were excluded from the search area by masking. Masking was based on a combination of signal intensity thresholding (at a level corresponding to half the approximate mean intensity of white matter), manual tracing, and Talairach coordinate locations. Spherical ROIs were defined to contain all isotropically resampled voxels within a distance of 3 mm of the site of peak activation identified in each individual for each region. The mean ΔS/S within each sphere for each separate emotion and neutral contrast was calculated for every region in each subject and used as an observation in an analysis of variance. Mean response was compared in a mixed linear model with sex as the between-subjects factor and with brain region, emotion, and hemisphere as the within-subjects factors. Post hoc comparisons were based on Fisher's protected least significant difference procedure applied to the least-squares means due to missing observations in some brain regions. Satterthwaite's approximation was used to compute the degrees of freedom associated with each t value. To determine the robustness of the results, separate mixed linear models were fitted for each brain region due to heteroscedasticity in the response across regions. All computations were carried out with Procedure Mixed in Version 6.12 of PC-SAS.
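A simplified, single-voxel sketch of the correlation approach in Section 2.7 is given below. It is not the AFNI implementation: it uses the naive n − 2 degrees of freedom rather than the paper's adjusted value, and the synthetic time series and effect sizes are illustrative only.

```python
# Single-voxel sketch of the cross-correlation analysis: correlate the
# time series with a box-car reference (no time lag), report the
# fractional signal change dS/S, and convert r to t.
import math
import numpy as np

def boxcar(n_off, n_on, n_cycles):
    """Box-car reference waveform: n_off 'off' images followed by
    n_on 'on' images, repeated n_cycles times."""
    return np.tile(np.r_[np.zeros(n_off), np.ones(n_on)], n_cycles)

def voxel_stats(ts, ref):
    r = float(np.corrcoef(ts, ref)[0, 1])
    df = len(ts) - 2                        # naive df; the paper adjusts this value
    t = r * math.sqrt(df / (1.0 - r * r))   # standard r-to-t conversion
    baseline = ts[ref == 0].mean()
    ds_over_s = (ts[ref == 1].mean() - baseline) / baseline  # fractional signal change
    return r, t, ds_over_s

ref = boxcar(9, 9, 4)                       # nine-image epochs, four cycles
rng = np.random.default_rng(1)
ts = 100.0 + 0.5 * ref + 0.1 * rng.standard_normal(ref.size)  # synthetic voxel series
r, t, ds = voxel_stats(ts, ref)
# the paper deems a voxel activated when its mean r corresponds to t > 3.4 (P < 0.001)
print(t > 3.4)  # True
```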


Statistical significance was determined at the 0.05 level throughout the analysis.

For the analysis of the post-fMRI emotion labeling data, mean response was compared by constructing a mixed linear model in PC-SAS with sex as the between-subjects factor and emotion as the within-subjects factor. The analysis procedure was otherwise like the procedure described above for ROIs with fMRI data.

3. Results

3.1. Behavioral data and subject debriefing

All subjects performed an emotion labeling task immediately after the second of two fMRI scanning sessions. Every face that had been shown during the four emotion runs was singly presented. Subjects were asked to categorize each face as neutral, happy, sad, angry, afraid, 'other,' or 'don't know' and to rate the intensity of each expression on a five-point scale. Three analysis-of-variance F-tests based on data from all but the neutral faces showed that emotion category had a significant effect on the percent correct (F(3,57) = 8.14, P < 0.0001) and the mean intensity rating (F(3,57) = 47.5, P < 0.0001). We found no significant main effects of sex or significant sex-by-emotion interactions (see Table 1).

To further understand the effect of emotion, post hoc comparisons were done with Fisher's least significant difference procedure. Pairwise comparisons indicated no significant differences between the percent of angry faces correctly identified and the other emotions, but happy faces were more accurately identified than frightened (t(57) = 3.54, P = 0.0008) and sad faces (t(57) = 4.56, P = 0.0001). For the mean intensity rating, all pairs of emotions differed significantly with the exception of angry and frightened.

Subjects gave varied responses to the post-scan question: 'Did the faces evoke any emotion in you?' Approximately 48% of subjects said 'yes', 29% said 'a little bit', and 19% said 'no'. For the happy condition, almost all subjects who responded 'yes' or 'a little bit' reported that the evoked emotion was in the same category as the emotion displayed on the face. This association was not apparent for the angry condition, the frightened condition, or the sad condition.

Table 1
Behavioral performance during the post-fMRI emotion labeling tasks

Face type                               | Mean percent correct ± S.E.M. | Mean intensity ± S.E.M. on a 1–5 scale
Angry faces                             | 92.0 ± 2.1                    | 3.3 ± 0.12 (c,d)
Frightened faces                        | 84.8 ± 3.8 (c)                | 3.4 ± 0.13 (c,d)
Happy faces                             | 98.8 ± 0.59 (b,d)             | 3.7 ± 0.11 (a,b,d)
Sad faces                               | 80.7 ± 3.8 (c)                | 2.7 ± 0.11 (a,b,c)
Neutral faces                           | 61.6 ± 3.0                    | –
  8.5% of neutrals rated as angry       |                               | 1.4 ± 0.12
  0.3% of neutrals rated as frightened  |                               | 1.0 ± 0.0
  15.4% of neutrals rated as happy      |                               | 1.3 ± 0.06
  9.9% of neutrals rated as sad         |                               | 1.5 ± 0.11

Subjects categorized the facial expressions and also rated the intensity of the expressions on a 1–5 scale, with 1 representing slightly intense and 5 extremely intense. Means across subjects are reported, with mean intensity computed using only those faces that were placed in the correct emotion category. S.E.M., standard error of the mean. (a) Significantly different from angry; (b) significantly different from frightened; (c) significantly different from happy; (d) significantly different from sad.

3.2. fMRI activation mapping

Table 2 shows the results of a cluster analysis performed on the group mean correlation map comparing neutral faces to scrambled images (derived from four runs per subject × 21 subjects = 84 runs). The table lists clusters within the brain that had a significantly higher image intensity in the neutral than in the scrambled condition and that had a volume ≥ 64 mm³, corresponding to eight isotropically resampled voxels. The processing of neutral faces engaged bilateral portions of the fusiform gyrus, the amygdala/entorhinal cortex, and the inferior frontal gyrus (see Fig. 2a). In the right hemisphere, neutral faces engaged a portion of the superior temporal sulcus (STS) and the inferior occipital gyrus. Fig. 3 indicates the cluster locations and spatial extent on the cortical surface and inferomedial regions.

Table 3 shows the results of a similar cluster analysis performed on the group mean correlation map comparing each emotion to the neutral face condition. Clusters with a volume ≥ 64 mm³ are listed, and all showed more activity during the emotion condition than during the neutral condition. Angry and sad conditions both activated fusiform regions (see Fig. 2b,e), and the angry condition evoked more activation in the right hemisphere. The angry and frightened conditions activated the left inferior frontal gyrus.

Some areas that did not activate in comparing neutral faces with scrambled images were nevertheless involved in emotional face processing. Angry faces activated a medial region of the superior frontal gyrus, and happy faces activated medial frontal cortex at or just anterior to the cingulate sulcus. The group activation was checked by visual inspection of the corresponding (x,y,z) location on the MRIs of individual subjects to ascertain whether this location was within areas of relatively high signal intensity as well as in areas that did not show characteristic patterns of motion artifact.

No activation clusters were found in the null condition, which compared neutral to neutral face processing. These runs were also used to obtain a separate estimate of the P value associated with the r-value threshold used in the cluster analysis.


Table 2
Anatomical locations and characteristics of clusters showing more activity in the neutral face condition than in the scrambled image condition (neutral versus scrambled) (a)

Region name                      | Brodmann area | Peak x, y, z (mm) | Peak t value | Mean % signal change at peak | Cluster size (mm³)
Right fusiform gyrus             | 37            | 39, -47, -12      | 12.1         | 0.74                         | 12528
                                 | 20            | 37, -31, -10      | 4.8          | 0.19                         | 120
Left fusiform gyrus              | 37            | -39, -53, -12     | 12.4         | 0.40                         | 2808
                                 | 19/37         | -37, -65, -8      | 5.0          | 0.17                         | 104
Right anteromedial temporal lobe (amygdala/entorhinal cortex) | | 17, -7, -8 | 7.2 | 0.32                         | 712
Right superior temporal sulcus   | 21/22         | 55, -45, 14       | 5.1          | 0.13                         | 560
Right inferior occipital and lingual gyri | 18   | 31, -93, 0        | 5.2          | 0.75                         | 480
Left anteromedial temporal lobe (amygdala/entorhinal cortex) |  | -17, -9, -8 | 5.2 | 0.25                         | 352
Right inferior frontal gyrus     | 44/45         | 49, 25, 24        | 5.1          | 0.11                         | 280
                                 | 8/44          | 39, 11, 34        | 4.6          | 0.11                         | 120
Right superior temporal sulcus   | 21/22         | 45, -33, 6        | 5.0          | 0.14                         | 280
Right fusiform (temporal)        | 20            | 33, -7, -28       | 4.9          | 0.24                         | 216
Left inferior frontal gyrus      | 44/45         | -47, 29, 6        | 4.7          | 0.12                         | 176
Right inferior frontal gyrus     | 47            | 39, 27, -12       | 4.3          | 0.27                         | 128
Left lingual gyrus               | 18            | -15, -95, -8      | 4.3          | 0.61                         | 104
Right angular gyrus              | 39            | 51, -65, 32       | 4.9          | 0.46                         | 104
Left inferior frontal gyrus      | 47            | -39, 23, -14      | 4.5          | 0.19                         | 72

(a) Voxels of the group mean activation map with a t value > 3.4 (P < 0.001, uncorrected) were included in the cluster analysis. The list is ranked by cluster size. Locations are in the coordinate system of Talairach and Tournoux [68]. The corresponding Brodmann areas are approximate.

(based on the number of time-series images acquired, the number of subjects, and the effect of spatial smoothing) was 0.001, a relative frequency analysis of the group mean null data set indicated an even lower probability of 0.0001 for the Type I error.

3.3. Region of interest analysis

To test for differences in activation between the emotions, an automated region of interest (ROI) analysis was performed (see Section 2.7). Five ROIs, positioned on the fusiform gyrus, the inferior frontal gyrus, the precentral sulcus, the medial part of the superior frontal gyrus, and the superior temporal sulcus in both the left and right hemispheres, were included. A mixed linear model for an unbalanced repeated-measures experiment with one between-subjects factor (sex) and three within-subjects factors (hemisphere, region, and emotion) was used to analyze responses for the endpoint, percent signal change. We found a significant main effect of region (F(4,74.1) = 3.96, P = 0.006), a region-by-emotion interaction (F(12,221) = 2.26, P = 0.01), and a sex-by-emotion-by-hemisphere interaction (F(3,79.3) = 3.35, P = 0.02).

To further understand the region-by-emotion interaction, we performed post hoc comparisons in which the inferior frontal gyrus and the medial frontal region showed significant differences between the emotions. In the inferior frontal gyrus region, the mean activation for frightened was 0.24% (±0.02 S.E.M.) greater than for happy and 0.21% (±0.025 S.E.M.) greater than for sad. In the medial frontal region, the mean activation for angry was 0.21% (±0.02 S.E.M.) greater than for frightened. The region-by-emotion interaction is represented in Fig. 4.

We also performed post hoc comparisons to further understand the sex-by-emotion-by-hemisphere interaction. In men, angry faces elicited more activation than did happy faces in both the left and right hemispheres. Sad faces caused more activation than happy faces in the left hemisphere, and angry faces showed more activation than sad faces in the right hemisphere. These differences were typically on the order of 0.2%. Women showed no significant differences between the emotions in either the right or the left hemisphere.

The results of the analyses performed on each brain region separately were consistent with the results above, in that the inferior frontal gyrus and the medial frontal region each showed a significant main effect of emotion (F(3,49.2) = 4.11, P = 0.01 and F(3,72.5) = 2.80, P = 0.04,

Fig. 2. Group mean fractional signal change maps (21 subjects) are shown for the neutral versus scrambled contrast and for each of the four emotion conditions in comparison to neutral. Colored pixels are those activated at a threshold of P < 0.001 and are overlaid on axial group mean anatomy images. The right side of the images corresponds to the left side of the brain, and images are in the coordinate space of Talairach and Tournoux [68]. Axial sections from left to right correspond to z = -12, z = 0, z = 12, and z = 24 mm. In all figures, the group mean signal change maps were smoothed using an isotropic Gaussian blur with σ = 2 mm for the purpose of display.
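The display smoothing described in the caption is a separable Gaussian convolution. A self-contained sketch along one image axis (the helper names, truncation radius, and toy row are our own; applying the same kernel along each axis in turn extends this to 2-D or 3-D maps):

```python
from math import exp

def gaussian_kernel(sigma_mm, voxel_mm=1.0, radius=3):
    """Normalized 1-D Gaussian kernel sampled at voxel centers,
    truncated at `radius` voxels on each side."""
    xs = range(-radius, radius + 1)
    w = [exp(-0.5 * (x * voxel_mm / sigma_mm) ** 2) for x in xs]
    total = sum(w)
    return [v / total for v in w]

def smooth_1d(signal, kernel):
    """Convolve a row of a signal-change map with the kernel,
    clamping at the edges so no intensity is lost at the borders."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - r, 0), len(signal) - 1)
            acc += w * signal[j]
        out.append(acc)
    return out

k = gaussian_kernel(sigma_mm=2.0)
row = [0, 0, 0, 1.0, 0, 0, 0]   # a single activated pixel
print([round(v, 3) for v in smooth_1d(row, k)])
```

Because the kernel is normalized, the blur spreads a peak over its neighbors without changing the total signal, which is why such smoothing is acceptable for display while statistics are computed on the unsmoothed maps.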


M.L. Kesler/West et al. / Cognitive Brain Research 11 (2001) 213–226


Fig. 3. Rendering of sites of activation determined by the cluster analyses; left and right cortical and inferomedial surfaces are shown (adapted from Duvernoy [24]). Green, neutral versus scrambled; red, angry; purple, frightened; yellow, happy; blue, sad versus neutral.

Table 3a
Anatomical locations and characteristics of clusters showing more brain activation in an emotion condition than in the neutral face condition

Region name                       Brodmann area   Peak location (mm)   Peak t value   Mean % signal      Cluster size
                                                   x     y     z                      change at peak     (mm^3)

Angry
Right fusiform gyrus              19               41   -73   -12      6.3            1.3                480
                                  37/19            37   -61   -14      5.3            0.31               224
Right lateral occipital gyrus     18               33   -83    12      4.6            0.34               176
Left inferior frontal gyrus       45/47           -45    23    -4      4.6            0.32               120
Left fusiform gyrus               37/19           -39   -61   -12      4.1            0.23                72
Left precentral sulcus            44/6            -45     5    20      4.6            0.20                64
Superior frontal gyrus (medial)   9                 5    53    26      4.4            0.32                64

Frightened
Left inferior frontal gyrus       47              -37    21   -14      4.7            0.54               184

Happy
Medial frontal/cingulate sulcus   32/10             1    49     0      4.5            0.88               384

Sad
Left fusiform gyrus               19              -27   -69   -12      4.1            0.29                72

Null
No clusters found

a Voxels with a t value > 3.4 (P < 0.001, uncorrected) were included in the cluster analysis. Locations are in the coordinate system of Talairach and Tournoux [68]. The corresponding Brodmann areas are approximate.
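The cluster sizes in Table 3a are all multiples of 8 mm³, consistent with counting contiguous 2-mm isotropic voxels that survive the t > 3.4 threshold. A toy sketch of that threshold-then-connected-components step (the data structure, function, and example values are ours, not the authors' actual analysis pipeline):

```python
from collections import deque

def clusters(tmap, threshold=3.4, voxel_mm3=8):
    """Group face-connected suprathreshold voxels and return
    cluster sizes in mm^3, largest first.

    tmap: dict mapping (x, y, z) voxel indices to t values.
    """
    above = {v for v, t in tmap.items() if t > threshold}
    sizes = []
    while above:
        seed = above.pop()
        queue, size = deque([seed]), 1
        while queue:                      # breadth-first flood fill
            x, y, z = queue.popleft()
            for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0),
                               (0,-1,0), (0,0,1), (0,0,-1)):
                nb = (x + dx, y + dy, z + dz)
                if nb in above:
                    above.remove(nb)
                    queue.append(nb)
                    size += 1
        sizes.append(size * voxel_mm3)
    return sorted(sizes, reverse=True)

# Toy t-map: a 3-voxel cluster, one isolated suprathreshold voxel,
# and one subthreshold voxel
toy = {(0,0,0): 4.1, (1,0,0): 5.0, (1,1,0): 3.9, (5,5,5): 3.6, (9,9,9): 1.0}
print(clusters(toy))   # → [24, 8]
```

A minimum-cluster-size criterion (here, the smallest reported cluster is 64 mm³, i.e., 8 voxels) can then be applied to the returned sizes to discard isolated suprathreshold voxels.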


Fig. 4. Mean percent signal change for each emotion condition in the regions of interest. The data from the left and right homologous ROIs are combined in this figure, since few statistically significant differences between the hemispheres were seen. Column height indicates the least-squares mean of all subjects, and bars indicate the standard error of the mean (S.E.M.). In the inferior frontal gyrus region, the frightened condition showed significantly greater activation than did the happy condition (t(292) = 2.61, P = 0.009) or the sad condition (t(292) = 2.29, P = 0.02). In the medial frontal region, the angry condition showed significantly greater activation than did the frightened condition (t(225) = 2.47, P = 0.014). For the regions examined, the following is a list of the Talairach coordinate locations (x, y, z) of the peaks on the overall group image: left and right fusiform gyri (38, 58, -15 and -41, 51, -16), left and right inferior frontal gyri (41, -17, -16 and -43, -19, -14), left and right precentral sulci (51, -11, 28 and -45, -7, 24), left and right medial regions of the superior frontal gyri (9, -53, 24 and -5, -51, 26), and left and right superior temporal sulci (53, 41, 4 and -49, 33, 4).

respectively). One additional effect, not seen in the previous analysis, was a main effect for hemisphere in the superior temporal sulcus (F(1,76) = 7.28, P = 0.009); activation on the right was greater than on the left. No other ROI showed a significant main effect of hemisphere or a hemisphere-by-emotion interaction.

4. Discussion

This study was undertaken to identify the brain regions that underlie the conscious perception of four basic facial emotions. To determine this, we first sought to distinguish regions associated with the processing of emotionally neutral faces. We found that the following areas activated above threshold in comparisons of neutral faces with scrambled images: the right and left fusiform gyri, the right and left amygdalae/entorhinal cortices, the right superior temporal sulcus, the right inferior occipital gyrus, the right and left inferior frontal gyri, the right angular gyrus, and the right and left lingual gyri (see Figs. 2, 3 and Table 2). Many of these regions have been identified in past studies of face processing. In particular, several studies have identified a fusiform face area located in the mid-fusiform gyrus near the occipito-temporal junction [5,18,34,36,39,48,61,62,71]. Our finding of superior temporal sulcus activation is also consistent with previous research [39].

While prior studies have found amygdala activation in response to emotional and particularly fearful facial expressions, our finding of amygdala activation during neutral face processing suggests that the amygdala may be responsive to faces per se. This is consistent with single-cell recording studies in monkeys that have shown face-selective neurons in this region [46,64]. In studies of face perception in monkeys, Rolls [64] reported that populations of neurons in the temporal lobe and amygdala are responsive to differences between faces. It could be that the subtleties in expression that caused participants to misclassify some of the neutral faces as expressing an emotion also activated regions devoted to discerning differences between faces.

In what follows, we discuss regions that showed greater activation during the processing of specific facial emotions than in the processing of neutral faces. In many but not all cases, these areas overlap with regions activated in the neutral versus scrambled comparison.


4.1. Activation in the sad condition

We found one small cluster of activation in the left fusiform gyrus in the comparison of sad with neutral. The only other emotion that elicited significant left fusiform activation in comparison with neutral face processing was anger. However, activity in the fusiform gyri appeared to be a feature common to all four emotions when the threshold was lowered somewhat (to t = 2.6, P ≈ 0.01). Our ROI analysis did not find evidence for differences between the emotions in this region, perhaps because this analysis focused on the height of peak activations and not on the spatial spread of activations. The relative lack of activation in general in our sad condition parallels the results of Phillips et al. [57], who found no activation associated with sad facial expressions. In our experiment, participants rated sad expressions as less intense and categorized them less accurately than the other emotions. This may have contributed to the results.

4.2. Activation in the frightened condition

In comparing frightened to neutral faces, we found activation in the left inferior frontal gyrus (Brodmann area 47). This region was also active in the neutral versus scrambled and angry versus neutral comparisons. Our ROI analysis showed that this region is significantly more engaged when processing fearful expressions than when processing happy or sad expressions (see Fig. 4). This finding is consistent with data suggesting that the inferior frontal gyrus interacts with the amygdala [14,21,49,51]. We expected to find amygdala activation in response to frightened faces, but our subjects were exposed for a relatively long time (four 36-s blocks). Past research suggests that the amygdala rapidly habituates to emotionally charged displays [14]. This result may also reflect the fact that we found amygdala activation in the neutral versus scrambled comparison, netting no activation differences in the ‘cognitive’ subtraction.

Previous studies have associated the inferior frontal region with emotional face processing [23,30,58]. Sprengelmeyer et al. [67] found left inferior frontal activity (within about 1 cm of our inferior frontal clusters) in response to frightened, angry, and disgusted faces. The putative homologous region in the macaque brain contains face-selective neurons [56], and these cells receive input from the STS, an area involved in neutral face processing in our study. Sprengelmeyer et al. [67] propose that Brodmann area 47 is an endpoint along the posterior-anterior visual processing stream that serves to integrate behavioral information contained in facial expressions. Taken together, our data and theirs suggest that this region may respond to negative emotions characterized by relatively high levels of arousal.

4.3. Activation in the happy condition

We saw significant activation in the medial frontal/cingulate sulcus region (Brodmann areas 32/10) in the happy versus neutral face comparison. At the threshold used, this area did not show activation in response to other emotions or in the neutral versus scrambled comparison. The peak location of this cluster was inferior to the medial frontal peak location seen in the angry condition. Previous investigators have found medial frontal or cingulate gyrus activity in response to happy faces [23,57], but others have found medial frontal activity (areas 24, 32) caused by differing facial emotions [8,32,49,59]. Since this area was not included in our ROI analysis because of search-area overlap with the medial superior frontal gyrus region, our data do not address the issue of differential activation by the emotions. We speculate that medial frontal activity may occur in situations in which happy expressions are overtly processed. In support of this, Phillips et al. [57] found anterior cingulate activation in association with an emotional versus neutral face discrimination task. Because a high percentage of our subjects reported feeling mildly happy when viewing happy faces, the observed activation may have been an experiential effect. This is consistent with the results of Lane et al. [41], who found that the anterior cingulate is involved in representing subjective emotional response.

4.4. Activation in the angry condition

Angry faces caused the most widespread activation pattern of the four emotions. This effect may have been the result of heightened response by men, but further research is needed to confirm this. Regions that generated activity over and above that observed during neutral face processing include the right and left fusiform gyri and the left inferior frontal gyrus. Angry face perception also engaged the left precentral sulcus, the medial portion of the superior frontal gyrus, and the right lateral occipital gyrus. These regions did not reach threshold in neutral and scrambled comparisons, but the left precentral sulcus and the right lateral occipital gyrus are immediately adjacent to areas that did, suggesting that neutral and angry face processing may be topographically contiguous.

Although we found lateral occipital gyrus and medial frontal activation solely during angry versus neutral face contrasts, it would be problematic to conclude that these areas are selectively involved in angry face perception. For example, several studies have implicated the lateral occipital gyrus in face and object recognition, and Gauthier et al. [29] postulate that it performs both a top-down and bottom-up analysis of shape. Lang et al. [44] found right and left lateral occipital activation when subjects viewed pleasant and unpleasant versus neutral pictures. Imaizumi


et al. [37] found activation in the right lateral occipital gyrus (approximately 9 mm from our peak location) when subjects were asked to identify the emotional intonation of spoken words, as compared with identifying the speaker. Taken together, these data suggest that the lateral occipital gyrus may be important in deciphering the significance of stimuli presented in both visual and auditory modalities.

There is also evidence that the medial frontal region is involved in a variety of emotion-related operations [31,42,43]. Teasdale et al. [69] hypothesize that this region is associated with the processing of affect-laden meanings generally and not with particular types of emotion. The results of our ROI analysis modify this postulate by suggesting that this region may be more active in processing angry faces than frightened faces (see Fig. 4).

Few neuroimaging studies have examined angry face perception specifically. Consistent with our results, Sprengelmeyer et al. [67] found left inferior frontal activity (area 47) in response to angry faces. Blair et al. [8] found activity in area 47 of the right hemisphere and in area 32 (anterior cingulate) in response to angry faces of increasing intensity. Although in our study angry faces were not rated as more intense than any of the other emotions except sad (see Table 1), we cannot rule out the possibility that angry faces were more arousing.

4.5. The fusiform face area

We found fusiform activation in the neutral versus scrambled, angry versus neutral, and sad versus neutral comparisons. Neither frightened nor happy versus neutral contrasts evoked fusiform activation above threshold. Gauthier et al. [29] have found that mid-fusiform activation may be related to expertise in recognizing distinctions between subordinates in a class, rather than to face processing per se. When applied to our data, this implies that angry and sad faces require greater expertise to process than do the other emotions. However, post-experimental behavioral testing showed that both angry and happy face perception were characterized by a high degree of accuracy while sad faces were less readily identified (see Table 1). If activation of the fusiform gyrus is related to the development of expertise in categorizing novel visual stimuli, as evidenced by a higher percentage of subjects recognizing the emotion and contributing to the effect, then we should have witnessed comparable fusiform activation during happy versus neutral face processing. Conversely, if greater difficulty in classifying a facial emotion increases fusiform activation, then one would not expect angry faces to cause significant activation in this region.

Breiter et al. [14] found that both frightened and happy expressions elicited increased fusiform activation in comparisons with neutral faces. They suggested that fusiform activation may result from increased attention to emotional aspects of the stimuli via back projections from other regions, such as the amygdala and portions of the temporal lobe. This explanation seems a more parsimonious one, given our results.

4.6. Lateralization of responses

In support of our laterality hypothesis, the right fusiform gyrus, the right superior temporal sulcus, the right middle frontal gyrus, and the right angular gyrus were more engaged than homologous regions in the left hemisphere during neutral face processing (see Table 2). Moreover, fusiform and lateral occipital regions were more involved on the right in the angry condition (Table 3). Contrary to our laterality hypothesis, inferior frontal regions were more engaged in the left hemisphere. This was not associated with viewing positive emotion, as the valence hypothesis of lateralization of emotion would predict [66]. However, making confident judgments about lateralization without statistical tests is problematic. Therefore, we included ‘hemisphere’ as a factor in the ROI analysis. The superior temporal sulcus was the only region of those examined to show a significant effect of hemisphere (R > L).

4.7. Sex differences in emotion processing

We found greater activation in the right and left hemispheres of men while they viewed angry faces than while they viewed happy faces. We also found that men’s right hemispheres showed greater activation during angry face perception than during sad face perception and that their left hemispheres showed greater activation to sad faces than to happy faces. No such differences between the emotions in either hemisphere existed among women. Other studies have shown that women tend to be more accurate at decoding basic facial emotions than men, with the possible exception of anger [15]. Our results suggest that men may recruit more neurons when processing agonistic displays than they recruit when processing affiliative emotions. This would make sense, given selective pressures for males to recognize potentially threatening individuals and to defend territory.

4.8. Methodological considerations

Several factors may influence our effectiveness in comparing the emotions. First, we presented a single emotion in each EPI run and contrasted the emotional condition to the neutral condition. Presenting two emotions within the same run may increase the power to detect differences specifically between those two emotions, since the variance would not be inflated by fluctuations in the response during the baseline condition of viewing emotionally neutral faces. Second, many factors affect the observed signal-to-noise ratio and influence our ability to compare and contrast the emotions. For example, habituation to an emotion over the course of a run may decrease the percent signal change associated with the emotion presented. It is also possible that each emotion-specific run influenced the processing of neutral expressions differentially. While such effects are possible, the primary task consisted of affect perception, judgment, and knowledge rather than mood induction. As such, we expect that these effects, if present, were weak.

From a methodological perspective, it is noteworthy that 39% of faces that had been validated as neutral in our laboratory and by prior investigators (Ekman and Friesen [25], Bowers et al. [13]) were classified as expressing an emotion in post-fMRI behavioral testing (see Table 1). While it is possible that these ambiguous neutrals may have influenced the activation pattern observed using the subtraction technique, we consider this unlikely given that the mean intensity levels of these misclassified expressions were at least eight standard deviations below the mean intensity ratings of the intended angry, frightened, happy, and sad faces. Furthermore, the distribution of the 39% of ambiguous neutrals within and between the different runs would have been arbitrary and therefore unlikely to have influenced our results in any systematic way.

4.9. Conclusion

One of the strengths of neuroimaging studies of facial emotion perception is that they allow investigators to ‘subtract’ the effects of neutral face processing from those of emotional face processing. From our contrast of neutral with scrambled images, we learned that multiple brain regions are involved in processing the complex features of human faces. Furthermore, in a subset of these brain regions, the perception of an emotion on the face generates increased activity. It is in the processing of faces in general that we found indications of right hemisphere specialization, particularly in the superior temporal sulcus, although there is some suggestion that angry faces may preferentially engage right hemisphere regions.

We also showed that emotional facial expressions engage regions not activated by neutral faces. These include the anterior cingulate sulcus (happy faces) and a medial region of the superior frontal gyrus (angry faces). Other studies have implicated these medial frontal regions in the processing of several emotions. Last, we found that men have a differential neural response depending upon the emotion presented. Future studies will investigate further the effects of sex and emotion category on patterns of brain activation. Additional experiments are also needed to determine whether differences in the degree of arousal triggered by the emotions contribute to variations in neural response.

Acknowledgements

These studies were supported by NSF Grant IBN-9604231. We thank Robin Avison, Sherry C. Williams, Aileen Wiglesworth, Xia Wang, Derek Mace, and Marta Mendiondo for their technical assistance.

References

[1] R. Adolphs, H. Damasio, D. Tranel, G. Cooper, A.R. Damasio, A role for somatosensory cortices in the visual recognition of emotion revealed by three-dimensional lesion mapping, J. Neurosci. 20 (2000) 2683–2690.
[2] R. Adolphs, H. Damasio, D. Tranel, A.R. Damasio, Cortical systems for the recognition of emotion in facial expressions, J. Neurosci. 16 (1996) 7678–7687.
[3] R. Adolphs, D. Tranel, H. Damasio, A. Damasio, Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala, Nature 372 (1994) 669–672.
[4] R. Adolphs, D. Tranel, H. Damasio, A.R. Damasio, Fear and the human amygdala, J. Neurosci. 15 (1995) 5879–5891.
[5] T. Allison, H. Ginter, G. McCarthy, A.C. Nobre, A. Puce, A. Belger, Face recognition in human extrastriate cortex, J. Neurophysiol. 71 (1994) 821–825.
[6] A.A. Baird, S.A. Gruber, D.A. Fein, L.C. Maas, R.J. Steingard, P.F. Renshaw, B.M. Cohen, D.A. Yurgelun-Todd, Functional magnetic resonance imaging of facial affect recognition in children and adolescents, J. Am. Acad. Child Adolesc. Psychiatry 38 (1999) 195–199.
[7] P.A. Bandettini, A. Jesmanowicz, E.C. Wong, J.S. Hyde, Processing strategies for time-course data sets in functional MRI of the human brain, Magn. Reson. Med. 30 (1993) 161–173.
[8] R.J.R. Blair, J.S. Morris, C.D. Frith, D.I. Perrett, R.J. Dolan, Dissociable neural responses to facial expressions of sadness and anger, Brain 122 (1999) 883–893.
[9] A.M. Blamire, S. Ogawa, K. Ugurbil, D. Rothman, G. McCarthy, J.M. Ellerman, F. Hyder, Z. Rattner, R.G. Shulman, Dynamic mapping of the human visual cortex by high-speed magnetic resonance imaging, Proc. Natl. Acad. Sci. USA 89 (1992) 11069–11073.
[10] L.X. Blonder, D. Bowers, K.M. Heilman, The role of the right hemisphere in emotional communication, Brain 114 (1991) 1115–1127.
[11] J.C. Borod, E. Koff, M.P. Lorch, M. Nicholas, Channels of emotional expression in patients with unilateral brain damage, Arch. Neurol. 42 (1985) 345–348.
[12] D. Bowers, R. Bauer, H. Coslett, K. Heilman, Dissociation between the processing of affective and nonaffective faces in patients with unilateral brain lesions, Brain Cogn. 4 (1985) 258–272.
[13] D. Bowers, L.X. Blonder, K.M. Heilman, The Florida Affect Battery, Department of Neurology, Gainesville, FL, 1991.
[14] H.C. Breiter, N.L. Etcoff, P.J. Whalen, W.A. Kennedy, S.L. Rauch, R.L. Buckner, M.M. Strauss, S.E. Hyman, B.R. Rosen, Response and habituation of the human amygdala during visual processing of facial expression, Neuron 17 (1996) 875–887.
[15] L.R. Brody, J.A. Hall, Gender and emotion, in: M. Lewis, J.M. Haviland (Eds.), Handbook of Emotions, Guilford, New York, 1993, pp. 447–460.
[16] A.E.B. Cancelliere, A. Kertesz, Lesion localization in acquired deficits of emotional expression and comprehension, Brain Cogn. 13 (1990) 133–147.
[17] A.J. Calder, A.W. Young, D. Rowland, D. Perrett, J.R. Hodges, N.L. Etcoff, Facial emotion recognition after bilateral amygdala damage:

differentially severe impairment of fear, Cognit. Neuropsychol. 13 (1996) 699–745.
[18] S.M. Courtney, L.G. Ungerleider, K. Keil, J.V. Haxby, Object and spatial visual working memory activate separate neural systems in human cortex, Cereb. Cortex 6 (1996) 39–49.
[19] R.W. Cox, AFNI: Software for analysis and visualization of functional magnetic resonance neuroimages, Comput. Biomed. Res. 29 (1996) 162–173.
[20] H. Critchley, E. Daly, M. Phillips, M. Brammer, E. Bullmore, S. Williams, T. Van Amelsvoort, D. Robertson, A. David, D. Murphy, Explicit and implicit neural mechanisms for processing of social information from facial expressions: a functional magnetic resonance imaging study, Hum. Brain Mapp. 9 (2000) 93–105.
[21] R.J. Davidson, S.K. Sutton, Affective neuroscience: the emergence of a discipline, Curr. Opin. Neurobiol. 5 (1995) 217–224.
[22] S. DeKosky, K.M. Heilman, D. Bowers, E. Valenstein, Recognition and discrimination of emotional faces and pictures, Brain Lang. 9 (1980) 206–214.
[23] R.J. Dolan, P. Fletcher, J. Morris, N. Kapur, J.F.W. Deakin, C.D. Frith, Neural activation during covert processing of positive emotional facial expressions, Neuroimage 4 (1996) 194–200.
[24] H. Duvernoy, The Human Brain, Springer, New York, 1991.
[25] P. Ekman, W.V. Friesen, Unmasking the Face, Prentice-Hall, Englewood Cliffs, NJ, 1975.
[26] N.L. Etcoff, Perceptual and conceptual organization of facial emotions: hemispheric differences, Brain Cogn. 3 (1984) 385–412.
[27] I. Fried, C. Mateer, G. Ojemann, R. Wohns, P. Fedio, Organization and visuospatial functions in human cortex: evidence from electrical stimulation, Brain 105 (1982) 349–371.
[28] K.J. Friston, J. Ashburner, J.B. Poline, C.D. Frith, J.D. Heather, R.S.J. Frackowiak, Spatial registration and normalization of images, Hum. Brain Mapp. 2 (1995) 165–189.
[29] I. Gauthier, M.J. Tarr, A.W. Anderson, P. Skudlarski, J.C. Gore, Activation of the middle fusiform ‘face area’ increases with expertise in recognizing novel objects, Nat. Neurosci. 2 (1999) 568–573.
[30] M.S. George, T.A. Ketter, D.S. Gill, J.V. Haxby, L.G. Ungerleider, P. Herscovitch, R.M. Post, Brain regions involved in recognizing facial emotion or identity: an oxygen-15 PET study, J. Neuropsychiatry 5 (1993) 384–394.
[31] M.S. George, T.A. Ketter, P.I. Parekh, P. Herscovitch, R.M. Post, Gender differences during transient self-induced sadness or happiness, Biol. Psychiatry 40 (1996) 859–871.
[32] M.S. George, T.A. Ketter, P.I. Parekh, B. Horwitz, P. Herscovitch, R.M. Post, Brain activity during transient sadness and happiness in healthy women, Am. J. Psychiatry 152 (1995) 341–351.
[33] R.C. Gur, B.E. Skolnick, R.E. Gur, Effects of emotional discrimination tasks on cerebral blood flow: regional activation and its relation to performance, Brain Cogn. 25 (1994) 271–286.
[34] E. Halgren, A.M. Dale, M.I. Sereno, R.B.H. Tootell, K. Marinkovic, B.R. Rosen, Location of human face-selective cortex with respect to retinotopic areas, Hum. Brain Mapp. 7 (1999) 29–37.
[35] C.R. Hamilton, B. Vermeire, Complementary hemispheric specialization in monkeys, Science 242 (1988) 1691–1694.
[36] J.V. Haxby, C.L. Grady, B. Horwitz, L.G. Ungerleider, M. Mishkin, R.E. Carson, P. Herscovitch, M.B. Schapiro, S.I. Rapoport, Dissociation of object and spatial visual processing pathways in human extrastriate cortex, Proc. Natl. Acad. Sci. USA 88 (1991) 1621–1625.
[37] S. Imaizumi, K. Mori, S. Kiritani, R. Kawashima, M. Sugiura, H. Fukuda, K. Itoh, T. Kato, A. Nakamura, K. Hatano, S. Kojima, K. Nakamura, Vocal identification of speaker and emotion activates different brain regions, Neuroreport 8 (1997) 2809–2812.
[38] C. Izard, S. Buechler, Aspects of consciousness and personality in terms of differential emotions theory, in: R. Plutchik, H. Kellerman (Eds.), Theories of Emotion, Emotion: Theory, Research and Experience, Vol. 1, Academic Press, New York, 1980, pp. 165–187.
[39] N. Kanwisher, J. McDermott, M.M. Chun, The fusiform face area: a module in human extrastriate cortex specialized for face perception, J. Neurosci. 17 (1997) 4302–4311.
[40] E.S. Keeping, Introduction to Statistical Inference, Dover, New York, 1995.
[41] R.D. Lane, G.R. Fink, P.M.L. Chau, R.J. Dolan, Neural activation during selective attention to subjective emotional responses, Neuroreport 8 (1997) 3969–3972.
[42] R.D. Lane, E.M. Reiman, G.L. Ahern, G.E. Schwartz, R.J. Davidson, Neuroanatomical correlates of happiness, sadness, and disgust, Am. J. Psychiatry 157 (1997) 926–933.
[43] R.D. Lane, E.M. Reiman, M.M. Bradley, P.J. Lang, G.I. Ahern, R.J. Davidson, G.E. Schwartz, Neuroanatomical correlates of pleasant and unpleasant emotion, Neuropsychologia 35 (1997) 1437–1444.
[44] P.J. Lang, M.M. Bradley, J.R. Fitzsimmons, B.N. Cuthbert, J.D. Scott, B. Moulder, V. Nangia, Emotional arousal and activation of the visual cortex: an fMRI analysis, Psychophysiology 35 (1998) 199–210.
[45] J.E. LeDoux, Emotion: clues from the brain, Annu. Rev. Psychol. 46 (1995) 209–235.
[46] C.M. Leonard, E.T. Rolls, F.A.W. Wilson, G.C. Baylis, Neurons in the amygdala of the monkey with responses selective for faces, Behav. Brain Res. 15 (1985) 159–176.
[47] R. Ley, M. Bryden, Hemispheric differences in recognizing faces and emotions, Brain Lang. 7 (1979) 127–138.
[48] G. McCarthy, A. Puce, J.C. Gore, T. Allison, Face-specific processing in the human fusiform gyrus, J. Cogn. Neurosci. 9 (1997) 605–610.
[49] J.S. Morris, K.J. Friston, C. Buchel, C.D. Frith, A.W. Young, A.J. Calder, R.J. Dolan, A neuromodulatory role for the human amygdala in processing emotional facial expressions, Brain 121 (1998) 47–57.
[50] J.S. Morris, K.J. Friston, R.J. Dolan, Neural responses to salient visual stimuli, Proc. R. Soc. Lond. B 264 (1997) 769–775.
[51] J.S. Morris, C.D. Frith, D.I. Perrett, D. Rowland, A.W. Young, A.J. Calder, R.J. Dolan, A differential neural response in the human amygdala to fearful and happy facial expressions, Nature 383 (1996) 812–815.
[52] J.S. Morris, A. Ohman, R.J. Dolan, A subcortical pathway to the right amygdala mediating ‘unseen’ fear, Proc. Natl. Acad. Sci. USA 96 (1999) 1680–1685.
[53] R.D. Morris, W.D. Hopkins, Perception of human chimeric faces by chimpanzees: evidence for a right hemisphere advantage, Brain Cogn. 21 (1993) 111–122.
[54] M. Natale, R.E. Gur, R.C. Gur, Hemispheric asymmetries in processing emotional expressions, Neuropsychologia 21 (1983) 555–565.
[55] J.G. Ojemann, G.A. Ojemann, E. Lettich, Neuronal activity related to faces and matching in the human right nondominant temporal cortex, Brain 115 (1992) 1–13.
[56] S.P. O’Scalaidhe, F.A.W. Wilson, P.S. Goldman-Rakic, Areal segregation of face-processing neurons in prefrontal cortex, Science 278 (1997) 1135–1138.
[57] M.L. Phillips, E.T. Bullmore, R. Howard, P.W.R. Woodruff, I.C. Wright, S.C.R. Williams, A. Simmons, C. Andrew, M. Brammer, A.S. David, Investigations of facial recognition memory and happy and sad facial expression perception: an fMRI study, Psychiatry Res. Neuroimaging 83 (1998) 127–138.
[58] M.L. Phillips, A.W. Young, S.K. Scott, A.J. Calder, C. Andrew, V. Giampietro, S.C.R. Williams, E.T. Bullmore, M. Brammer, J.A. Gray, Neural responses to facial and vocal expressions of fear and disgust, Proc. R. Soc. Lond. B 265 (1998) 1809–1817.
[59] M.L. Phillips, A.W. Young, C. Senior, M. Brammer, C. Andrew, A.J. Calder, E.T. Bullmore, D.I. Perrett, D. Rowland, S.C.R. Williams, J.A. Gray, A.S. David, A specific neural substrate for perceiving facial expressions of disgust, Nature 389 (1997) 495–498.
[60] S. Pinker, On language (Interview), J. Cogn. Neurosci. 6 (1994) 91–96.
[61] A. Puce, T. Allison, M. Asgari, J.C. Gore, G. McCarthy, Differential sensitivity of human visual cortex to faces, letterstrings, and textures: a functional magnetic resonance imaging study, J. Neurosci. 16 (1996) 5205–5215.
[62] A. Puce, T. Allison, J.C. Gore, G. McCarthy, Face-sensitive regions in human extrastriate cortex studied by functional MRI, J. Neurophysiol. 74 (1995) 1192–1199.
[63] S.Z. Rapcsak, J.F. Comer, A.B. Rubens, Anomia for facial expressions: neuropsychological mechanisms and anatomical correlates, Brain Lang. 45 (1993) 233–252.
[64] E.T. Rolls, Neurons in the cortex of the temporal lobe and in the amygdala of the monkey with responses selective for faces, Hum. Neurobiol. 3 (1984) 209–222.
[66] E.K. Silberman, H. Weingartner, Hemispheric lateralization of functions related to emotion, Brain Cogn. 5 (1986) 322–353.
[67] R. Sprengelmeyer, M. Rausch, U.T. Eysel, H. Przuntek, Neural structures associated with recognition of facial expressions of basic emotions, Proc. R. Soc. Lond. B 265 (1998) 1927–1931.
[68] J. Talairach, P. Tournoux, Co-Planar Stereotaxic Atlas of the Human Brain, Thieme Medical, New York, 1988.
[69] J.D. Teasdale, R.J. Howard, S.G. Cox, Y. Ha, M.J. Brammer, S.C. Williams, S.A. Checkley, Functional MRI study of the cognitive generation of affect, Am. J. Psychiatry 156 (1999) 209–215.
[70] P.J. Whalen, S.L. Rauch, N.L. Etcoff, S.C. McInerney, M.B. Lee, M.A. Jenike, Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge, J. Neurosci. 18 (1998) 411–418.
[71] E. Wojciulik, N. Kanwisher, J. Driver, Covert visual attention modulates face-specific activity in the human fusiform gyrus: fMRI study, J. Neurophysiol. 79 (1998) 1574–1578.
[72] A.W. Young, J.P. Aggleton, D.J. Hellawell, M. Johnson, P. Broks, J.R. Hanley, Face processing impairments after amygdalotomy, Brain 118 (1995) 15–24.