
Neuropsychologia 50 (2012) 2887–2896


Implicit emotional processing in peripheral vision: Behavioral and neural evidence

Simon Rigoulot a,b,*,1, Fabien D'Hondt a,b,d,2, Jacques Honoré a,b,c, Henrique Sequeira a,b,d,3

a Université de Lille Nord de France, F-59000, France
b Neurosciences Fonctionnelles et Pathologies, Université de Lille 2, F-59000, France
c Centre National de la Recherche Scientifique (CNRS), F-59000, France
d Neurosciences, Université de Lille 1, F-59000, France

* Correspondence to: Laboratoire de Neurosciences Fonctionnelles et Pathologies, Hôpital R. Salengro, Service EFV, CHRU de Lille, 59037 Lille Cedex, France.
E-mail addresses: [email protected] (S. Rigoulot), [email protected] (F. D'Hondt), [email protected] (J. Honoré), [email protected] (H. Sequeira).
1 Currently at: McGill University, School of Communication Sciences and Disorders, 1266 avenue des Pins Ouest, H3G 1A8, Montreal, QC, Canada; http://www.mcgill.ca/pell_lab/. Tel.: +1 514 398 8496; fax: +1 514 398 8123.
2 Tel.: +33 (0)3 20 44 62 81; fax: +33 (0)3 20 44 67 32.
3 Tel.: +33 (0)3 20 43 69 29.

ARTICLE INFO

Article history:

Received 8 February 2012

Received in revised form 1 August 2012

Accepted 19 August 2012

Available online 28 August 2012

Keywords:

Emotion

Peripheral vision

Facial expression

Implicit

Event-related potentials


ABSTRACT

Emotional facial expressions (EFE) are efficiently processed when both attention and gaze are focused on them. However, what kind of processing persists when EFE are neither the target of attention nor of gaze remains largely unknown. Consequently, in this experiment we investigated whether the implicit processing of faces displayed in far periphery could still be modulated by their emotional expression. Happy, fearful and neutral faces appeared randomly for 300 ms at four peripheral locations of a panoramic screen (15° and 30° in the right and left visual fields). Reaction times and electrophysiological responses were recorded from 32 participants who had to categorize these faces according to their gender. A decrease of behavioral performance was specifically found for happy and fearful faces, probably because emotional content was automatically processed and interfered with information necessary to the task. A spatio-temporal principal component analysis of electrophysiological data confirmed an enhancement of early activity in occipito-temporal areas for emotional faces in comparison with neutral ones. Overall, these data show an implicit processing of EFE despite the strong decrease of visual performance with eccentricity. Therefore, the present research suggests that EFE could be automatically detected in peripheral vision, confirming the abilities of humans to process emotional saliency in very impoverished conditions of vision.

© 2012 Elsevier Ltd. All rights reserved.

1. Introduction

Among environmental stimulations, faces have a privileged status, as first shown by Yarbus (1967), who found that ocular fixations were more likely to be directed towards faces than to any other part of a scene. Indeed, in an evolutionary perspective, quick detection and analysis of facial expressions are seen as an important behavioral advantage (e.g., Darwin, 1872), particularly in social communication (Adolphs, 2003). In agreement with the necessity to process such crucial information, the skills of human observers to make reliable judgments about emotional facial expressions (EFE) persist with poorly informative stimulations, such as faint changes in facial expression (Edwards, 1998) or in experimentally degraded conditions of processing, like in attentional blink (De Jong, Koster, van Wees, & Martens, 2009), binocular rivalry (Alpers & Gerdes, 2007) or masking (Morris, Öhman, & Dolan, 1998) paradigms.

Behavioral and imaging studies have explored the specificities of the implicit processing of EFE and have found that even when emotional cues are not relevant for the task, EFE still modulate the behavioral performance (Frühholz, Jellinghaus, & Herrmann, 2011) and the activity of cerebral structures (e.g., Critchley et al., 2000). Despite task instructions, emotional information can automatically capture a portion of the attentional resources (see Fox, Russo, Bowles, & Dutton, 2001 for discussion) and influence performance. In the literature, this influence has been described in two different ways. First, the automatic capture of attention could help the participants in their task and lead to enhanced behavioral performance. This was demonstrated in dot-probe paradigms, when an emotional stimulation in one point of the visual field improved the detection and the identification of a target subsequently appearing in the same location (Armony & Dolan, 2002; Brosch, Sander, & Scherer, 2007; Lipp & Derakshan, 2005). Second, in an opposite way, automatic capture of emotional information could disrupt the processing of information relevant to the task. This is the case in emotional Stroop tasks (Frühholz et al., 2011; Williams, Mathews, & MacLeod, 1996) or when the occurrence of an emotional item delayed responses to the target, which has been interpreted as reflecting a difficulty of disengaging from emotional information (Fox et al., 2001; Koster, Crombez, Verschuere, & De Houwer, 2004; Mathews, Fox, Yiend, & Calder, 2003).

The ability of EFE to mobilize attentional resources was further confirmed by imaging studies using event-related brain potentials (ERP) or functional MRI analyses. Sato, Kochiyama, Yoshikawa, and Matsumura (2001), in a gender discrimination task, found that faces with fearful and happy emotional expressions, compared to faces with neutral expression, elicited a larger negative peak at about 270 ms (N270) over the posterior temporal areas. Vuilleumier, Armony, Driver, and Dolan (2001) observed that the activity of the amygdala, a key structure of emotional processing, was similar both when spatial attention was directed towards fearful faces and when it was not. Using pictures of natural scenes, Carretié, Hinojosa, Martín-Loeches, Mercado, and Tapia (2004) showed emotional modulations of early event-related components (P1, P2 and N2) in an oddball paradigm with neutral, pleasant and unpleasant deviant pictures, while the participants had to focus on the color of the frame surrounding the pictures.

Not only are EFE processed despite the decrease of focused attentional resources, but they also recruit a specific neural network, at least partially different from the one recruited when the processing of EFE is explicit. Critchley et al. (2000) reached this conclusion from fMRI experiments where the participants had to judge the emotional expression of happy, angry and neutral faces or the facial gender of the same faces. In a recent investigation, Van Strien, De Sonneville, and Franken (2010) found a difference in the location of the late positive potential (LPP), as amplitudes were larger at centro-frontal electrodes during valence ratings (explicit processing) and at midline parietal electrodes during gender identification (implicit processing of emotional content).

It is noteworthy that the investigation of the implicit processing of emotional facial expressions relies on studies focused on stimulations projected to the center of the visual field. Yet, most visual events occur in the peripheral visual field and cause saccades which displace targets of interest into central vision, where detailed visual analysis can be performed (Liversedge & Findlay, 2000). This implies that some features are selected in peripheral vision (PV) for saccadic capture. Furthermore, despite the ability of emotional information to automatically capture a portion of attentional resources in implicit conditions, the persistence of this ability in impoverished visual situations such as PV has never been studied. In fact, a few studies investigated emotional processing in PV, but only in explicit conditions (see Rigoulot et al., 2008, 2011). In addition, some studies investigated the implicit processing of EFE at parafoveal locations (around 5° of eccentricity) with dot-probe tasks (e.g., Fox, Derakshan, & Shoker, 2008; Pourtois, Grandjean, Sander, & Vuilleumier, 2004; Pourtois, Dan, Grandjean, Sander, & Vuilleumier, 2005) and showed that threatening faces, but not happy ones, enhanced early electrophysiological components in comparison with neutral ones (C1 and P100; Pourtois et al., 2004, at 4.1° of eccentricity). Another study (Bayle, Hénaff, & Krolak-Salmon, 2009) used magnetoencephalography to investigate the processing of emotional facial expressions. Neutral and fearful faces, very briefly presented at 5° of eccentricity, were followed by target faces without facial expressions and the brain responses after the onset of the first faces were analyzed. The authors observed increased cerebral activations in the amygdala and the fusiform gyrus in response to fearful expressions, in line with the alleged role of these structures in the processing of emotional expressions (Adolphs, 2002).

Altogether, these data suggest that EFE can be processed in parafoveal vision and outside the attentional focus. However, the question of the persistence of an implicit processing of EFE in far periphery, where most visual events first occur, remains unanswered. Here we investigated how EFE influence the neurobehavioral activity of participants involved in a discrimination task in which emotional information is useless. We also ensured that the emotional faces were really outside the central visual field by projecting them in far periphery (up to 30° of eccentricity), because responses evoked by parafoveal stimuli, as in Bayle and colleagues' study, did not differ from those obtained in central vision (see Kaplan, 2004; Martin & Grünert, 2004). Finally, we addressed the question of the ability of happy faces to mobilize attentional resources in PV as efficiently as fearful faces (see Pourtois et al., 2004, 2005).

Our hypothesis is that fearful and happy expressions modulate behavioral and electrophysiological responses to faces appearing in PV, even when the attention of the participants is not directed towards the emotional content of the faces. Accordingly, we projected in peripheral vision fearful, neutral and happy faces whose gender participants had to categorize. The behavioral and cerebral modulations related to the emotional expression of faces were evaluated by means of reaction times and ERP recordings. Considering the rare available electrophysiological data related to peripheral vision, in addition to a baseline-to-peak analysis, an exploratory spatio-temporal principal component analysis (PCA) was chosen to determine the spatial and temporal features of interest in the ERP data. Indeed, PCA techniques have already demonstrated their relevance for exploring ERPs in response to emotional stimuli (Delplanque, Lavoie, Hot, Silvert, & Sequeira, 2004; Delplanque, Silvert, Hot, & Sequeira, 2005; Delplanque, Silvert, Hot, Rigoulot, & Sequeira, 2006; Hot, Saito, Mandai, Kobayashi, & Sequeira, 2006; Kayser & Tenke, 2003; Rigoulot et al., 2008).

2. Methods

2.1. Participants

Two experiments were designed. Sixteen women (all right-handed; mean age: 19.1 ± 3.5 years) were included in the first experiment and 16 others (15 right-handed and 1 left-handed; mean age: 19.9 ± 1.7 years) in the second one. Only women were included in these experiments because an important body of literature suggests that women are more reactive to emotional information than men (Bradley, Codispoti, Sabatinelli, & Lang, 2001; Campanella et al., 2004; Collignon et al., 2010; Larsen & Diener, 1987). They all had normal or corrected-to-normal vision and none of them had a history of neurological or psychiatric disorder, or drug consumption. Prior to the experiment, participants gave their written consent and completed a handedness questionnaire (Hécaen, 1984); they were paid 20 Euros for their participation. The study was approved by the ethics committee of the Centre National de la Recherche Scientifique (CNRS).

2.2. Apparatus and stimuli

Three projectors (SONY CS5, Tokyo, Japan) displayed the stimuli on a panoramic, semi-circular, light-grey (68 cd/m²) screen covering almost the entire visual field (180°). The projectors were connected to a computer devoted to the presentation of the stimuli (Hewlett-Packard Pentium III, 1000 MHz).


In a preliminary experiment, faces from the NimStim (MacBrain Face Stimulus Set; Tottenham et al., 2009) and KDEF (Karolinska Directed Emotional Faces System; Lundqvist, Flykt, & Öhman, 1998) datasets were projected in the central vision of twenty-two female participants (mean age: 22.5 ± 2.0 years). For each face, participants had to report the emotional expression they perceived (free answer) and the certainty of their answer (on a 0 to 10 scale). We selected three sets of 24 faces evaluated as expressing fear (F), happiness (H) or being neutral (N) by at least 70% of the subjects, with a median certainty of at least 7. The sets included the same number of male and female faces. Neutral and fearful faces were projected during the first study, and neutral and happy faces during the second study.
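The selection criteria translate into a compact filter. The following Python sketch (pandas assumed) illustrates the reported rule; the file name and column names are hypothetical, and the reading that certainty is taken over the agreeing raters is our assumption:

```python
import pandas as pd

# Hypothetical ratings table from the preliminary experiment: one row per
# (face, participant) with the freely labelled emotion and a 0-10 certainty.
ratings = pd.read_csv("preliminary_ratings.csv")  # columns: face_id, participant, label, certainty

def select_faces(ratings, target_label, min_agreement=0.70, min_median_certainty=7.0):
    """Keep faces labelled `target_label` by at least 70% of raters, with a
    median certainty of at least 7 among those raters (the paper's criteria)."""
    selected = []
    for face_id, grp in ratings.groupby("face_id"):
        agreement = (grp["label"] == target_label).mean()
        certainty = grp.loc[grp["label"] == target_label, "certainty"].median()
        if agreement >= min_agreement and certainty >= min_median_certainty:
            selected.append(face_id)
    return selected

fearful = select_faces(ratings, "fear")
happy = select_faces(ratings, "happiness")
neutral = select_faces(ratings, "neutral")
```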

The mean luminance and colour saturation (for red, green and blue) were measured for each picture and these values were adjusted for some pictures in order to maintain homogeneity between the three sets (Adobe Photoshop software; see Calvo & Lang, 2005). Separate analyses of variance (ANOVAs) performed on the mean luminance, the colour saturation and the standard deviation of luminance values (i.e. a contrast index) did not reveal any significant difference between the fearful, happy and neutral sets (all Fs(1,46) < 0.57 and ps > 0.40).
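As an illustration of this control, the sketch below recomputes the measures and the pairwise ANOVAs. The folder layout is hypothetical, and a simple RGB-mean luminance proxy stands in for Photoshop's measures; PIL and scipy are assumed:

```python
import glob
import numpy as np
from PIL import Image
from scipy.stats import f_oneway

def picture_stats(path):
    """Mean luminance, mean R/G/B values, and a contrast index
    (standard deviation of luminance) for one picture."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    lum = rgb.mean(axis=2)  # crude luminance proxy, not Photoshop's measure
    return lum.mean(), rgb.mean(axis=(0, 1)), lum.std()

# Hypothetical stimulus folders, one per set of 24 faces.
sets = {name: [picture_stats(p)[0] for p in sorted(glob.glob(f"stimuli/{name}/*.png"))]
        for name in ("fear", "happy", "neutral")}

# Pairwise one-way ANOVAs: two groups of 24 faces each gives F(1,46), as reported.
for a, b in (("fear", "neutral"), ("happy", "neutral"), ("fear", "happy")):
    F, p = f_oneway(sets[a], sets[b])
    print(f"{a} vs. {b}: F(1,46) = {F:.2f}, p = {p:.2f}")
```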

2.3. Procedure

Participants were seated in a chair in front of the panoramic screen in a dimly lit room for both studies. Each picture (640 × 822 pixels; size: 15.6° × 20°; distance: 2.1 m) was projected on the screen at four different spatial positions: −30° and −15° (far and near periphery in the left visual field); +15° and +30° (near and far periphery in the right visual field). For faces centered at 15° of eccentricity, the inner border of the pictures was at 7.2° and the outer border at 22.8°. For faces centered at 30°, the inner border was at 22.2° and the outer border at 37.8° of eccentricity.
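For concreteness, the reported borders follow directly from the pictures' angular half-width (15.6°/2 = 7.8°):

```python
# Inner and outer angular borders of a face of angular width 15.6 degrees
# centred at a given eccentricity; reproduces the values reported above.
def borders(center_deg, width_deg=15.6):
    half = width_deg / 2.0  # 7.8 degrees
    return center_deg - half, center_deg + half

print(borders(15))  # (7.2, 22.8)
print(borders(30))  # (22.2, 37.8)
```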

The whole dataset was split into four blocks of 48 pictures (24 F and 24 N in the first study and 24 H and 24 N in the second one). Each picture was presented once per block and was repeated four times across blocks, in such a way that, at the end of the experiment, all the pictures had appeared at the four spatial positions. The order of presentation of the pictures was counterbalanced to avoid confounding effects of laterality (first presentation of pictures in the right or left visual field) and of eccentricity (first presentation of pictures at near or far eccentricity).

Each trial was set up as follows. Participants had to keep their eyes on a central fixation cross. A face was projected for 300 ms at one of the four spatial positions and had to be categorized according to gender (male or female) with the help of a two-button response box (Fig. 1). The hands and the order of the buttons were counterbalanced across participants. The inter-stimulus interval randomly varied between 2.5 and 4 s. Before the experiment, participants were trained with a small set of different pictures.

2.4. Electroencephalogram (EEG) recording and signal processing

The EEG system was composed of a 128-electrode cap (Electrocap), an amplifier (Advanced Neuro Technology, ANT) and a recording computer. Electroencephalographic activity (EEG, ANT system) was recorded at 63 electrode sites4 of the extended 10–20 system using tin electrodes inserted in an elastic nylon cap (Electrocap International), with an online averaged reference and a forehead ground. Seven additional electrodes were placed: one at the nasion (usual reference) and six others for vertical and horizontal electro-oculographic recordings, two at the outer canthi of the eyes and one above and below each eye. The impedance for all the electrode sites was kept below 5 kΩ. The EEG was continuously digitized at 1024 Hz. After the recording, the bandpass was set between 0.016 and 30 Hz. All trials with artifacts exceeding ±100 µV were excluded from the analysis. During the analysis, every trial was manually checked by the first author with the ASA software (ANT) and any trials with blinks or drifting electrodes in the −100 to 800 ms temporal window were rejected from further analyses. Moreover, all trials containing eye movements after the presentation of the stimulus were rejected to exclude possible saccades towards the location of the visual stimulus (Rayner, 1998). Across all conditions, 9% of the trials were rejected. EEG epochs (−100 to 800 ms) were time-locked to the stimulus onset, baseline corrected (−100 to 0 ms) and averaged offline according to spatial position and emotional facial expression (all processing was done on the average-referenced data).

4 From frontal to occipital locations: Fpz, Fp1, Fp2, AFz, AF3, AF4, Fz, F1, F2, F3, F4, F5, F6, F7, F8, FCz, FC1, FC2, FC3, FC4, FC5, FC6, FT9, FT10, Cz, C1, C2, C3, C4, C5, C6, T7, T8, CPz, CP1, CP2, CP3, CP4, CP5, CP6, TP7, TP8, TP9, TP10, A1, A2, Pz, P1, P2, P3, P4, P5, P6, P7, P8, POz, PO3, PO4, PO7, PO8, Oz, O1, O2.
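A minimal numpy sketch of the automatic part of this pipeline (epoching, baseline correction and the ±100 µV criterion); the manual screening of blinks, drifts and saccades is not reproduced, and the array layout is an assumption:

```python
import numpy as np

FS = 1024  # sampling rate (Hz)

def epoch_and_reject(continuous, onsets_s, tmin=-0.100, tmax=0.800, reject_uv=100.0):
    """Cut epochs around stimulus onsets from a channels x samples array (in
    microvolts), subtract the -100 to 0 ms baseline, and drop any epoch whose
    absolute amplitude exceeds 100 microvolts."""
    n0, n1 = int(tmin * FS), int(tmax * FS)  # -102 and 819 samples
    kept = []
    for onset in (np.asarray(onsets_s) * FS).astype(int):
        ep = continuous[:, onset + n0 : onset + n1]
        ep = ep - ep[:, :-n0].mean(axis=1, keepdims=True)  # baseline: -100 to 0 ms
        if np.abs(ep).max() <= reject_uv:                  # amplitude criterion
            kept.append(ep)
    return np.stack(kept)  # epochs x channels x samples, ready for averaging
```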

Spatio-temporal principal component analysis (st-PCA) consists of two PCAs on EEG waveforms, reducing both the spatial and the temporal dimensionality of the dataset. The details of the method are reported in Spencer, Dien, and Donchin (1999, 2001). In a first step, a spatial PCA was performed with electrode sites as variables and time points, participants and conditions as observations. Spatial PCA identifies groups of electrodes whose activity is highly correlated and reduces them to hypothetical, unobserved dependent variables (Pourtois, Delplanque, Michel, & Vuilleumier, 2008). Each extracted factor can be considered as one specific spatial configuration of brain activation, called a "virtual electrode" by Spencer and coworkers. In a second step, the factor scores of these "virtual electrodes" (i.e. the contribution of a spatial factor to the voltage observed for each original waveform) are considered as "virtual ERPs". The corresponding time series are submitted to a temporal PCA with time points as variables and "virtual electrodes", participants and conditions as observations. Temporal PCA identifies groups of time points which are highly correlated and reduces them to hypothetical, unobserved variables (i.e. temporal factors). In both studies, we chose to select the spatial and temporal factors that accounted for the first 90% of the variance (the first temporal factor was also excluded from further analysis because it is related to the auto-correlation of the data; see Spencer et al., 1999 for more details). The result of the complete spatio-temporal PCA procedure is a set of factor scores used here to compare the activity of "virtual electrodes" at particular times (i.e. for each temporal factor) in response to the different pictures, presented at each visual spatial position. In both experiments, spatial and temporal PCAs were performed with 63 electrodes and 818 temporal points, respectively, as variables and with sixteen participants, four spatial positions and two emotional facial expressions as observations.
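The two-step logic can be sketched as follows with scikit-learn (unrotated PCA; the exact procedure of Spencer et al. may differ in details such as factor rotation, and the array name and shape are assumptions):

```python
import numpy as np
from sklearn.decomposition import PCA

def spatiotemporal_pca(erps: np.ndarray, variance: float = 0.90):
    """erps: participants x conditions x electrodes x timepoints array of
    averaged ERPs (e.g. 16 x 8 x 63 x 818 in each experiment)."""
    n_sub, n_cond, n_el, n_tp = erps.shape

    # Step 1 - spatial PCA: electrodes as variables; time points, participants
    # and conditions as observations. Factors act as "virtual electrodes".
    X = erps.transpose(0, 1, 3, 2).reshape(-1, n_el)
    spatial = PCA(n_components=variance)       # keep factors up to 90% variance
    virtual_erps = spatial.fit_transform(X)    # observations x spatial factors

    # Step 2 - temporal PCA on the "virtual ERPs": time points as variables;
    # virtual electrodes, participants and conditions as observations.
    n_sf = virtual_erps.shape[1]
    Y = (virtual_erps.reshape(n_sub, n_cond, n_tp, n_sf)
                     .transpose(0, 1, 3, 2)
                     .reshape(-1, n_tp))
    temporal = PCA(n_components=variance)
    scores = temporal.fit_transform(Y)         # temporal factor scores for ANOVAs

    # The first temporal factor would be discarded here (auto-correlation of the
    # data) and the remaining scores entered into repeated-measures ANOVAs.
    return spatial, temporal, scores
```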

Moreover, in order to investigate the early occipital activity in response to peripherally displayed faces, we first conducted a baseline-to-peak analysis (seeking the maximum amplitude and the latency of this maximum) on two early ERP components, as eccentricity and emotion effects on these components were consistently reported in previous studies (see Rigoulot et al., 2009, 2011). This analysis was run between 65 and 125 ms for parieto-occipital electrodes (PO7, Oz, PO8) and between 140 and 240 ms for two parieto-temporal electrodes (P7 and P8), corresponding to the onsets of the classical components evoked by faces when they are displayed in central vision, the P100 and N170. These time windows and electrodes were defined after examination of the grand average of the data and were equivalent to those used in other studies (Ashley et al., 2004; Batty & Taylor, 2003).
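In sketch form (sampling rate and windows from the text; the assumption is an epoch starting 100 ms before stimulus onset):

```python
import numpy as np

FS = 1024       # sampling rate (Hz)
T0_MS = -100.0  # epoch start relative to stimulus onset (ms)

def peak_in_window(erp, start_ms, end_ms, polarity=+1):
    """Baseline-to-peak measure on one averaged waveform: amplitude and latency
    of the extremum inside a window, e.g. 65-125 ms for the P100 (polarity=+1)
    or 140-240 ms for the N170 (polarity=-1, i.e. the most negative point)."""
    i0 = int((start_ms - T0_MS) / 1000.0 * FS)
    i1 = int((end_ms - T0_MS) / 1000.0 * FS)
    i_peak = i0 + int(np.argmax(polarity * erp[i0:i1]))
    return erp[i_peak], i_peak / FS * 1000.0 + T0_MS  # (amplitude, latency in ms)
```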

Fig. 1. Panoramic screen and time course of trials. (a) Spatial positions for picture projection on the panoramic screen. (b) Duration of each picture presentation (300 ms) and inter-stimulus interval (from 2.5 to 4 s) during two successive trials. Instructions to the participants are indicated in the balloon.

2.5. Data analyses

Behavioral data concerned the accuracy and the reaction times of the participants for each picture. On these data, we performed a repeated-measures analysis of variance (ANOVA) with the eccentricity (15°, 30°), side (left, right) and emotional expression (F and N in the first study, or H and N in the second) of faces as within-subject factors. A significance level of 5% (two-sided) was selected. For electrophysiological data, the ANOVAs were performed according to the same design, on the individual temporal factor scores provided by the complete st-PCA procedure. When required, post hoc analyses were performed using the Tukey HSD test. A repeated-measures ANOVA with eccentricity, side, emotional facial expression and electrodes of interest as within-subject factors was performed on the amplitudes and latencies obtained from the baseline-to-peak analysis.
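For illustration, the behavioral design could be run with statsmodels on a hypothetical long-format table (one row per participant × condition cell); AnovaRM implements fully within-subject ANOVAs of this kind:

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format table with columns: subject, eccentricity (15/30),
# side (left/right), emotion (F/N), rt (mean reaction time for that cell).
df = pd.read_csv("behavior_long.csv")

res = AnovaRM(df, depvar="rt", subject="subject",
              within=["eccentricity", "side", "emotion"]).fit()
print(res)  # F and p values for the main effects and interactions
```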

3. Results

3.1. Fearful vs. neutral faces

3.1.1. Behavioral data

The ANOVA on the accuracy, i.e. correct categorization of the gender of faces, revealed a marginally significant effect of emotional facial expression. On average, participants were less accurate in response to fearful faces than to neutral ones (F(1,15) = 3.73; p = 0.072; neutral: 83.5% ± 37; fearful: 81.4% ± 39). This marginal effect did not vary significantly as a function of side or eccentricity. There was also a main effect of eccentricity (F(1,15) = 10.51; p = 0.005), as participants were more accurate at 15° of eccentricity than at 30° (83.9% vs. 81.0%).

A similar ANOVA on the participants' mean reaction times revealed an effect of emotional facial expression (F(1,15) = 4.62; p = 0.048), as participants took longer to respond to fearful faces (Fig. 2a). This effect did not depend on side or eccentricity, although there was a trend for reaction times to be longer at 30° than at 15° (F(1,15) = 3.84; p = 0.069).

3.1.2. Electrophysiological data

3.1.2.1. P100. A Greenhouse-Geisser corrected ANOVA (emotion, side, eccentricity, electrode) was run between 65 and 125 ms for three parieto-occipital electrodes (PO7, Oz, PO8). This revealed an interaction between the side of presentation of the faces and the electrodes (F(2,30) = 14.77; p < 0.001). Further analyses (Tukey's post-hoc) showed that on PO8, P100 amplitude was higher when the faces were displayed in the left visual field than when they were presented in the right visual field (p = 0.062), whereas on PO7, P100 amplitude was higher when faces were seen in the right visual field than when faces were projected in the left visual field (p = 0.001). No differences were seen in the amplitude of the P100 on Oz as a function of the side of presentation (p = 0.466). The same ANOVA performed on the latency of the P100 revealed an interaction between the emotional expression of the faces, the side of presentation and the degree of eccentricity (F(1,15) = 11.70; p = 0.004). Post-hoc analyses demonstrated that the latency of the P100 was longer for fearful than for neutral faces (107 ± 16 ms vs. 99 ± 18 ms) when faces were presented in the left visual field at 30° of eccentricity (p = 0.031).

3.1.2.2. N170. An ANOVA (emotion, side, eccentricity, electrode) was run between 140 and 220 ms for two parietal electrodes (P7, P8). This analysis did not show any differences in the amplitudes of the N170 component. When the latency of these peaks was considered, the statistical analysis revealed an interaction between the side of presentation and the electrode (F(1,15) = 36.95; p < 0.001). Further analysis demonstrated that the latency of the N170 peak on P7 was longer when the faces were displayed in the left visual field than when they were displayed in the right visual field (183 ± 10 vs. 171 ± 4 ms; p = 0.027). Conversely, the latency of the peak on P8 was longer when the faces were displayed in the right visual field than when they were projected in the left visual field (193 ± 11 vs. 173 ± 10 ms; p < 0.001).

3.1.2.3. PCA data. Spatial PCA extracted eight factors whose scores were then submitted to a temporal PCA. The temporal PCA extracted 29 temporal factors and the first 10 (explaining the first 90% of the variance) were selected. For better clarity of the results provided by the st-PCA procedure, we report only the comparisons that demonstrated significant differences for the emotional content, or significant interactions between emotional content and side or eccentricity of the pictures. The first significant effect appeared after 100 ms (as indicated by the maximum of TF9 loadings), mainly on left temporal areas (as indicated by the maximum of SF6 loadings), and took the form of a double interaction between emotional content, side and eccentricity (F(1,15) = 17.29; p = 0.001; Fig. 2b–d). This interaction allowed us to investigate the emotional effect for each side. No significant difference was found in the left visual field. In the right visual field, an interaction between emotional expression and eccentricity was revealed (F(1,15) = 7.91; p = 0.013) and further analyses showed higher scores for fearful faces than for neutral ones when projected at +15° (p = 0.020). Later on (120 ms, TF8), a significant double interaction appeared in the right temporal areas (SF5; F(1,15) = 7.34; p = 0.016). Further analyses for each side showed a significant interaction between emotional content of faces and eccentricity when pictures were presented in the left visual field (F(1,15) = 8.48; p = 0.011), with higher scores for fearful faces than for neutral ones at −15° (p = 0.037). No significant difference was found in the right visual field. Finally, a marginally significant effect of emotional content appeared in parietal areas (SF2) around 150 ms after stimulus onset (TF7), with higher scores for fearful faces (F(1,15) = 4.04; p = 0.063). This effect did not vary as a function of side or eccentricity. To summarize, fearful faces increased the electrophysiological activity of three early components peaking at 100, 120 and 150 ms after the onset of the stimulus, particularly at ±15°, and no later effects were found.

Fig. 2. (Color online) Modulations of behavioral and electrophysiological responses by fearful and neutral faces. (a) Reaction times. (b) Grand-averaged ERPs to fearful (red) and neutral (black) faces (n = 16). Electrodes were chosen as a function of sPCA topographies. Dashed lines correspond to the moment when emotional modulations are significant. (c) Temporal factor (TF) loadings as a function of time. The explained percentage of variance is indicated for each factor. (d) Table illustrating significant modulations of ERP activity induced by the emotional content of faces. Vertically, spatial factors (SF) with topographical representations in the left column. Horizontally, temporal factors (TF) are organized according to the onset of their peak loadings, from the earliest on the left to the latest on the right. In the corresponding cells are represented the TF scores as a function of facial emotional expression (fear in red, neutral in blue). The factor score (y axis) is a unitless dimension. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

3.2. Happy vs. neutral faces

3.2.1. Behavioral data

The analysis of accuracy revealed a significant effect of emotional facial expression, with higher accuracy for neutral than for happy faces (F(1,15) = 36.40; p < 0.001; neutral: 84.4% ± 36; happy: 79.1% ± 40). This effect was qualified by the eccentricity (F(1,15) = 4.75; p = 0.045) but not by the side of presentation (F(1,15) = 1.37; p = 0.262). Post hoc analyses revealed that the accuracy was higher for neutral than for happy faces at 30° and 15° of eccentricity, the difference reaching significance at 30° (p = 0.002).

A similar ANOVA on the mean reaction times revealed an effect of emotional facial expression (F(1,15) = 8.02; p = 0.013; Fig. 3a), as participants took longer to respond to happy faces. This effect did not depend on side or eccentricity. In addition, eccentricity showed a main effect (F(1,15) = 20.73; p < 0.001). Reaction times were shorter for near than for far eccentricities (−30°: 755 ± 269 ms; −15°: 708 ± 247 ms; +15°: 698 ± 232 ms; +30°: 746 ± 264 ms).

3.2.2. Electrophysiological data

3.2.2.1. P100. A Greenhouse-Geisser corrected ANOVA (emotion, side, eccentricity, electrode) revealed an interaction between the side of presentation of the faces, the degree of eccentricity of presentation and the electrodes (F(2,30) = 23.82; p < 0.001). Further analyses (Tukey's post-hoc) showed that on PO8, P100 amplitude was higher when the faces were displayed in the left visual field than when they were presented in the right visual field (p = 0.063), whereas on PO7, P100 amplitude was higher when faces were seen in the right visual field than when faces were projected in the left visual field (p = 0.031). No other differences were observed.

Fig. 3. Modulations of behavioral and electrophysiological responses by happy and neutral faces. (a) Reaction times. (b) Grand-averaged ERPs to happy (red) and neutral (black) faces (n = 16). Electrodes were selected as a function of sPCA topographies. Dashed lines correspond to the moment when emotional modulations are significant. (c) Temporal factor (TF) loadings as a function of time. The explained percentage of variance is indicated for each factor. (d) Table illustrating significant modulations of ERP activity induced by the emotional content of faces. Vertically, spatial factors (SF) with topographical representations in the left column. Horizontally, temporal factors (TF) are organized according to the onset of their peak loadings, from the earliest on the left to the latest on the right. In the corresponding cells are represented the TF scores as a function of facial emotional expression (happy in green, neutral in blue). The factor score (y axis) is a unitless dimension. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

3.2.2.2. N170. An ANOVA (emotion, side, eccentricity, electrode) did not show any effect on the amplitudes of the N170. However, the statistical analysis performed on the latency of these peaks revealed an interaction between the side of presentation and the electrode (F(1,15) = 33.22; p < 0.001). Further analysis demonstrated that the latency of the N170 peak on P7 was longer when the faces were displayed in the left visual field than when they were displayed in the right visual field (187 ± 7 vs. 174 ± 2 ms; p = 0.003). Conversely, the latency of the peak on P8 was longer when the faces were displayed in the right visual field than when they were projected in the left visual field (187 ± 6 vs. 172 ± 5 ms; p < 0.001).

3.2.2.3. PCA data. Spatial PCA extracted eight factors whose scores were then submitted to a temporal PCA. The temporal PCA extracted 28 temporal factors and the first nine (explaining the first 90% of the variance) were selected. Again, we report only the comparisons that demonstrated significant differences for the emotional content, or significant interactions between emotional content and side or eccentricity. The first significant effect occurred about 95 ms after the onset of the face (as indicated by the maximum of TF9 loadings; Fig. 3), mainly on right parieto-occipital areas (as indicated by the maximum of SF3 loadings), with higher scores for happy faces than for neutral ones (F(1,15) = 4.92; p = 0.042). This effect did not vary as a function of side or eccentricity. Later on (150 ms, TF7), there was a significant interaction between emotional facial expression and side of faces in the right temporal areas (SF6; F(1,15) = 8.42; p = 0.011). When faces appeared in the left visual field, scores were more negative for happy faces than for neutral ones (F(1,15) = 5.38; p = 0.035). When faces appeared in the right visual field, the difference was not significant. At 325 ms (TF2), in frontal areas (SF2), scores were more negative for neutral faces than for happy ones (F(1,15) = 4.75; p = 0.046). Analyses did not reveal that such effects varied as a function of side or eccentricity. Finally, around 475 ms after stimulus onset (TF3), in parietal areas (SF1), a significant interaction between emotion and eccentricity was found (F(1,15) = 6.10; p = 0.026), as scores were higher for neutral faces than for happy faces, only at 30°. In brief, the analyses revealed that happy faces increased the activity of early components peaking at 95 and 150 ms after the onset of the stimuli, whereas neutral faces enhanced electrophysiological activity in late temporal windows (325 and 475 ms; see Fig. 3b–d).

4. Discussion

The aim of this study was to assess whether the implicit processing of faces displayed in far periphery could still be modulated by their emotional expression. In support of our hypothesis, we found that the emotional expression of faces modulated the behavioral and electrophysiological responses recorded during a gender discrimination task. These results provide evidence that emotional information is automatically processed, even when irrelevant to the task and displayed in low visual conditions. This highlights the persistence of emotional saliency in peripheral vision, i.e. in impoverished visual conditions. We will first discuss the modulations of the processing of faces by the eccentricity of display, then the modifications related to the emotional expression of faces and, finally, we will compare explicit and implicit processing of emotional information.

4.1. Eccentricity effects

First, and as expected, we found a gradual decrease of behavioral performance with eccentricity, as revealed by the lower rate of correct categorization and the higher reaction times (see Rigoulot et al., 2008, 2011). In a similar way, Thorpe, Gegenfurtner, Fabre-Thorpe, and Bülthoff (2001) found a linear decrease of performance in peripheral vision when participants had to categorize pictures displayed at different eccentricities, up to 90°. This decline of performance could be explained by the low acuity of peripheral vision due to the almost linear decrease of ganglion cell density observed in studies of retinal anatomy (Curcio & Allen, 1990 in humans; Wässle, Grünert, Röhrenbeck, & Boycott, 1990 in monkeys).

Second, we observed in both studies that the amplitude of the P100 on a specific electrode was higher when the faces were projected in the contralateral visual field. This is in line with previous ERP investigations devoted to the emotional processing of natural scenes and faces presented in peripheral vision. Across three studies, Rigoulot et al. (2008, 2011 and here) observed a component triggered around 100 ms after the onset of the stimulus and maximal in the parieto-occipital areas contralateral to the stimulus presentation. This P100-like component could be delayed with eccentricity and was more lateral than the P100 classically observed in studies displaying stimulations in the central visual field (Rigoulot et al., 2008). This topography of the P100-like component could reflect the crossing of visual pathways and the more direct innervation of the lateral secondary visual areas by the magnocellular pathway, which mainly originates in the peripheral retina (see Rigoulot et al., 2008, 2011). In addition, we observed the same pattern with the N170. We also observed that the latency of the N170 peaking in left temporal areas was earlier when faces were displayed in the right visual field than when they were displayed in the left visual field (and vice-versa). Again, this is in line with our previous observations (Rigoulot et al., 2011) and with an fMRI experiment which found, in response to lateralized faces, maximal contralateral activity in the fusiform gyrus, known as the cortical generator of the N170 (Hemond, Kanwisher, & Op de Beeck, 2007).

4.2. Emotional effects

We found lower accuracy and longer reaction times in response to fearful and happy faces than in response to neutral ones. Indeed, it can be supposed that a portion of the attentional resources allocated to the gender discrimination task was automatically captured to process the emotional expression of the faces. This interpretation is consistent with the observation that it takes longer to disengage spatial visual attention from threat-related stimuli than from neutral stimuli (e.g., Stein, Zwickel, Ritter, Kitzmantel, & Schneider, 2009). This attentional capture, evidenced here for the first time in far peripheral vision, is in line with the necessity to react to relevant stimulations, positive or negative ones, even if attention is not primarily directed towards them (e.g., Brosch et al., 2007; Brosch, Sander, Pourtois, & Scherer, 2008; Buodo, Sarlo, & Palomba, 2002; Mogg & Bradley, 1999; Schimmack & Derryberry, 2005; Vogt, De Houwer, Koster, Van Damme, & Crombez, 2008; Williams, McGlone, Abbott, & Mattingley, 2005). Another possible interpretation of these modulations is that the categorization of emotional faces was more difficult than that of neutral faces. However, the great care taken in selecting faces in the present study, particularly in controlling their percentage of recognition and their psychophysical characteristics, makes this interpretation unlikely. Moreover, it has already been observed that emotional stimulations displayed in central vision can slow task processing (see Frühholz et al., 2011; Williams et al., 1996, who used emotional Stroop tasks). Given that, our data argue in favour of a persistence of the saliency of emotional information in far periphery.

Of note, we found a relatively important difference in the mean reaction times in response to emotional and neutral faces between the first and the second experiment (650 ms and 720 ms, respectively). This difference probably reflects group differences that make it difficult to compare specific emotion effects. It is also possible that the presentation of specifically positive or negative stimulations could have created an affective context (positive or negative) that influenced the behavioral responses of the participants (see Aviezer et al., 2008). Further studies will be needed to fully understand the differences between fearful and happy faces, because our experimental design does not allow this comparison. In particular, it is not yet possible to clarify the role of factors such as hedonic valence or arousal, since happy and fearful faces were not embedded in the same experimental design.

Electrophysiologically, fearful and happy faces modulated components in early temporal windows. The present study thus extends to far eccentricities findings limited until now to extrafoveal processing of emotion with faces and natural scenes (Bayle et al., 2009; De Cesarei, Codispoti, & Schupp, 2009; Rigoulot et al., 2008, 2011). Despite the fact that the task did not concern the emotional content of the faces and was carried out in low visual acuity conditions, the brain was able to extract the emotional value very quickly. Like the aforementioned behavioral results, the early modulation of electrophysiological components could be related to an automatic capture of attentional resources. The earliest emotional effects occurred in occipital and temporal areas, in accordance with electrophysiological data obtained in central vision studies (Pegna, Landis, & Khateb, 2008; Sato et al., 2001). The implication of the temporal lobe remains coherent with the emotional and social integrating role of this region, suggested by lesion studies (Rolls, 2007). Indeed, neurons in the superior temporal sulcus (STS) could represent perceived emotions at an abstract, modality-independent level, and thus play a key role in the understanding and categorization of others' emotional mental states (Peelen, Atkinson, & Vuilleumier, 2010). Even if not completely surprising, it is noteworthy that this temporal emotional coding remains available for EFE presented in peripheral vision. The large magnocellular inputs into the temporo-parietal lobe (Bullier, 2001) could then contribute to explain the persistence of the temporal participation in emotional activation induced at far eccentricities.

Moreover, the rapidity of emotional effects, coherent with the necessity to respond quickly to salient information, could be mediated by the colliculo-pulvino-amygdalian pathway. As suggested by others (Morris, DeGelder, Weiskrantz, & Dolan, 2001; Vuilleumier, Armony, Driver, & Dolan, 2003), this old subcortical route, supported by magnocellular neurons, could provide rapid coarse cues based on low spatial frequencies (LSF; Schyns & Oliva, 1999). Given that peripheral vision is mainly related to the magnocellular pathway, this pathway becomes a good candidate to mediate, probably through LSF information, the quick discrimination of EFE presented at far eccentricities.

Finally, late components were modulated only in the case of happy faces. This pattern of activations suggests that happy faces trigger more robust activations across time and visual field than fearful faces. The reverse is usually reported, at least when stimulations are displayed in the central visual field. One interpretation of the present data could be that happy faces are more easily recognized in peripheral vision than fearful faces (see Calvo & Nummenmaa, 2009).

4.3. Explicit versus implicit processes

On the basis of previous observations that the explicit and implicit processing of emotional faces can involve different networks (see Monroe et al., in press), it seems of key interest to compare both processes following stimulations in peripheral vision.

Behaviorally, fearful (Rigoulot et al., 2011) and happy (unpublished data) faces were associated with faster responses than neutral ones in explicit conditions. In the present implicit conditions, emotional faces increased reaction times in comparison to neutral ones. This apparent disparity is coherent with the fact that emotional saliency can manifest in two ways. In some cases, the processing of a target was facilitated when it was cued by an emotional item, in comparison with a target cued by a neutral item (e.g., Mogg & Bradley, 1999). In other cases, when the task did not focus attention on emotional features, disruptive effects were observed. For example, in emotional Stroop tasks, authors found slower responses to the ink colour of emotional words than to that of neutral words (see Frings, Englert, Wentura, & Bermeitinger, 2010 for a discussion). On the whole, these results underscore the saliency of emotional stimuli by providing evidence that the processing of emotional information is irrepressible.

Concerning electrophysiological effects, in implicit conditions, happy and fearful faces were able to increase cerebral activity only in early temporal windows, whereas a similar modulation was observed for early and late components in explicit conditions with fearful faces (Rigoulot et al., 2011). Because the task was directed towards the emotional content of the faces in explicit conditions, the consequent sustained attention probably explains the larger activation of late components (Schupp et al., 2004). In implicit conditions, the detection of relevant information corresponding to the emotional content of faces certainly enhances the early components. Afterwards, the attentional resources are directed back according to the goal of the participants, i.e. the categorization of the gender of the faces. Consequently, the emotional content of the faces no longer modulated late components.

Another important difference with the Rigoulot et al. (2011) study is the absence here of any emotional effect on the amplitude of the N170 component. The ability of emotional information to modulate the amplitude of the N170 is still debated, since some studies have found emotional effects (Batty & Taylor, 2003; Eger, Jedynak, Iwaki, & Skrandies, 2003; Pizzagalli et al., 2002) while others have not (Eimer & Holmes, 2002, 2007; Herrmann et al., 2002; Krolak-Salmon, Fischer, Vighetto, & Mauguière, 2001). Given our observations, we believe that the explicit/implicit or active/passive nature of the processing of the emotional information might play a role in the emotional modulations of the N170 triggered by faces appearing in peripheral vision. This interpretation is consistent with the data of De Cesarei et al. (2009), who found an emotional modulation of the early posterior negativity (EPN) and the late positive potential (LPP) only when participants were actively looking at emotional and neutral scenes and not when the same participants were passively looking at the same pictures5.

5 For both experiments of the current study, we also ran analyses of variance on EPN and LPP amplitudes with eccentricity, side and emotional expression of faces as within-subject factors, using the temporal windows and electrodes described in De Cesarei et al. (2009). For the fearful vs. neutral faces experiment, the analysis on EPN amplitudes revealed an effect of the degree of eccentricity (F(1,15) = 4.544; p = 0.050) but did not reveal an effect of the emotional expression of faces (F(1,15) = 0.623; p = 0.442). The same analysis on LPP amplitudes did not show an effect of emotional expression of faces (F(1,15) = 1.345; p = 0.264) and found a marginally significant interaction between the side of presentation and the degree of eccentricity (F(1,15) = 4.049; p = 0.063). For the happy vs. neutral faces experiment, the analysis on EPN amplitudes did not reveal an effect of the emotional expression of faces (F(1,15) = 1.693; p = 0.213). The same analysis on LPP amplitudes demonstrated an interaction between the emotional expression of faces and the side of presentation of faces (F(1,15) = 7.406; p = 0.016). However, post-hoc analyses did not reveal any significant emotional differences for each side (ps > 0.224).

One factor may limit our conclusions. Since we did not screen our participants with the BECK or STAI inventories, it is not possible to completely rule out the possibility that anxiety and/or depression played a role in the results described in our study. Indeed, in oddball paradigms, Rossignol and colleagues (Rossignol, Philippot, Douilliez, Crommelinck, & Campanella, 2005; Rossignol, Philippot, Crommelinck, & Campanella, 2008) have shown that the processing of EFE was associated with increased amplitudes of the N2/P3a complex, a decreased amplitude of the N300 and an earlier onset of the P3b component in anxious and depressed participants compared to control participants. However, we screened participants in our two previous studies (Rigoulot et al., 2008, 2011) with the STAI inventory and none scored high on this scale. As the participants of the present study were recruited from the same population, it is unlikely that high levels of anxiety or depression played a role in the data described here.

To conclude, this work brings original arguments in favor of the highly salient role of EFE by combining a reduction of both visual acuity and attentional resources. The persistent impact of emotional expressions across the peripheral visual field when attention is not directed to emotional features confirms their adaptive and informative value, particularly in social communication (Adolphs, 2003). Moreover, in a neurovisual rehabilitation perspective, the saliency of emotional faces might be useful when visual capacities are reduced in the centre of the visual field (Kerkhoff, 2000). The ability of such stimuli, in particular happy faces, to capture attentional resources in the peripheral visual field could favour the reeducation of patients with macular diseases. Indeed, the saliency of happy faces could help to anchor the attention of these patients at specific points of the eccentric visual field, which is known to increase their peripheral visual abilities with training (Altpeter, Mackeben, & Trauzettel-Klosinski, 2000).

Acknowledgements

This study was supported by a grant from the Ministère de l'Enseignement Supérieur et de la Recherche of France to S.R. and funds from the CNRS and the Université de Lille 1 to H.S.

References

Adolphs, R. (2002). Neural systems for recognizing emotion. Current Opinion inNeurobiology, 12, 169–177.

Adolphs, R. (2003). Cognitive neuroscience of human social behaviour. NatureReview Neuroscience, 4, 165–178.

Alpers, G.W., & Gerdes, A.B.M. (2007). Here is looking at you: emotional facespredominate in binocular rivalry. Emotion, 7, 495–506.

Altpeter, E., Mackeben, M., & Trauzettel-Klosinski, S. (2000). The importance ofsustained attention for patients with maculopathies. Vision Research, 40,1539–1547.

Page 9: Implicit emotional processing in peripheral vision: Behavioral and neural evidence

S. Rigoulot et al. / Neuropsychologia 50 (2012) 2887–2896 2895

Armony, J. L., & Dolan, R. J. (2002). Modulation of spatial attention by fear-conditioned stimuli: an event-related fMRI study. Neuropsychologia, 40,817–826.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., Moscovitch,M., & Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleabilityof emotion perception. Psychological Science, 19, 724–732.

Batty, M., & Taylor, M. J. (2003). Early processing of the six basic facial emotionalexpressions. Brain Research Cognitive Brain Research, 17, 613–620.

Bayle, D. J., Henaff, M. A., & Krolak-Salmon, P. (2009). Unconsciously perceived fearin peripheral vision alerts the limbic system: a MEG study. PLoS One, 4, e8207.

Bradley, M. M., Codispoti, M., Sabatinelli, D., & Lang, P. J. (2001). Emotion and motivation II: sex differences in picture processing. Emotion, 1(3), 300–319.

Brosch, T., Sander, D., Pourtois, G., & Scherer, K. R. (2008). Beyond fear: rapid spatial orienting toward positive emotional stimuli. Psychological Science, 19, 362–370.

Brosch, T., Sander, D., & Scherer, K. R. (2007). That baby caught my eye… attention capture by infant faces. Emotion, 7, 685–689.

Bullier, J. (2001). Integrated model of visual processing. Brain Research Reviews, 36, 96–107.

Buodo, G., Sarlo, M., & Palomba, D. (2002). Attentional resources measured by reaction times highlight differences within pleasant and unpleasant, high arousing stimuli. Motivation and Emotion, 26, 123–138.

Calvo, M. G., & Lang, P. J. (2005). Parafoveal semantic processing of emotional visual scenes. Journal of Experimental Psychology: Human Perception and Performance, 31, 502–519.

Calvo, M. G., & Nummenmaa, L. (2009). Eye-movement assessment of the time course in facial expression recognition: neurophysiological implications. Cognitive, Affective & Behavioral Neuroscience, 9(4), 398–411.

Campanella, S., Rossignol, M., Mejias, S., Joassin, F., Maurage, P., Debatisse, D., Bruyer, R., Crommelinck, M., & Guerit, J. M. (2004). Human gender differences in an emotional visual oddball task: an event-related potentials study. Neuroscience Letters, 367, 14–18.

Carretie, L., Hinojosa, J. A., Martin-Loeches, M., Mercado, F., & Tapia, M. (2004). Automatic attention to emotional stimuli: neural correlates. Human Brain Mapping, 22, 290–299.

Collignon, O., Girard, S., Gosselin, F., Saint-Amour, D., Lepore, F., & Lassonde, M. (2010). Women process multisensory emotion expressions more efficiently than men. Neuropsychologia, 48(1), 220–225.

Critchley, H., Daly, E., Phillips, M., Brammer, M., Bullmore, E., Williams, S., Van Amelsvoort, T., Robertson, D., David, A., & Murphy, D. (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: a functional magnetic resonance imaging study. Human Brain Mapping, 9, 93–105.

Curcio, C. A., & Allen, K. A. (1990). Topography of ganglion cells in human retina. Journal of Comparative Neurology, 300, 5–25.

Darwin, C. (1872). The expression of the emotions in man and animals (3rd ed.). New York: Oxford University Press.

De Cesarei, A., Codispoti, M., & Schupp, H. T. (2009). Peripheral vision and preferential emotion processing. Neuroreport, 20, 1439–1443.

De Jong, P. J., Koster, E. H. W., van Wees, R., & Martens, S. (2009). Emotional facial expressions and the attentional blink: attenuated blink for angry and happy faces irrespective of social anxiety. Cognition & Emotion, 23(8), 1640–1652.

Delplanque, S., Lavoie, M. E., Hot, P., Silvert, L., & Sequeira, H. (2004). Modulation of cognitive processing by emotional valence studied through event-related potentials in humans. Neuroscience Letters, 356, 1–4.

Delplanque, S., Silvert, L., Hot, P., Rigoulot, S., & Sequeira, H. (2006). Arousal and valence effects on event-related P3a and P3b during emotional categorization. International Journal of Psychophysiology, 60, 315–322.

Delplanque, S., Silvert, L., Hot, P., & Sequeira, H. (2005). Event-related P3a and P3b in response to unpredictable emotional stimuli. Biological Psychology, 68, 107–120.

Edwards, K. (1998). The face of time: temporal cues in facial expressions of emotion. Psychological Science, 9, 270–276.

Eger, E., Jedynak, A., Iwaki, T., & Skrandies, W. (2003). Rapid extraction of emotional expression: evidence from evoked potential fields during brief presentation of face stimuli. Neuropsychologia, 41, 808–817.

Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. Neuroreport, 13, 427–431.

Eimer, M., & Holmes, A. (2007). Event-related brain potential correlates of emotional face processing. Neuropsychologia, 45, 15–31.

Fox, E., Derakshan, N., & Shoker, L. (2008). Trait anxiety modulates the electrophysiological indices of rapid spatial orienting towards angry faces. Neuroreport, 19, 259–263.

Fox, E., Russo, R., Bowles, R., & Dutton, K. (2001). Do threatening stimuli draw or hold visual attention in subclinical anxiety? Journal of Experimental Psychology: General, 130, 681–700.

Frings, C., Englert, J., Wentura, D., & Bermeitinger, C. (2010). Decomposing the emotional Stroop effect. Quarterly Journal of Experimental Psychology, 63(1), 42–49.

Frühholz, S., Jellinghaus, A., & Herrmann, M. (2011). Time course of implicit processing and explicit processing of emotional faces and emotional words. Biological Psychology, 87(2), 265–274.

Hecaen, H. (1984). Les gauchers: etude neurophysiologique. Paris: PUF.

Hemond, C. C., Kanwisher, N. G., & Op de Beeck, H. P. (2007). A preference for contralateral stimuli in human object- and face-selective cortex. PLoS One, 2, e574.

Herrmann, M. J., Aranda, D., Ellgring, H., Mueller, T. J., Strik, W. K., Heidrich, A., & Fallgatter, A. J. (2002). Face-specific event-related potential in humans is independent from facial expression. International Journal of Psychophysiology, 45, 241–244.

Hot, P., Saito, Y., Mandai, O., Kobayashi, T., & Sequeira, H. (2006). An ERP investigation of emotional processing in European and Japanese individuals. Brain Research, 1122, 171–178.

Kaplan, E. (2004). The M, P and K pathways of the primate visual system. In: L. Chalupa, & J. S. Werner (Eds.), The visual neurosciences (pp. 481–493). Cambridge: MIT Press.

Kayser, J., & Tenke, C. E. (2003). Optimizing PCA methodology for ERP component identification and measurement: theoretical rationale and empirical evaluation. Clinical Neurophysiology, 114, 2307–2325.

Kerkhoff, G. (2000). Neurovisual rehabilitation: recent developments and future directions. American Journal of Ophthalmology, 130, 687–688.

Koster, E. H., Crombez, G., Verschuere, B., & De Houwer, J. (2004). Selective attention to threat in the dot probe paradigm: differentiating vigilance and difficulty to disengage. Behaviour Research and Therapy, 42, 1183–1192.

Krolak-Salmon, P., Fischer, C., Vighetto, A., & Mauguiere, F. (2001). Processing of facial emotional expression: spatio-temporal data as assessed by scalp event-related potentials. European Journal of Neuroscience, 13, 987–994.

Larsen, R. J., & Diener, E. (1987). Affect intensity as an individual difference characteristic: a review. Journal of Research in Personality, 21, 1–39.

Lipp, O. V., & Derakshan, N. (2005). Attentional bias to pictures of fear-relevant animals in a dot probe task. Emotion, 5, 365–369.

Liversedge, S. P., & Findlay, J. M. (2000). Saccadic eye movements and cognition. Trends in Cognitive Sciences, 4, 6–14.

Lundqvist, D., Flykt, A., & Ohman, A. (1998). The Karolinska Directed Emotional Faces — KDEF. CD ROM from Department of Clinical Neuroscience, Psychology section, Karolinska Institute. ISBN 91-630-7164-9.

Mathews, A., Fox, E., Yiend, J., & Calder, A. (2003). The face of fear: effects of eye gaze and emotion on visual attention. Visual Cognition, 10, 823–835.

Martin, P. R., & Grunert, U. (2004). Ganglion cells in mammalian retinae. In: L. Chalupa, & J. S. Werner (Eds.), The visual neurosciences (pp. 410–421). Cambridge, MA: MIT Press.

Mogg, K., & Bradley, B. P. (1999). Some methodological issues in assessing attentional biases for threatening faces in anxiety: a replication study using a modified version of the probe detection task. Behaviour Research and Therapy, 37, 595–604.

Monroe, J. F., Griffin, M., Pinkham, A., Loughead, J., Gur, R. C., Roberts, T. P., & Christopher Edgar, J. (in press). The fusiform response to faces: explicit versus implicit processing of emotion. Human Brain Mapping, http://dx.doi.org/10.1002/hbm.21406.

Morris, J. S., DeGelder, B., Weiskrantz, L., & Dolan, R. J. (2001). Differential extrageniculostriate and amygdala responses to presentation of emotional faces in a cortically blind field. Brain, 124, 1241–1252.

Morris, J. S., Ohman, A., & Dolan, R. J. (1998). Conscious and unconscious emotional learning in the human amygdala. Nature, 393(6684), 467–470.

Peelen, M. V., Atkinson, A. P., & Vuilleumier, P. (2010). Supramodal representations of perceived emotions in the human brain. The Journal of Neuroscience, 30(30), 10127–10134.

Pegna, A. J., Landis, T., & Khateb, A. (2008). Electrophysiological evidence for early non-conscious processing of fearful facial expressions. International Journal of Psychophysiology, 70, 127–136.

Pizzagalli, D. A., Lehmann, D., Hendrick, A. M., Regard, M., Pascual-Marqui, R. D., & Davidson, R. J. (2002). Affective judgments of faces modulate early activity (approximately 160 ms) within the fusiform gyri. Neuroimage, 16, 663–677.

Pourtois, G., Dan, E. S., Grandjean, D., Sander, D., & Vuilleumier, P. (2005). Enhanced extrastriate visual response to bandpass spatial frequency filtered fearful faces: time course and topographic evoked-potentials mapping. Human Brain Mapping, 26, 65–79.

Pourtois, G., Delplanque, S., Michel, C., & Vuilleumier, P. (2008). Beyond conventional event-related brain potential (ERP): exploring the time-course of visual emotion processing using topographic and principal component analyses. Brain Topography, 20, 265–277.

Pourtois, G., Grandjean, D., Sander, D., & Vuilleumier, P. (2004). Electrophysiological correlates of rapid spatial orienting towards fearful faces. Cerebral Cortex, 14, 619–633.

Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124, 372–422.

Rigoulot, S., Delplanque, S., Despretz, P., Defoort-Dhellemmes, S., Honore, J., & Sequeira, H. (2008). Peripherally presented emotional scenes: a spatiotemporal analysis of early ERP responses. Brain Topography, 20(4), 216–223.

Rigoulot, S., D’Hondt, F., Defoort-Dhellemmes, S., Despretz, P., Honore, J., & Sequeira, H. (2011). Fearful faces impact in peripheral vision: behavioral and neural evidence. Neuropsychologia, 49(7), 2013–2021.

Rolls, E. T. (2007). The representation of information about faces in the temporal and frontal lobes. Neuropsychologia, 45, 124–143.

Rossignol, M., Philippot, P., Douilliez, C., Crommelinck, M., & Campanella, S. (2005). The perception of fearful and happy facial expression is modulated by anxiety: an event-related potential study. Neuroscience Letters, 377(2), 115–120.

Rossignol, M., Philippot, P., Crommelinck, M., & Campanella, S. (2008). Visual processing of emotional expressions in mixed anxious-depressed subclinical state: an event-related potential study on a female sample. Neurophysiologie Clinique/Clinical Neurophysiology, 38(5), 267–275.

Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. Neuroreport, 12, 709–714.

Schimmack, U., & Derryberry, D. (2005). Attentional interference effects of emotional pictures: threat, negativity, or arousal? Emotion, 5, 55–66.

Schupp, H. T., Ohman, A., Junghofer, M., Weike, A. I., Stockburger, J., & Hamm, A. O. (2004). The facilitated processing of threatening faces: an ERP analysis. Emotion, 4, 189–200.

Schyns, P. G., & Oliva, A. (1999). Dr. Angry and Mr. Smile: when categorization flexibly modifies the perception of faces in rapid visual presentations. Cognition, 69(3), 243–265.

Spencer, K. M., Dien, J., & Donchin, E. (1999). A componential analysis of the ERP elicited by novel events using a dense electrode array. Psychophysiology, 36, 409–414.

Spencer, K. M., Dien, J., & Donchin, E. (2001). Spatiotemporal analysis of the late ERP responses to deviant stimuli. Psychophysiology, 38, 343–358.

Stein, T., Zwickel, J., Ritter, J., Kitzmantel, M., & Schneider, W. X. (2009). The effect of fearful faces on the attentional blink is task dependent. Psychonomic Bulletin & Review, 16, 104–109.

Thorpe, S. J., Gegenfurtner, K. R., Fabre-Thorpe, M., & Bulthoff, H. H. (2001). Detection of animals in natural images using far peripheral vision. European Journal of Neuroscience, 14, 869–876.

Tottenham, N., Tanaka, J. W., Leon, A. C., McCarry, T., Nurse, M., Hare, T. A., Marcus, D. J., Westerlund, A., Casey, B. J., & Nelson, C. (2009). The NimStim set of facial expressions: judgments from untrained research participants. Psychiatry Research, 168, 242–249.

Van Strien, J. W., De Sonneville, L. M., & Franken, I. H. (2010). The late positive potential and explicit versus implicit processing of facial valence. Neuroreport, 21, 656–661.

Vogt, J., De Houwer, J., Koster, E. H., Van Damme, S., & Crombez, G. (2008). Allocation of spatial attention to emotional stimuli depends upon arousal and not valence. Emotion, 8, 880–885.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: an event-related fMRI study. Neuron, 30, 829–841.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6, 624–631.

Wassle, H., Grunert, U., Rohrenbeck, J., & Boycott, B. B. (1990). Retinal ganglion cell density and cortical magnification factor in the primate. Vision Research, 30, 1897–1911.

Williams, J. M., Mathews, A., & MacLeod, C. (1996). The emotional Stroop task and psychopathology. Psychological Bulletin, 120, 3–24.

Williams, M. A., McGlone, F., Abbott, D. F., & Mattingley, J. B. (2005). Differential amygdala responses to happy and fearful facial expressions depend on selective attention. Neuroimage, 24(2), 417–425.

Yarbus, A. (1967). Eye movements and vision. New York: Plenum Press.