Cite as: Arslanova, I., Galvez-Pol, A., Calvo-Merino, B., & Forster, B. (2019). Searching for bodies: ERP evidence for independent somatosensory processing during visual search for body-related information. NeuroImage, 195, 140-149. doi: 10.1016/j.neuroimage.2019.03.037

Searching for bodies: ERP evidence for independent somatosensory processing during visual search for body-related information

I. Arslanova a,b, A. Galvez-Pol a,c, B. Calvo-Merino a, B. Forster a

Affiliations:

a Cognitive Neuroscience Research Unit, Psychology Department, City, University of London, EC1V 0HB, UK.

b Institute of Cognitive Neuroscience, University College London, WC1N 3AZ, UK.

c Department of Clinical and Movement Neurosciences, Institute of Neurology, University College London, WC1N 3BG, UK.

Authors’ names and e-mail addresses:

Irena Arslanova. [email protected]

Alejandro Galvez-Pol. [email protected]

Beatriz Calvo-Merino. [email protected]

Bettina Forster. [email protected]

Address correspondence to B. Forster ([email protected]) or I. Arslanova ([email protected]), Cognitive Neuroscience Research Unit, Psychology Department, City, University of London, School of Arts and Social Science, Northampton Square, London EC1V 0HB, UK.

Abstract

Attention allows us to select relevant information by modulating neural activity within sensory brain areas processing that information. Previous research has shown that visual perception of body stimuli recruits visual cortices together with the observer's body representation in somatosensory cortex, which is known for processing body-related information (e.g., haptics, kinematics). However, whether attentional selection of visual body stimuli involves just visual or additional somatosensory areas remains elusive. Here we elicited visual and somatosensory evoked activity during a visual search task, whereby participants searched for target hand images defined by either visual (colour) or bodily (posture) features. In line with previous studies, we found electrophysiological evidence for attentional selection over visual areas (i.e., N2pc) regardless of the feature type. Importantly, after dissociating somatosensory from visual evoked activity, we show that only attentional selection of hand posture - but not hand colour - elicits modulation of somatosensory evoked electrocortical activity over somatosensory cortex. This suggests that attention may not only modulate cortical activity associated with the input-sensory modality (in this case, visual), but, depending on the type of attended information, it may also modulate cortical activity associated with another task-relevant sensory modality (in this case, somatosensory). Overall, our results provide evidence for a flexible attention mechanism that operates according to specific behavioral goals and the information embedded in the percept.

Keywords: body perception, attention, sensory recruitment, visual search, embodiment, somatosensory cortex, SEPs

1. Introduction

Humans live in a highly social world; we are constantly surrounded by other people and often need to selectively direct attention to their bodily features (e.g., gestures) to extract relevant information that helps us guide behavior. Selective visual attention endows us with a powerful tool to focus limited visual resources on such behaviorally relevant information while disregarding less relevant information. As an example, when learning a novel action, we can easily focus on sequences of bodily gestures without being distracted by other less relevant visual aspects (e.g., the colour of the agent's clothing). In this vein, previous studies have shown that when attention is directed to specific task-relevant features amongst competing distractors, neural responses to these features are enhanced whereas responses to distractors are generally suppressed (see Maunsell & Treue, 2006 for a review). This modulation of bottom-up signals has been found across different modalities (e.g., Bichot, Rossi, & Desimone, 2005; Gamble & Woldorff, 2015; Forster, Tziraki, & Jones, 2016), whereby modulation of neural activity occurs within the relevant sensory-input modality: visual areas for visual features, auditory areas for sounds, and somatosensory areas for tactile information.

While vision is generally the dominant sense through which we perceive others and their bodies, it is now well established that perceiving others' bodies recruits one's own body representation in sensorimotor and somatosensory cortices, beyond visual-only regions (Calvo-Merino, Grèzes, Glaser, Passingham, & Haggard, 2006; Galvez-Pol, Calvo-Merino, Capilla, & Forster, 2018a; Galvez-Pol, Forster, & Calvo-Merino, 2018b). Despite many empirical studies showing how modulatory effects of attention on sensory brain regions arise through operations in a fronto-parietal brain network that comprises the intraparietal sulcus and frontal eye fields (Yantis & Serences, 2003; Woldorff, et al., 2004; Corbetta & Shulman, 2002), a detailed understanding of how attentional effects are allocated across sensory cortex remains largely unclear. In particular, we do not yet know whether attentional modulation depends only upon the sensory-input modality, or rather on the nature of the perceived information. In the case of visually perceived bodily stimuli, it is unknown if attention influences cortical activity merely within visual regions that process the visual percept or also within other sensory (i.e., non-visual) regions that are known to represent overall bodily properties (i.e., somatosensory cortex, SCx). Elucidating this matter may provide further understanding of the rapid processing involved when perceiving others' bodies (Downing, Bray, Rogers, & Childs, 2004; Stein, Sterzer, & Peelen, 2012).

Visual search paradigms have been widely used to investigate the mechanisms of feature-based attentional selection (see Eimer, 2014 for a review). In a typical visual search task, participants are required to detect a target item with a known feature but unknown location amidst distractor items that differ on that feature (Luck & Hillyard, 1990, 1994; Eimer, 1996; Woodman & Luck, 1999, 2003; Kiss, van Velzen, & Eimer, 2008; Woodman, Arita, & Luck, 2009; Hilimire, Mounts, Parks, & Corballis, 2011). Event-related potential (ERP) techniques have proved particularly helpful in revealing how neural processes underlying attention mechanisms during visual search unfold in real time (Luck, Woodman, & Vogel, 2000). Studies employing ERP techniques have demonstrated that attentional selection of a visual target amongst distractors is characterized by a modulation of the N2pc (N2-posterior-contralateral) ERP component around 200 ms after the onset of the visual array (Eimer, 1996; Luck & Hillyard, 1990, 1994; Luck & Kappenman, 2011). The N2pc is an enhanced negativity over posterior electrode sites in the hemisphere contralateral to the visual field containing the target (e.g., Eimer, 1996; Luck & Hillyard, 1994) and it has been shown to reflect the enhancement of target processing under conditions of competition from surrounding distractors (e.g., Luck & Hillyard, 1994; Eimer, 1996; Mazza, Turatto, & Caramazza, 2009; Hickey, Di Lollo, & McDonald, 2009). Analogous lateralized components that mark attentional selection in other modalities have also been reported. For instance, studies employing tactile stimuli (Forster et al., 2016; Ambron, Mas-Casadesús, & Gherri, 2018) have identified an analogous electrophysiological marker of tactile search, the N140cc (N140-central-contralateral), which indicates enhanced processing of task-relevant touch locations within the somatosensory cortex. These findings show that attentional selection can be tracked by modality-specific lateralized components over the relevant sensory cortices.

Given that attentional selection of task-relevant features influences electrocortical activation over relevant sensory areas, and that perceiving and memorizing body-related images recruits body-related cortices beyond visual regions (Galvez-Pol et al., 2018a; Galvez-Pol et al., 2018b), we sought to investigate the effects of attention during visual search for body-related features such as body postures. To this aim, we adapted a classical visual search paradigm by asking participants to search among images of hands for a target hand image that was specified at the beginning of each experimental block. The hands could differ in either colour or posture (Fig. 1b). In half of the blocks, participants were instructed to search for a target hand that depicted a specific posture and ignore the colour (posture task). In the other half, the same hand images were presented but participants based their search only on the colour (colour task). This approach allowed us to use the exact same visual stimuli while manipulating the level of body-relatedness each task conveyed. Throughout the experiment, visual and somatosensory evoked potentials (VEPs, SEPs) were recorded to reveal whether body-relatedness would lead to differential modulation of electrocortical activity. Specifically, we examined whether attentional selection of a visual, bodily feature (i.e., posture) compared to a visual, non-bodily feature (i.e., colour) would only modulate activity within visual regions (as previously described for selection of visual features such as colour) or whether other sensory regions that participate in the representation of body-related information in the brain (i.e., somatosensory cortex, SCx) would also be recruited.

Similar to previous studies of visual search (e.g., Eimer, 1996; Luck & Hillyard, 1990, 1994), we recorded trials containing VEPs elicited by the onset of the visual search array (visual-only trials), which allowed us to examine the modulation of activity in occipito-parietal visual regions during attentional selection of target hand images. We expected to find a similar electrophysiological marker of attentional selection (i.e., the N2pc component) regardless of whether participants were searching for a purely visual, non-bodily feature (i.e., colour) or a bodily feature (i.e., posture), as both cases would lead to modulation of cortical activity in visual regions. Importantly, we also probed the state of SCx by using task-irrelevant tactile taps that were delivered to the participants' fingers in synchrony with the onset of the visual search array. The resulting activity contained combined brain activity due to visual and somatosensory evoked potentials (visual-tactile trials). Next, by subtracting brain activity of the visual-only trials from the compound activity of the visual-tactile trials (Sel, Forster, & Calvo-Merino, 2014; Galvez-Pol et al., 2018a), we examined visually driven SCx processing, over and above visual carry-over effects. We predicted that, once isolated from carry-over visually elicited activity, somatosensory evoked potentials (SEPs) would exhibit a comparable electrophysiological marker of attentional selection to that reported for tactile search (Forster et al., 2016; Ambron, Mas-Casadesús, & Gherri, 2018). Furthermore, we expected this attentional modulation only when participants were attending to body-related information (posture task) and at centro-parietal electrode sites where attentional modulations of somatosensory ERPs are generally observed (Forster & Eimer, 2005; Jones & Forster, 2014; Forster et al., 2016). Ultimately, this paradigm allows us to identify a neural marker of attentional selection of body-related information that modulates activity beyond visual sensory cortices. In the following, we provide novel evidence that mechanisms of attentional selection of body-related information might not act exclusively upon the original sensory-input modality but rather upon the relevant areas that are specialised for processing bodily stimuli.

2. Materials and Methods

2.1. Participants

Twenty-nine volunteers, naïve to the objective of the experiment, participated for payment. The data of seven participants were excluded due to excess muscle and alpha wave activity leading to low trial numbers after ERP artefact rejection (less than 60% of the total number of trials in any condition); an additional three participants were excluded due to excessive eye movements (see below for criteria). The remaining 20 participants (11 women) were all right-handed, aged 18 – 42 years (mean age = 24.9), reported normal or corrected-to-normal visual acuity and no colour blindness. All participants gave informed consent, with approval by the City, University of London Psychology Department Research Ethics Committee.

2.2. Visual and tactile stimuli

For the visual search task, two hand images depicting two different hand postures (body-related features) that did not convey any meaning or symbolism were used (a subset of images from Galvez-Pol et al., 2018a; Galvez-Pol et al., 2018b). Each posture was coloured in two distinct colours (non-body-related features), yielding the four hand stimuli shown in Figure 1b. The hand colour was achieved by adjusting the colour levels of the originally grey hand images (RGB: 100 160 130 and 100 160 100). All hand images were of equal size (66 × 76 pixels).

On half of the trials, in addition to the visual search array, participants received a brief task-irrelevant tactile stimulation concurrently to the index fingers of both hands in order to elicit somatosensory evoked activity. Tactile stimulation was a brief single tactile tap applied at the onset of the visual search array using two 12 V solenoids (5 mm in diameter). When a current passed through the solenoids, an attached metal rod with a blunt conical tip contacted the participants' skin. Both solenoids were attached with microporous tape to the tips of the index fingers, one for each hand.

2.3. Experimental design and procedure

Participants were seated in an electromagnetically shielded, sound-attenuated, dimly lit room, viewing a 60 Hz computer monitor at a viewing distance of 80 cm. Participants performed two visual search tasks similar to that of Luck and Hillyard (1994). In both tasks participants were presented with identical arrays of four hand images and had to discriminate whether a predefined target hand was present. The target was presented on half of the trials in order to keep the probabilities of target-present vs. target-absent trials equal and ensure that participants had to engage in visual search. At the beginning of each task, participants were presented with an image of the target hand. In the posture task, participants were instructed to search for the hand that had the same posture as the target hand and ignore the colour, whereas in the colour task, they were asked to search for the hand that was of the same colour as the target and ignore the posture. All participants performed both tasks, with the order of tasks counterbalanced across participants (either the posture task first and the colour task second, or vice versa). They were instructed to say ‘yes’ if a hand with the target feature was present in the array of four hand images and ‘no’ when it was absent. Each task consisted of two separate blocks, with each of the two postures (posture task) or two colours (colour task) serving as the target in one of the blocks.

On each trial, participants were presented with a bilateral visual search array depicting two hands in each hemifield (Figure 1a). A central fixation cross, which participants were instructed to fixate, was present at the centre of the visual array throughout the trials. The array could either consist of four identical hand images (no-target trials; 50% of trials) or contain one hand in a different posture and another hand in a different colour (target trials; 50% of trials). Participants were required to verbally report whether or not the hand arrangement contained the target item. The onset of their response was registered by a voice key, while their response choice (‘yes’/’no’) was keyed in by the experimenter in the adjacent room. Once their response was keyed in, the next trial started.

Visual stimuli were displayed using E-Prime2 Software (Psychology Software Tools, Pittsburgh, PA). All stimuli were presented within two rectangular regions of 4.5° × 8.5° of visual angle that were centred 4.5° to the left and right of a central fixation cross on a light grey background. Each visual search array consisted of two hands (1.6° × 1.7°) in each hemifield. Position of all stimuli was fixed, but the location of the target (in target-present trials) was randomized on each trial, so that it appeared equally often in each of the four hand locations.
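
As a quick check on the reported display geometry, the following illustrative Python snippet converts the stated visual angles into on-screen sizes at the 80 cm viewing distance; the helper functions are ours and not part of the original analysis.

```python
import math

def visual_angle_deg(size_cm: float, distance_cm: float) -> float:
    """Visual angle (degrees) subtended by a stimulus of a given physical size."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

def size_for_angle(angle_deg: float, distance_cm: float) -> float:
    """Physical size (cm) needed to subtend a given visual angle at a given distance."""
    return 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)

# At the reported 80 cm viewing distance, a 1.6 deg x 1.7 deg hand image corresponds to roughly:
print(round(size_for_angle(1.6, 80), 2), "cm wide")   # ~2.23 cm
print(round(size_for_angle(1.7, 80), 2), "cm tall")   # ~2.37 cm
```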

On half of all trials, only VEPs were elicited, in response to the onset of the visual search array at the start of each trial (i.e., visual-only trials). On the other half of the trials, VEPs and SEPs were elicited simultaneously by applying task-irrelevant single tactile probes concurrently to the participants' index fingers of both hands (i.e., visual-tactile trials). Tactile stimulation was applied at the onset of the visual search array using the two 12 V solenoids described above, attached with microporous tape to the tips of both index fingers. To mask the sounds made by the tactile stimulators, white noise (65 dB, measured from the participants' head) was presented through a loudspeaker centrally positioned 90 cm in front of the participants. Participants were instructed to ignore the tactile stimulation and the white noise.

2.4. ERP subtraction

Our main research goal was to examine how attentional selection of visually perceived body-related vs. non-body-related features modulates activity at electrodes over SCx. However, measuring ERPs from electrodes positioned over somatosensory areas is problematic, as the activity elicited by the visual-evoked potentials (VEPs) at the onset of the body stimuli spreads over the whole cortex, masking other brain processes (Ahlfors et al., 2010; Irimia, Van Horn, & Halgren, 2012; Luck, 2014) and potentially concealing the somatosensory processing of body postures. Consequently, while VEPs over occipital electrode sites reveal neural processes that are associated with visual processing, they do not allow direct inspection of the response in SCx. In order to uncover the involvement of SCx in attentional selection of body- and non-body-related features, it was necessary to dissociate visually driven somatosensory activity from carry-over visual responses. To do so, we applied mechanical stimulation in the form of tactile probes (visual-tactile trials). These stimulations elicited SEPs that allowed us to examine the state of the SCx and its underlying processing during attentional selection of visual stimuli. It is important to note that the tactile stimulation was task-irrelevant and the elicited SEPs did not provide any information about the visual stimuli and had no cognitive relevance. The sole purpose of the SEPs was to isolate somatosensory processing over the corresponding parietal electrode sites from the superimposed visual activity elicited by the visual onset of the stimuli. For this purpose, we subtracted brain activity of trials that contained activity only due to VEPs (visual-only trials) from trials that contained a combination of visual and somatosensory activity due to the combined VEPs-SEPs (visual-tactile trials) (Fig. 1c). This method allows us to examine somatosensory processing (SEPs) free of visually evoked activity, that is, visually driven activity in the form of VEP-free SEPs (Galvez-Pol et al., 2018a; Sel et al., 2014).
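
For illustration only, the sketch below shows the core of this subtraction applied to grand-averaged data stored as NumPy arrays; the array shapes and function name are our assumptions, not the authors' actual pipeline (which was implemented in Vision Analyzer).

```python
import numpy as np

def vep_free_sep(visual_tactile: np.ndarray, visual_only: np.ndarray) -> np.ndarray:
    """Subtract visual-only activity (VEPs) from combined visual-tactile activity
    (VEPs + SEPs), leaving somatosensory-evoked activity free of visual carry-over.
    Both inputs are grand-averaged ERPs of shape (n_channels, n_timepoints),
    already baseline-corrected and averaged within one task and target side."""
    assert visual_tactile.shape == visual_only.shape
    return visual_tactile - visual_only

# Placeholder example: 64 channels, 300 samples (-100 to 500 ms at 500 Hz)
rng = np.random.default_rng(0)
vt = rng.normal(size=(64, 300))    # visual-tactile grand average (placeholder data)
vo = rng.normal(size=(64, 300))    # visual-only grand average (placeholder data)
print(vep_free_sep(vt, vo).shape)  # (64, 300)
```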

Overall, participants performed a total of 1024 trials, 512 for each task (posture and colour task), with a target being present on 256 trials per task. This equals 128 trials in which the target was displayed in the left hemifield and 128 trials in which it was displayed in the right hemifield. For each hemifield, half of the trials were visual-only trials (eliciting only VEPs, as no tactile stimulation was applied) and the other half were visual-tactile trials (eliciting both VEPs and SEPs, as the visual array was accompanied by task-irrelevant tactile stimulation).

2.5. EEG recording and data analysis

Event-related potentials were recorded from 64 Ag/AgCl active electrodes mounted equidistantly on an elastic cap at standard locations of the international 10-10 system (M10 montage; EasyCap GmbH, Herrsching, Germany). Electrodes were referenced to the right mastoid and re-referenced off-line to the average of the left and the right mastoids (Nunez, 1981). To track horizontal eye movements, the horizontal electrooculogram (HEOG) was recorded by placing two electrodes about 1 cm lateral to the external canthi of each eye. Continuous EEG was recorded using a BrainAmp amplifier (Brain Products; amplifier bandpass 0.06–100 Hz) and a 500 Hz sampling rate. Off-line, EEG analysis was performed using Vision Analyzer software (Brain Products GmbH, Gilching, Germany). The data were digitally low-pass filtered at 30 Hz (Butterworth zero-phase filters). The EEG signal was epoched into segments lasting from 100 ms before to 500 ms after the onset of the sample arrays of each trial. Segments were then baseline corrected to the first 100 ms. Trials with incorrect behavioural responses or ocular (HEOG exceeding ± 60 µV) or myogenic (voltage exceeding ± 100 µV at any electrode relative to baseline) artifacts were excluded from the analysis. The electric signal of two participants was spline interpolated (order 4, degree 10; at one electrode midway between C1/CP1 and T8 for one participant, and electrode P8 for another). Because we investigated lateralized ERP components, it was important that participants kept their gaze on the fixation cross at the centre of the visual search array and did not make significant eye movements. For that reason, we computed the average HEOG amplitudes for left and right targets separately and rejected three participants whose HEOG left-right difference from 0 to 350 ms was larger than 3 µV (Luck, 2014). On average, 5% of trials per condition were removed due to incorrect responses and 33% of the data were removed because of artefacts (see Supplementary Table 1 for the total number of accepted trials included in the analyses after artefact rejection). Grand averages were computed separately for the tasks (posture and colour) and for visual-only and visual-tactile trials by averaging brain waveforms elicited at electrodes over the hemispheres contralateral and ipsilateral to the visual field side in which the target hand image was presented. ERPs elicited on non-target trials were not further analysed, as no lateralised targets were present.
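
As a rough guide for readers who want to reproduce a comparable preprocessing pipeline, here is a hypothetical sketch in MNE-Python; the file name, channel labels, and FIR filter are our assumptions and differ in detail from the Vision Analyzer pipeline reported above.

```python
import mne

# Hypothetical approximation of the reported preprocessing; names are illustrative.
raw = mne.io.read_raw_brainvision("subject01.vhdr", preload=True)
raw.set_channel_types({"HEOG_left": "eog", "HEOG_right": "eog"})
raw.set_eeg_reference(["M1", "M2"])       # re-reference to the averaged mastoids
raw.filter(l_freq=None, h_freq=30.0)      # 30 Hz low-pass (MNE default FIR; the paper used a Butterworth filter)

events, event_id = mne.events_from_annotations(raw)
epochs = mne.Epochs(
    raw, events, event_id=event_id,
    tmin=-0.1, tmax=0.5,                  # -100 to 500 ms around search-array onset
    baseline=(None, 0),                   # baseline-correct to the 100 ms pre-stimulus window
    reject=dict(eeg=100e-6, eog=60e-6),   # peak-to-peak thresholds approximating the reported criteria
    preload=True,
)
```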

For visual-only trials, grand averages were computed from the onset of the visual display. To confirm visual attentional selection mechanisms, N2pc activity was measured as the amplitude difference between occipito-parietal electrode sites contralateral and ipsilateral to the hemifield of the target in the 200-300 ms time window following the onset of the visual search array (Luck & Hillyard, 1994a and 1994b; Eimer, 1996; Luck & Kappenman, 2011; Woodman et al., 2009; Drew & Vogel, 2008). We expected comparable N2pc components for both tasks; hence, to ensure that this component was present in both tasks, separate statistical analyses were conducted. For these analyses, mean amplitudes were computed from visual-only trials at occipital electrode sites O1/O2 (corresponding to electrodes 44/42 of the M10 equidistant placement used in this study) and midway between P7/P8 and PO7/PO8 (electrodes 45/41), ipsi- and contralateral to the target side. Repeated-measures ANOVAs were conducted with factors Hemisphere (electrodes contralateral vs. ipsilateral to the target side) and Electrode site (O1/O2 vs. P7-PO7/P8-PO8). A main effect of Hemisphere would confirm a reliable hemispheric difference, that is, the presence of the N2pc.
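
The lateralized measure itself reduces to a contra-minus-ipsi mean amplitude in the 200-300 ms window; a minimal sketch, assuming grand averages stored as NumPy arrays and placeholder channel labels, is shown below.

```python
import numpy as np

def contra_minus_ipsi(erp, ch_names, times, contra_chs, ipsi_chs, tmin=0.200, tmax=0.300):
    """Mean contralateral-minus-ipsilateral amplitude in a time window (e.g., the N2pc).
    erp: array of shape (n_channels, n_times); times: array of latencies in seconds."""
    win = (times >= tmin) & (times <= tmax)
    contra = erp[[ch_names.index(c) for c in contra_chs]][:, win].mean()
    ipsi = erp[[ch_names.index(c) for c in ipsi_chs]][:, win].mean()
    return contra - ipsi

# Example call for a left-hemifield target (right-hemisphere sites are contralateral);
# the labels stand in for the O1/O2 and P7-PO7/P8-PO8 sites analysed here:
# n2pc = contra_minus_ipsi(erp, ch_names, times, ["O2", "PO8"], ["O1", "PO7"])
```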

For visual-tactile trials, grand averages were also computed from the onset of the visual display and concurrent task-irrelevant tactile stimulation, separately for each task. To investigate attentional modulation over SCx, somatosensory processing was dissociated from the visual carry-over activity elicited by the onset of the visual display. Mean voltage amplitudes of grand-averaged VEPs on visual-only trials were subtracted from the mean amplitudes of grand-averaged ERPs on visual-tactile trials that contained both somatosensory and visual-evoked activity (Galvez-Pol et al., 2018a; Sel et al., 2014). The underlying activity of SCx when attending to visually perceived targets was analysed over central and parietal electrode sites at C3/4, C5/6, CP3/4, and midway between CP5/CP6 and P5/P6 of the 10/20 system (corresponding to electrodes 17/11, 31/24, 16/12, 30/25 of the M10 montage used in this study). These electrodes were chosen because of their laterality over central and posterior sites where early somatosensory ERP components (P45, N80, P100, N140) are largest and tactile attentional modulations are commonly reported (e.g., Forster & Eimer, 2005; Jones & Forster, 2014; Forster et al., 2016). Importantly, while the VEPs elicited on visual-only trials were used to examine the brain activity of visual areas, the VEP-free SEPs were used to investigate brain activity in somatosensory cortices independent of carry-over visual effects. Lateralized effects of attention were computed as the difference between homologous central and parietal electrode sites, contralateral and ipsilateral to the hemifield of the target, in the same time window as the N2pc component (200 – 300 ms after stimulus onset). A repeated-measures ANOVA was conducted on VEP-free SEP mean amplitude values with factors Task (posture vs. colour), Hemisphere (electrodes contralateral vs. ipsilateral to the target side), ROI (parietal electrode pair vs. central electrode pair), and Electrode site (C3/4 vs. C5/6 vs. CP3/4 vs. CP5-P5/CP6-P6). A significant Hemisphere main effect would suggest a reliable lateralized effect over somatosensory areas. A significant Task by Hemisphere interaction would indicate that this lateralized effect is modulated depending on the level of body-relatedness the task conveyed. Specifically, we expected a lateralized attention effect to be present when participants were instructed to attend to hand postures but not when they were attending to the colour of the same hands. This result would suggest that visually perceived body-related information such as hand postures elicits a functionally distinct attention effect that involves somatosensory processing.
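
To make the structure of this analysis concrete, here is an illustrative Python sketch using statsmodels with simulated placeholder amplitudes; for brevity it keeps only the Task and Hemisphere factors (the reported analysis additionally included ROI and Electrode site), so it is a reduced illustration rather than the authors' actual computation.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
rows = []
for p in range(1, 21):                     # 20 participants, as in the study
    for task in ("posture", "colour"):
        for hemi in ("contra", "ipsi"):
            # Placeholder amplitudes: a contralateral negativity only in the posture task
            amp = rng.normal(-0.6 if (task == "posture" and hemi == "contra") else -0.1, 0.3)
            rows.append({"participant": p, "task": task, "hemisphere": hemi, "amplitude": amp})
df = pd.DataFrame(rows)

# Reduced Task x Hemisphere repeated-measures ANOVA on mean VEP-free SEP amplitudes (200-300 ms)
print(AnovaRM(df, depvar="amplitude", subject="participant",
              within=["task", "hemisphere"]).fit())
```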

Figure 1. Example trial, example of visual stimuli, schematic illustration of subtractive methodology using SEPs and VEPs, and behavioural data. (A) Illustration of experimental procedure for target-present trials. Participants performed visual tasks searching for a target hand defined by either colour or posture. The target feature was indicated at the beginning of each block. Half of the trials included task-irrelevant tactile stimulation delivered to both index fingertips (yellow triangles) at the onset of the visual search array. The dots highlighted in yellow on the mannequin’s head indicate the electrode sites included in the analyses of SEPs whereas the dots highlighted in pink indicate the sites analysed for VEPs. (B) Visual Stimuli. Hand images depict two postures (A & B) that varied in colour (Green & Blue). (C) Schema of the subtraction methodology employed to isolate SCx activity from visual carry-over effects (Sel et al., 2014; Galvez-Pol et al., 2018a). The visual-tactile condition (50% of trials) comprised VEPs elicited at the onset of the visual array and simultaneously elicited somatosensory evoked potentials (SEPs) by task-irrelevant tactile stimulation to the hands (left section). The visual-only condition (50% of trials) consisted of only VEPs (right section) elicited at the onset of the visual array. The subtraction illustrated at the bottom [visual-tactile condition]-[visual-only condition] allows dissociating SCx activity from the concurrent visual carry over effects. (D) Behavioral results for each task (colour task in dark grey and posture task in light grey). Left graph shows mean accuracy of correctly identified target presence; Right graph shows average response times (in ms) to identify whether a hand with a target feature was present. Grey dots are single participants’ mean accuracy / response time and violin plots represent the distribution density of these mean responses. Black dots are group mean responses with error bars representing SEM.

2.6. ERP signal-to-noise ratio

The nature of the ERP subtraction methodology used here raises the possibility that differences in the signal-to-noise ratio between visual-tactile and visual-only trial types could bias the results. To rule out this possibility, we examined the number of accepted trials in the visual-only and visual-tactile stimulation conditions separately for the posture and colour task. Paired t-tests showed no significant difference in the number of accepted trials between visual-tactile and visual-only trial types in either the colour (t19 = 1.16, p = .66, d = .10) or the posture task (t19 = 0.43, p = .26, d = .26). These analyses confirm that the signal-to-noise ratio did not differ between stimulation conditions and thus could not have biased the results of the later subtraction (i.e., visual-tactile minus visual-only) in either task.
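
A paired comparison of accepted trial counts of this kind could be computed as in the SciPy-based sketch below, with the per-participant counts left as placeholders; note that several conventions exist for Cohen's d with paired data, and the one used here is only an assumption.

```python
import numpy as np
from scipy.stats import ttest_rel

def paired_t_and_d(a, b):
    """Paired t-test plus a Cohen's d for paired samples
    (mean difference divided by the SD of the differences)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    t, p = ttest_rel(a, b)
    diff = a - b
    return t, p, diff.mean() / diff.std(ddof=1)

# Per-participant accepted trial counts (placeholders for the 20 participants):
# visual_tactile_counts = [...]; visual_only_counts = [...]
# t, p, d = paired_t_and_d(visual_tactile_counts, visual_only_counts)
```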

3. Results

In the following, we first report participants' behavioural performance measures in the colour and posture search tasks. Subsequently, we report the ERP analyses, beginning with the analysis of visual-only trials to confirm the presence of the marker of visual attentional selection, the N2pc, in both tasks. This is followed by analyses of VEP-free SEPs to investigate the involvement of SCx in attentional selection mechanisms in either task. Both ERP analysis sections are supplemented with further analyses exploring any additional attention effects beyond the time range of the N2pc component.

3.1. Behavioral performance

Accuracy and response speed for correct detection of target items are shown in Fig. 1D. Separate analyses were conducted for accuracy and response times (RTs) using repeated-measures ANOVAs with factors Task (colour versus posture) and Trial type (visual-tactile versus visual-only). Response outliers (> 2000 ms or < 200 ms) were excluded from the analysis (less than 1% of all trials). Statistical analyses of response accuracy showed no main effect of Task (F1, 19 = .13, p = .72, ηp2 = .007) or Trial type (F1, 19 = 3.42, p = .08, ηp2 = .15), and no interaction (F1, 19 = .01, p = .93, ηp2 < .001). Importantly, these results confirm that overall task difficulty was matched across tasks (colour: 96% correct, SD = 3%; posture: 95% correct, SD = 2%). Statistical analysis of RTs yielded a significant main effect of Task (F1, 19 = 8.43, p = .009, ηp2 = .31) with faster average response times in the colour (M = 679 ms, SD = 104 ms) than in the posture (M = 736 ms, SD = 97 ms) task. In addition, there was a significant main effect of Trial type (F1, 19 = 13.02, p = .002, ηp2 = .41), indicating that participants were on average faster when receiving tactile stimulation (M = 703 ms, SD = 92 ms) than when responding without tactile stimulation (M = 712 ms, SD = 90 ms). Importantly, there was no significant interaction between Task and Trial type (F1, 19 = 1.17, p = .29, ηp2 = .06), indicating that any effect of tactile stimulation in visual-tactile trials was consistent across the tasks and thus did not differentially influence visual search performance.
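
For concreteness, trial-level behavioural data could be reduced to the per-participant condition means entering these ANOVAs roughly as in the following pandas sketch; the column names are our assumptions, not the authors' actual variable names.

```python
import pandas as pd

def condition_means(trials: pd.DataFrame) -> pd.DataFrame:
    """Drop RT outliers (< 200 ms or > 2000 ms), keep correct responses only,
    and return the mean RT per participant, task, and trial type."""
    ok = trials[(trials["correct"] == 1)
                & (trials["rt_ms"] >= 200) & (trials["rt_ms"] <= 2000)]
    return (ok.groupby(["participant", "task", "trial_type"], as_index=False)["rt_ms"]
              .mean())

# trials = pd.DataFrame(columns=["participant", "task", "trial_type", "correct", "rt_ms"])
# rt_means = condition_means(trials)  # input to the Task x Trial type repeated-measures ANOVA
```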

3.2. Analyses of visual-only trials: attentional selection (i.e. N2pc) on visual-evoked potentials

We expected the presence of the N2pc over occipital electrode sites in both tasks in the 200-300 ms time window after the onset of the visual search array (Luck & Hillyard, 1994a,b; Woodman et al., 2009; Drew & Vogel, 2008; Eimer, 1996). The magnitude of the N2pc is quantified by contrasting mean amplitudes over the hemisphere contralateral and ipsilateral to the hemifield containing the target stimulus. Indeed, a lateralised attention effect with more negative ERP amplitudes elicited over the hemisphere contralateral compared to ipsilateral to the target location is present from around 200 ms until around 300 ms after the onset of the visual search display in both tasks (see Figure 2). To establish that N2pcs are reliably present in both tasks, mean amplitudes were submitted to repeated-measures ANOVAs separately for each task with factors Hemisphere (contralateral versus ipsilateral to target side) and Electrode site (P7-PO7 / P8-PO8 vs O1/O2).1


1 We also ran the same analysis including the factor Task (colour vs posture) and, as expected, did not find any main effect or interaction including task (all p ≥ 0.07) suggesting no significant amplitude differences between the N2pc in each of the tasks.

Significant main effects of Hemisphere for both tasks (colour: F1, 19 = 16.96, p < .001, ηp2 = 0.47; posture: F1, 19 = 52.58, p < .001, ηp2 = 0.74) confirm the presence of reliable N2pc components. In the posture task, there was also a significant interaction between Hemisphere and Electrode site (F1, 19 = 5.82, p < .03, ηp2 = 0.23), yet the effect of Hemisphere reached significance across both electrode sites (all p < .001). Furthermore, we directly contrasted the hemispheric difference (amplitude over the hemisphere contralateral minus ipsilateral to the target hemifield) in each task with a paired-sample t-test confirming similar magnitude of the N2pc components (t19 = 1.30, p = .21, d = .29). Overall, the present results suggest that both tasks exhibit a reliable N2pc component over visual areas, confirming engagement of rapid attentional mechanisms for the selection of target hand images defined by either colour or posture.

3.2.1. Exploratory analyses of visual-only trials: post-N2pc modulation of visual-evoked potentials

As seen in Fig. 2, visual-evoked activity in the posture task seems to exhibit a prolonged N2pc component. Therefore, we explored the neural response in the succeeding time window of 300-400 ms. Brain activity in the 300-400 ms and later time range has been associated with maintenance of the lateralized target in visual working memory for in-depth processing (e.g., Vogel & Machizawa, 2004; Dell’Acqua & Robitaille, 2006a, 2006b; Jolicoeur, Sessa, McCollough, Machizawa, & Vogel, 2007; Kiss, van Velzen, & Eimer, 2008). In the posture task, the effect of Hemisphere in the 300-400 ms window reached significance (F1, 19 = 6.55, p = .02, ηp2 = 0.26), whereas no such effect was present in the colour task (F1, 19 = 0.93, p = .35, ηp2 = 0.05). In addition, a paired-sample t-test comparing the lateralized effect between the colour and posture tasks was also significant (t19 = -2.08, p = .05, d = -.47), with a greater hemispheric difference in the posture task (M = 0.29 µV, SD = 0.5 µV) compared to the colour task (M = -0.23 µV, SD = 1.05 µV). These results suggest that a continuation of lateralized attention effects was present only when attending to a specific hand posture, but not when attending to hand colours.

Figure 2. Grand averaged VEPs and topographic maps on visual-only trials separate for the posture (top) and colour (bottom) task. (A) ERP waveforms show visual-evoked potentials (VEPs) contralateral and ipsilateral to the target side in response to the onset of the visual stimuli pooled across occipital electrode sites (midway between P7/P8 and PO7/PO8 and O1/O2) for the posture and colour task. (B) Topographic maps show amplitude differences at homologous electrodes over the hemisphere contra- and ipsilateral to the target side in the 200-300 ms time window.

3.3. Analyses of VEP-free SEPs: effect of attentional selection on somatosensory areas

To investigate whether directing attention to different features of the hand images leads to additional modulation of somatosensory activity in the critical 200 – 300 ms time window, we examined SEPs elicited by task-irrelevant tactile probes delivered concurrently with the visual display. Importantly, we isolated somatosensory processing from concomitant visual activity by subtracting the mean amplitude of purely visually evoked activity (VEPs elicited on visual-only trials) from the mean amplitude containing both visual and tactually probed somatosensory activity (VEPs and SEPs elicited on visual-tactile trials). This subtraction method makes it possible to examine visually driven processing of information in cortices other than visual areas, specifically over SCx (Galvez-Pol et al., 2018a; Sel et al., 2014). If attentional selection of visually depicted body-related information also recruits somatosensory processing, then the tactually probed SCx response should be differentially affected depending on whether people were instructed to discriminate body-related (posture) or non-body-related (colour) features of the hand images.

After subtraction of visual-evoked potentials, we first inspected the lateralized effect over the same occipital electrode sites that reached significance in the previous analyses of visual-only trials. No significant lateralized effect was present in either the colour (F1, 19 = 0.05, p = .82, ηp2 = .003) or the posture (F1, 19 = 1.09, p = .31, ηp2 = 0.05) task, indicating that visual activity was successfully subtracted.

We then proceeded to examine the isolated somatosensory ERP waveforms elicited on target trials over SCx in the hemispheres contralateral and ipsilateral to the (visual) target hemifield. An enhanced negativity over the hemisphere contralateral to the target side (starting from 200 ms and lasting until 300 ms) was present only when participants were searching for targets defined by a specific posture (Figure 3). To denote its more centro-parietal location (see topographic maps in Figure 3), this component was named N2 centro-parietal contralateral (N2cpc). In contrast, when participants were viewing the same hand stimuli but searching for a specific colour, purely somatosensory activity did not seem to exhibit any reliable lateralized attention effects. To confirm these informal observations, we ran a repeated-measures ANOVA with factors Task (colour vs. posture), Hemisphere (contralateral vs. ipsilateral), ROI (central vs. parietal), and Electrode site (C3/4 vs. C5/6 vs. CP3/4 vs. midway between CP5/CP6 and P5/P6), which showed a significant interaction between all factors (F1, 19 = 4.89, p = .04, ηp2 = .21). To follow up this interaction, separate analyses for each task were conducted. In the posture task, a significant effect of Hemisphere was present (F1, 19 = 6.94, p = .02, ηp2 = .27), confirming an enhanced negativity over the contralateral hemisphere. Crucially, such a lateralized effect was absent in the colour task (F1, 19 = .22, p = .64, ηp2 = .01). Although there was a significant interaction between Hemisphere, ROI, and Electrode in the colour task (F1, 19 = 6.89, p = .02, ηp2 = .27), a follow-up analysis across ROIs did not reveal any effects of Hemisphere (F1, 19 = 0.083, p = .777, ηp2 = .004 for central electrodes, and F1, 19 = 0.32, p = .58, ηp2 = .02 for parietal electrodes) nor any interactions with Hemisphere (F1, 19 = 2.80, p = .11, ηp2 = .13 and F1, 19 = 2.60, p = .12, ηp2 = .12, for central and parietal electrode pairs respectively). These analyses confirm that attentional selection modulates somatosensory activity only when searching for hand targets defined by bodily (posture) but not visual (colour) features. Therefore, the topography and neural signature of attentional selection processes reflect the type of information embedded in the percept (i.e., somatosensory cortex for the attentional selection of body-related information).

Figure 3. Grand averaged VEP-free SEPs and topographic maps generated by subtracting visual-only trials from visual-tactile trials. (A) ERP waveforms show potentials elicited at electrodes contralateral and ipsilateral to the target side at parietal and central electrode sites (C3/4, C5/6, CP3/4, and midway between CP5/CP6 and P5/P6), separately for the posture (top) and colour (bottom) task. (B) Topographies show VEP-free SEP amplitude differences between homologous electrode sites contra- minus ipsilateral to the target side in the 200-300 ms time window for each task.

3.3.1 Additional exploratory analyses of VEP-free SEPs: analyses prior to the N2cpc and link to behaviour

The grand average waveforms of somatosensory-evoked activity suggest additional earlier lateralized effects that unfold before the time range classically associated with the N2pc (see Figure 4). Therefore, we assessed the VEP-free SEP amplitude differences between contra- and ipsilateral activity at parietal and central electrode sites for additional time windows preceding the N2cpc (30-50 ms, 50-100 ms, 100-150 ms, and 150-200 ms after stimulus onset). For each of these time windows, an exploratory repeated-measures ANOVA with the same factors as the main analysis above was conducted, and p-values were Bonferroni-corrected for multiple comparisons. There was no effect of Task (all F1, 19 < 7.10, p > .08, ηp2 < .27) or Hemisphere (all F1, 19 < 4.81, p > .16, ηp2 < .20), nor any other main effects (ROI or Electrode) or interactions involving Task or Hemisphere that would indicate a lateralized attention effect in either task prior to the N2cpc.
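
The Bonferroni correction over the four exploratory windows could be applied as in the following illustrative snippet; the p-values below are placeholders, not the reported statistics.

```python
from statsmodels.stats.multitest import multipletests

windows = ["30-50 ms", "50-100 ms", "100-150 ms", "150-200 ms"]
raw_pvals = [0.16, 0.42, 0.08, 0.30]   # placeholder p-values for one effect across the four windows

reject, corrected, _, _ = multipletests(raw_pvals, alpha=0.05, method="bonferroni")
for w, p_raw, p_corr, rej in zip(windows, raw_pvals, corrected, reject):
    print(f"{w}: p = {p_raw:.2f}, Bonferroni-corrected p = {p_corr:.2f}, significant: {rej}")
```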

To investigate the relationship between ERP markers of selective attention and reaction times, we correlated N2cpc amplitudes and reaction times of correctly detected hand posture targets. This analysis did not reveal a significant correlation (r = .30, p = .20). Likewise, we did not find any significant correlation between the visual N2pc and reaction time to detect colour (r = .18, p = .46) or posture (r = .35, p = .13) targets.
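
An illustrative version of this correlation analysis, with simulated placeholder values standing in for the per-participant N2cpc amplitudes and reaction times, is sketched below.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
# Placeholder per-participant values (20 participants): mean N2cpc amplitude (µV)
# and mean RT (ms) for correctly detected posture targets.
n2cpc_amp = rng.normal(-0.5, 0.4, size=20)
rt_posture = rng.normal(736, 97, size=20)

r, p = pearsonr(n2cpc_amp, rt_posture)
print(f"r = {r:.2f}, p = {p:.2f}")
```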

4. Discussion

The aim of the present study was to examine the modulation of brain activity within visual and somatosensory areas during attentional search of visually perceived body images. Participants performed a visual search task in which they looked for target hand images defined by either visual non-body-related features (hand colours) or, in a separate task, visual body-related features (hand postures). Importantly, the stimuli were identical in both tasks and the level of body-relatedness was manipulated through instructions only (attend to the posture or attend to the colour). Yet, only attentional search for hand postures led to independent, visually driven modulation of somatosensory processing. In particular, searching for target postures induced an enhanced contralateral negativity in purely somatosensory-evoked activity, whereas no such modulation was present when searching for the target colour of the same hand images. Interestingly, the different tasks did not induce any significant differences in visual processing; N2pc components were present in visual-evoked activity for both tasks. These results suggest that in both tasks the target hands were processed to a high level within visual areas; however, only attentional selection of body-related information modulated activity within SCx. In other words, our results show a neural signature of attentional selection of body-related information beyond visual cortices (namely the visually driven N2cpc).

The subtraction of visual-only trials from visual-tactile trials allowed us to dissociate neural responses evoked in SCx during visual processing, over and above a potential carry-over of activity from VEPs. The subtractive method on which this work is based has been previously employed in studies examining multisensory integration (Dell’Acqua, Jolicoeur, Pesciarelli, Job, & Palomba, 2003; Teder-Sälejärvi, McDonald, Di Russo, & Hillyard, 2002; Bernasconi et al., 2018). For example, Bernasconi et al. (2018) employed a similar subtraction method with auditory and somatosensory evoked activity during ECoG recordings to identify the neural underpinnings of audio-tactile peripersonal space. However, only recently has it been used to show visually driven but visually independent SCx activity (Galvez-Pol et al., 2018a; Sel et al., 2014). Importantly, the method employed here, and in other related studies (Galvez-Pol et al., 2018a; Sel et al., 2014), is considerably different from the subtractive method used in multisensory integration studies. First, multisensory integration is assessed by computing the difference in cortical activity between brain responses to the combined condition (e.g., trials where auditory and tactile stimuli are presented simultaneously) and the sum of the unimodal conditions (e.g., audio-tactile – (audio-only + tactile-only)). This is substantially different from the subtraction performed in the present study, where the brain response elicited by solely visual stimuli is subtracted from the brain response elicited by the simultaneous presentation of visual and tactile stimuli (i.e., visual-tactile – visual-only). In the latter case, only one operation is used to isolate somatosensory evoked brain activity from concurrent visual activation during the attentional selection of target hand images. Second, in the present subtractive method the tactile stimulation per se does not provide any information about the visual input. Its sole purpose is to act as a probe revealing the processing of body-related information by somatosensory cortices that would otherwise be concealed by concurrent visual processing. This approach allowed us to infer that attentional selection of body images and the modulation of somatosensory activity are not mere carry-over effects from concomitant activation in visual cortices.

The main finding of this study is that attention mechanisms employed during visual search for body-related information operate by modulating sensory activity within somatosensory areas, which are specialised in processing bodily information. This effect was not found when participants searched for a visual feature (colour) embedded in the same images of hands. These findings can be explained in terms of the conceptual difference between processing body-related vs. non-body-related visual information (i.e., body-relatedness). Specifically, body-related information, such as visually perceived postures, conveys both information about its visual properties and information relevant to the body itself (Azañón & Haggard, 2009). In contrast, purely visual information, such as colour, does not generally reflect body-related information to the same extent. This process seems to be reflected in the fact that bodily percepts are mapped onto one's own internal body representation (Keysers, Kaas, & Gazzola, 2010; Niedenthal, 2007). Numerous neuroimaging and TMS studies have suggested that SCx retains a mental representation of one's own body and also participates in the perception of body stimuli (Urgesi, Calvo-Merino, Haggard, & Aglioti, 2007; Gazzola & Keysers, 2009; Tsakiris, 2010; Bolognini, Rossetti, Maravita, & Miniussi, 2011; Martuzzi, van der Zwaag, Farthouat, Gruetter, & Blanke, 2014). Interestingly, a recent electrophysiological study (Galvez-Pol et al., 2018a) that employed a subtractive methodology similar to the one used in the present study showed the involvement of somatosensory regions during active maintenance of visually perceived body-related information in working memory, highlighting the importance of our own internal body representation when perceiving others' bodies (Galvez-Pol et al., 2018b; Calvo-Merino et al., 2006; Urgesi et al., 2007; Sel et al., 2014). Crucially, our present study shows that attentional selection also modulates SCx when searching for visually perceived body-related information.

The finding that attention modulates activity in sensory areas other than visual ones is not new. Previous research has shown that attention influences activity in somatosensory cortices when searching for tactile targets (Forster et al., 2016; Ambron et al., 2018) as well as in auditory cortices when searching for auditory targets (Gamble & Luck, 2011; Gamble & Woldorff, 2014; Gamble & Woldorff, 2015). Although these studies showed modulation in modality-specific cortices, it has remained unclear whether the effects of attentional selection are defined solely by the sensory-input modality or also by the specific content of the target information. The present study suggests that attention may not merely modulate activity within modality-specific areas, but also in task-relevant ones, by showing visually driven but independent somatosensory modulation for bodily information. In other words, attentional selection acts on the sensory areas that usually process the content of the target percept, independent of the sensory channel that triggered the representation of that content. As a consequence, when the observer's goal is to extract body-related information (i.e., posture) as opposed to non-body-related information (i.e., colour), even if the visual percept itself is the same in both cases, attention acts on the sensory areas (i.e., SCx) that represent body-related information.

Whether attentional modulation of independent somatosensory activity facilitates attentional selection of body-related information remains unclear. In the present study, we did not find significant linear correlations between reaction times to detect bodily-defined targets and the extent of modulation over SCx (see exploratory analyses). In order to examine the functional significance of the N2cpc for attentional selection of bodily features, further studies could either investigate effects of familiarity with the body-related content on N2cpc amplitudes or directly modulate somatosensory processing (e.g., using TMS) to induce causal changes in behaviour (that is, in searching for targets defined by body-related features). The present study suggests that attentional selection is a dynamic process moderated by the nature of the information embedded in the visual percept. This is reflected in the modulation of somatosensory activity, which underpins the processing of bodily information (e.g., interacting with others' bodies and using one's own body) that can be found in the visual percept.

Interestingly, we observed a prolonged N2pc component when participants were searching for the target hand posture but not when they were searching for the target colour. Here we consider two potential explanations for this observation. First, a prolonged N2pc might be related to more variable response times to detect posture-defined compared to colour-defined targets. If reaction times were highly variable, the onset of the N2pc component presumably varied widely from trial to trial, yielding a broad component when the data are averaged together (Wolber & Wascher, 2003; Luck & Kappenman, 2011). In other words, the N2pc will appear to have a long duration when trials with different N2pc onset times are averaged together. However, despite slower reaction times in the posture task, the variability across participants was not higher for the posture compared to the colour task (SD = 97 ms versus SD = 104 ms, respectively). Second, the prolonged N2pc in the posture task might reflect the sustained posterior contralateral negativity (SPCN), which has been linked to the maintenance of an attended item in working memory (e.g., Vogel & Machizawa, 2004; Jolicoeur et al., 2006a, 2006b; McCollough et al., 2007; Kiss et al., 2008), as well as to discrimination difficulty in tasks with no direct memory component (SPCN amplitude increases with the difficulty of the discrimination) (Mazza, Turatto, Umiltà, & Eimer, 2007; Prime & Jolicoeur, 2010). However, in our study task demands were matched, as supported by overall similarly high accuracy levels. Furthermore, a recent study (Sessa et al., 2018) showed modulation of the SPCN when memorizing emotional faces, and that this modulation depended on the observer's own facial expression. Therefore, a more likely explanation for the prolonged N2pc in the posture task is memory processes associated with body-related stimuli. The prolonged N2pc in the posture task does not necessarily imply increased task difficulty but may rather reflect neuronal operations linked to body-related working memory that are distinct in the two tasks.

In the colour task, participants were required to indicate the presence of a certain colour tint in one of the hand images in the search display. One could instead employ visual cues such as skin colour or gender as a 'purely visual' equivalent to our posture task. However, such cues are also strongly related to one's body and could potentially recruit one's own body representation in somatosensory cortices similarly to hand postures. For that reason, the present study employed visual features that were completely dissociable from the body (i.e., blue/greenish colours) to clearly distinguish between visual and bodily features. We found that reaction times to detect colour-defined targets were shorter than those to detect posture-defined targets. Previous studies have shown that colour is a particularly salient pop-out feature that often grabs attention more rapidly than other features such as shape (e.g., Luck and Hillyard, 1994). Similar to our findings, Luck and Hillyard (1994) reported faster reaction times in a colour search task and no task differences in N2pc amplitudes, but differences in N2pc latencies. Further, Clark et al. (2015) specifically trained participants in a visual search task and found decreased reaction times as well as greater N2pc amplitudes. Therefore, future studies may elucidate the link between response times and N2pc amplitudes and latencies. Important to our study, N2pc amplitudes over visual areas were not significantly different across tasks, despite decreased reaction times to detect colour targets, suggesting the presence of a common visual attention mechanism in both tasks.

5. Conclusion

In line with previous studies of visual search, we show that the N2pc, a neural marker of attentional selection, is elicited when searching for hand images differing in colour or posture. Furthermore, the method employed here allowed us to tactually probe the state of the SCx and its changes in activity during visual search for hand images. The results showed that somatosensory activity is modulated only when attention is directed to bodily (posture), but not purely visual (colour), features of the same hand images. Specifically, focusing on hand postures results in an enhanced negativity over somatosensory regions in the hemisphere contralateral to the target location (N2cpc component), indicating selective modulation of SCx during attentional selection of body-related information. No such effect occurred when attention was directed to visual properties of the hands. These results may indicate a distinctive role of somatosensory cortices in attentional selection of body-related information, independent of initial visual processing. Taken together, the current study supports the notion that attention mechanisms operate depending on the nature of the extracted information, recruiting brain areas that usually represent functional properties of that information. Last, we believe that these results contribute to the embodiment and action perception frameworks by extending the role of SCx to attentional selection of body-related information, as well as to the attention field by enriching our understanding of the processing of multifaceted stimuli.

Author contributions

I.A., A.G.-P., B.C.-M., and B.F. designed the research; I.A. collected the data; I.A. and B.F. analysed and interpreted the data; I.A. and B.F. wrote drafts of the manuscript; and A.G.-P. and B.C.-M. provided critical comments on the paper.

Ethics

Human subjects: Ethical approval for methods and procedures was obtained from the City, University of London Psychology Department’s Research Ethics Committee. All participants provided written, informed consent.

Acknowledgments

The authors declare no competing financial interests. This research was supported by an Undergraduate Research Bursary from the Experimental Psychology Society (IA and BF) and a City, University of London, PhD scholarship (AG-P).

Supplementary data to this article can be found online at https://doi.org/10.1016/j.neuroimage.2019.03.037.

References

Ahlfors, S. P., Han, J., Lin, F.-H., Witzel, T., Belliveau, J. W., Hämäläinen, M. S., & Halgren, E. (2010). Cancellation of EEG and MEG signals generated by extended and distributed sources. Human Brain Mapping, 31(1), 140–149. http://dx.doi.org/10.1002/hbm.20851

Ambron, E., Mas-Casadesús, A., & Gherri, E. (2018). Hand distance modulates the electrophysiological correlates of target selection during a tactile search task. Psychophysiology, e13080. http://dx.doi.org/10.1111/psyp.13080

Azañón, E., & Haggard, P. (2009). Somatosensory processing and body representation. Cortex, 45(9), 1078–1084. https://doi.org/10.1016/j.cortex.2009.01.006

Bernasconi, F., Noel, J.P., Park, H.D., Faivre, N., Seeck, M., Spinelli, L., Schaller, K.,

Bichot, N. P., Rossi, A. F., & Desimone, R. (2005). Parallel and serial neural mechanisms for visual search in macaque area V4. Science, 308(5721), 529–534. http://dx.doi.org/10.1126/science.1109676

Bolognini, N., Rossetti, A., Maravita, A., & Miniussi, C. (2011). Seeing touch in the somatosensory cortex: a TMS study of the visual perception of touch. Human Brain Mapping, 32(12), 2104–2114. http://dx.doi.org/10.1002/hbm.21172

Calvo-Merino, B., Grèzes, J., Glaser, D. E., Passingham, R. E., & Haggard, P. (2006). Seeing or doing? Influence of visual and motor familiarity in action observation. Current Biology, 16(19), 1905–1910. http://dx.doi.org/10.1016/j.cub.2006.07.065

Clark, K., Appelbaum, L. G., van den Berg, B., Mitroff, S. R., & Woldorff, M. G. (2015). Improvement in visual search with practice: Mapping learning-related changes in neurocognitive stages of processing. Journal of Neuroscience, 35(13), 5351–5359. https://doi.org/10.1523/JNEUROSCI.1152-14.2015

Corbetta, M., & Shulman, G. L. (2002). Control of goal-directed and stimulus-driven attention in the brain. Nature Reviews Neuroscience, 3(3), 201. https://doi.org/10.1038/nrn755

Dell’Acqua, R., Jolicoeur, P., Pesciarelli, F., Job, R., & Palomba, D. (2003). Electrophysiological evidence of visual encoding deficits in a cross-modal attentional blink paradigm. Psychophysiology, 40(4), 629–639. https://doi.org/10.1111/1469-8986.00064

Downing, P. E., Bray, D., Rogers, J., & Childs, C. (2004). Bodies capture attention when nothing is expected. Cognition, 93(1), B27–B38. http://dx.doi.org/10.1016/j.cognition.2003.10.010

Drew, T., & Vogel, E. K. (2008). Neural measures of individual differences in selecting and tracking multiple moving objects. Journal of Neuroscience, 28(16), 4183–4191. http://dx.doi.org/10.1523/JNEUROSCI.0556-08.2008

Eimer, M. (1996). The N2pc component as an indicator of attentional selectivity. Electroencephalography and Clinical Neurophysiology, 99(3), 225–234. http://dx.doi.org/10.1016/0013-4694(96)95711-9

Eimer, M. (2014). The neural basis of attentional control in visual search. Trends in Cognitive Sciences, 18(10), 526–535. http://dx.doi.org/10.1016/j.tics.2014.05.005

Forster, B., & Eimer, M. (2005). Covert attention in touch: Behavioral and ERP evidence for costs and benefits. Psychophysiology, 42(2), 171–179. http://dx.doi.org/10.1111/j.1469-8986.2005.00268.x

Forster, B., Tziraki, M., & Jones, A. (2016). The attentive homunculus: ERP evidence for somatotopic allocation of attention in tactile search. Neuropsychologia, 84, 158–166. http://dx.doi.org/10.1016/j.neuropsychologia.2016.02.009

Galvez-Pol, A., Calvo-Merino, B., Capilla, A., & Forster, B. (2018a). Persistent recruitment of somatosensory cortex during active maintenance of hand images in working memory. Neuroimage, 174, 153–163. http://dx.doi.org/10.1016/j.neuroimage.2018.03.024

Galvez-Pol, A., Forster, B., & Calvo-Merino, B. (2018b). Modulation of motor cortex activity in a visual working memory task of hand images. Neuropsychologia, 117, 75–83. http://dx.doi.org/10.1016/j.neuropsychologia.2018.05.005

Gamble, M. L., & Luck, S. J. (2011). N2ac: An ERP component associated with the focusing of attention within an auditory scene. Psychophysiology, 48(8), 1057–1068. http://dx.doi.org/10.1111/j.1469-8986.2010.01172.x

Gamble, M. L., & Woldorff, M. G. (2015). Rapid context-based identification of target sounds in an auditory scene. Journal of Cognitive Neuroscience, 27(9), 1675–1684. http://dx.doi.org/10.1162/jocn_a_00814

Hickey, C., Di Lollo, V., & McDonald, J. J. (2009). Electrophysiological indices of target and distractor processing in visual search. Journal of Cognitive Neuroscience, 21(4), 760–775. http://dx.doi.org/10.1162/jocn.2009.21039

Hilimire, M. R., Mounts, J. R., Parks, N. A., & Corballis, P. M. (2011). Dynamics of target and distractor processing in visual search: Evidence from event-related brain potentials. Neuroscience Letters, 495(3), 196–200. http://dx.doi.org/10.1016/j.neulet.2011.03.064

Irimia, A., Van Horn, J. D., & Halgren, E. (2012). Source cancellation profiles of electroencephalography and magnetoencephalography. Neuroimage, 59(3), 2464–2474. http://dx.doi.org/10.1016/j.neuroimage.2011.08.104

Jolicoeur, P., Sessa, P., Dell’Acqua, R., & Robitaille, N. (2006a). Attentional control and capture in the attentional blink paradigm: Evidence from human electrophysiology. European Journal of Cognitive Psychology, 18(4), 560–578. https://doi.org/10.1080/09541440500423210

Jolicoeur, P., Sessa, P., Dell’Acqua, R., & Robitaille, N. (2006b). On the control of visual spatial attention: Evidence from human electrophysiology. Psychological Research, 70(6), 414–424. https://doi.org/10.1007/s00426-005-0008-4

Jones, A., & Forster, B. (2014). Neural correlates of endogenous attention, exogenous attention and inhibition of return in touch. European Journal of Neuroscience, 40(2), 2389–2398. http://dx.doi.org/10.1111/ejn.12583

Keysers, C., & Gazzola, V. (2009). Expanding the mirror: vicarious activity for actions, emotions, and sensations. Current Opinion in Neurobiology, 19(6), 666–671. http://dx.doi.org/10.1016/j.conb.2009.10.006

Keysers, C., Kaas, J. H., & Gazzola, V. (2010). Somatosensation in social perception. Nature Reviews Neuroscience, 11(6), 417. http://dx.doi.org/10.1038/nrn2833

Kiss, M., Van Velzen, J., & Eimer, M. (2008). The N2pc component and its links to attention shifts and spatially selective visual processing. Psychophysiology, 45(2), 240–249. http://dx.doi.org/10.1111/j.1469-8986.2007.00611.x

Luck, S. J. (2014). An introduction to the event-related potential technique. MIT Press.

Luck, S. J., & Hillyard, S. A. (1990). Electrophysiological evidence for parallel and serial processing during visual search. Perception & Psychophysics, 48(6), 603–617. http://dx.doi.org/10.3758/BF03211606

Luck, S. J., & Hillyard, S. A. (1994a). Electrophysiological correlates of feature analysis during visual search. Psychophysiology, 31(3), 291–308. http://dx.doi.org/10.1111/j.1469-8986.1994.tb02218.x

Luck, S. J., & Hillyard, S. A. (1994b). Spatial filtering during visual search: evidence from human electrophysiology. Journal of Experimental Psychology: Human Perception and Performance, 20(5), 1000. http://dx.doi.org/10.1037/0096-1523.20.5.1000

Luck, S. J., & Kappenman, E. S. (2011). The Oxford handbook of event-related potential components. Oxford University Press.

Luck, S. J., Woodman, G. F., & Vogel, E. K. (2000). Event-related potential studies of attention. Trends in Cognitive Sciences, 4(11), 432–440. http://dx.doi.org/10.1016/S1364-6613(00)01545-X

Martuzzi, R., van der Zwaag, W., Farthouat, J., Gruetter, R., & Blanke, O. (2014). Human finger somatotopy in areas 3b, 1, and 2: a 7T fMRI study using a natural stimulus. Human Brain Mapping, 35(1), 213–226. https://doi.org/10.1002/hbm.22172

Maunsell, J. H. R., & Treue, S. (2006). Feature-based attention in visual cortex. Trends in Neurosciences, 29(6), 317–322. https://doi.org/10.1016/j.tins.2006.04.001

Mazza, V., Turatto, M., & Caramazza, A. (2009). Attention selection, distractor suppression and N2pc. Cortex, 45(7), 879–890. http://dx.doi.org/10.1016/j.cortex.2008.10.009

Mazza, V., Turatto, M., Umiltà, C., & Eimer, M. (2007). Attentional selection and identification of visual objects are reflected by distinct electrophysiological responses. Experimental Brain Research, 181(3), 531–536. http://dx.doi.org/10.1007/s00221-007-1002-4

McCollough, A. W., Machizawa, M. G., & Vogel, E. K. (2007). Electrophysiological measures of maintaining representations in visual working memory. Cortex, 43(1), 77–94. http://dx.doi.org/10.1016/S0010-9452(08)70447-7

Niedenthal, P. M. (2007). Embodying emotion. Science, 316(5827), 1002–1005. http://dx.doi.org/10.1126/science.1136930

Nunez, P. L. (1981). A study of origins of the time dependencies of scalp EEG: I-theoretical basis. IEEE Transactions on Biomedical Engineering, (3), 271–280. http://dx.doi.org/10.1109/TBME.1981.324700

Prime, D. J., & Jolicoeur, P. (2010). Mental rotation requires visual short-term memory: Evidence from human electric cortical activity. Journal of Cognitive Neuroscience, 22(11), 2437–2446. http://dx.doi.org/10.1162/jocn.2009.21337

Sel, A., Forster, B., & Calvo-Merino, B. (2014). The emotional homunculus: ERP evidence for independent somatosensory responses during facial emotional processing. Journal of Neuroscience, 34(9), 3263–3267. http://dx.doi.org/10.1523/JNEUROSCI.0106-13.2014

Sessa, P., Schiano Lomoriello, A., & Luria, R. (2018). Neural measures of the causal role of observers' facial mimicry on visual working memory for facial expressions. Social Cognitive and Affective Neuroscience, 13(12), 1281–1291. https://doi.org/10.1093/scan/nsy095

Stein, T., Sterzer, P., & Peelen, M. V. (2012). Privileged detection of conspecifics: Evidence from inversion effects during continuous flash suppression. Cognition, 125(1), 64–79. http://dx.doi.org/10.1016/j.cognition.2012.06.005

Teder-Sälejärvi, W. A., McDonald, J. J., Di Russo, F., & Hillyard, S. A. (2002). An analysis of audio-visual crossmodal integration by means of event-related potential (ERP) recordings. Cognitive Brain Research, 14(1), 106–114. https://doi.org/10.1016/S0926-6410(02)00065-4

Tsakiris, M. (2010). My body in the brain: a neurocognitive model of body-ownership. Neuropsychologia, 48(3), 703–712. http://dx.doi.org/10.1016/j.neuropsychologia.2009.09.034

Urgesi, C., Calvo-Merino, B., Haggard, P., & Aglioti, S. M. (2007). Transcranial magnetic stimulation reveals two cortical pathways for visual body processing. Journal of Neuroscience, 27(30), 8023–8030. http://dx.doi.org/10.1523/JNEUROSCI.0789-07.2007

Vogel, E. K., & Machizawa, M. G. (2004). Neural activity predicts individual differences in visual working memory capacity. Nature, 428(6984), 748. http://dx.doi.org/10.1038/nature02447

Wolber, M., & Wascher, E. (2003). Visual search strategies are indexed by event-related lateralizations of the EEG. Biological Psychology, 63(1), 79–100.

Woldorff, M. G., Hazlett, C. J., Fichtenholtz, H. M., Weissman, D. H., Dale, A. M., & Song, A. W. (2004). Functional parcellation of attentional control regions of the brain. Journal of Cognitive Neuroscience, 16(1), 149–165. http://dx.doi.org/10.1162/089892904322755638

Woodman, G. F., Arita, J. T., & Luck, S. J. (2009). A cuing study of the N2pc component: An index of attentional deployment to objects rather than spatial locations. Brain Research, 1297, 101–111. http://dx.doi.org/10.1016/j.brainres.2009.08.011

Woodman, G. F., & Luck, S. J. (1999). Electrophysiological measurement of rapid shifts of attention during visual search. Nature, 400(6747), 867. http://dx.doi.org/10.1038/23698

Woodman, G. F., & Luck, S. J. (2003). Serial deployment of attention during visual search. Journal of Experimental Psychology: Human Perception and Performance, 29(1), 121. http://dx.doi.org/10.1037/0096-1523.29.1.121

Yantis, S., & Serences, J. T. (2003). Cortical mechanisms of space-based and object-based attentional control. Current Opinion in Neurobiology, 13(2), 187–193. http://dx.doi.org/10.1016/S0959-4388(03)00033-3