
Journal of Computational Methods in Sciences and Engineering 11 (2011) 1–17, DOI 10.3233/JCM-2011-0367, IOS Press

Bispectral EEG analysis for knowledge scaffolding in Music Perception: A Mirror Neurons-based approach

Panagiotis Doulgeris, Stelios Hadjidimitriou, Konstantinos Panoulas, Leontios Hadjileontiadis∗ and Stavros Panas
Aristotle University of Thessaloniki, Faculty of Technology, Department of Electrical and Computer Engineering, GR – 541 24, Thessaloniki, Greece

Abstract. ‘Motion’ and intention that are involved in the perception of musical structure, combined with mirror neuron (MN) system activation, are studied in this article. The mechanism of MNs involved in the perception of musical structures is seen as a means for cueing the learner on ‘known’ factors that can be used for his/her knowledge scaffolding. To explore such relationships, EEG recordings, and especially the Mu-rhythm in the premotor cortex that relates to the activation of MNs, were acquired and explored. Three experiments were designed to provide the auditory and visual stimuli to a group of subjects, including both musicians and non-musicians. The acquired signals, after appropriate averaging in the time domain, were analyzed in the frequency and bifrequency domains, using spectral and bispectral analysis, respectively. Experimental results have shown that an intention-related activity shown in musicians could be associated with Mu-rhythm suppression. Moreover, an underlying ongoing function appearing in the transition from heard sound to imagined sound could be revealed in the bispectrum domain. A Mu-rhythm modulation provoked by the MNs could cause bispectral fluctuations, especially when visual stimulation is combined with an auditory one for the case of musicians. These results pave the way for further exploitation of the role of MNs in music and, in general, knowledge perception.

Keywords: Music, motion, intention, mirror neurons, EEG, Mu-rhythm, spectrum, bispectrum

1. Introduction

Ludwig van Beethoven is perhaps the most famous example of a musician with hearing loss. Suffering from a severe form of tinnitus, he became totally deaf around 1814, at the age of 44. One of his hearing aids was a special rod attached to the soundboard of the piano that he could bite, so the vibrations would then transfer from the piano to his jaw to increase his perception of the sound (Maynard Solomon's biography). This example indicates that underlying features of musical sounds play a major role in music perception. Music consists of sounds, but not all sounds are music. It has been suggested that music, like language, involves an intimate coupling between the perception and production of hierarchically organized sequential information, the structure of which has the ability to communicate meaning [1]. Musical structure consists of basic elements that are combined in patterns whose performance and perception are governed by combinatorial rules or a sort of musical grammar [1].

∗Correspondence to: Dr. Leontios Hadjileontiadis, Signal Processing and Biomedical Technology Unit, Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, University Campus, GR – 541 24, Greece. Tel.: +30 2310 996340; Fax: +30 2310 996312; E-mail: [email protected].

1472-7978/11/$27.50 © 2011 – IOS Press and the authors. All rights reserved


Auditory features of the musical signal are primarily processed in the superior temporal gyrus (auditory cortex). However, the processing of structural features activates regions of the human brain that have been related to the perception of semantic and motor tasks. A mechanism involved in the structural analysis of communicative signals, like music, is the mirror neuron (MN) system.

Two studies in 2000 and 2002, using EEG analysis and functional magnetic resonance imaging (fMRI) analysis respectively, revealed that areas of the brain that were directly linked to linguistic tasks are also involved in the processing of musical structures [2,3]. In these studies, non-musicians were presented with chord sequences containing unexpected events and key modulations. The activation shown during the delicate key transitions supports the hypothesis that the human brain has a general ability for a highly differentiated processing of music. In 2001, a magnetoencephalogram-based study suggesting that musical syntax is processed in Broca's area, a region of the human brain involved in language perception, was presented [4]. Moreover, in 2006, a positron emission tomography-based study supported the idea that there are common regions in the human brain for language and music processing, such as Brodmann area 44 [5], whereas two studies that used fMRI analysis revealed shared networks for auditory and motor processing [6,7]. Furthermore, in 2006, a model of the integration of motor commands with their associated perceived sounds in vocal production was proposed. This model was based on neurobiologically plausible principles, i.e., receptive fields responding to a subset of all possible stimuli, population-coded representations of auditory stimuli and motor commands, and a simple Hebbian-based weight update [8].

The activation of the MN system in response to music stimulation that is targeted to ‘motion’ and intention is evaluated in this work. These qualities of music are inherent in musical performance and creation and could be seen as basic elements for music knowledge scaffolding. For instance, let us imagine tones ascending or descending on a music score (‘motion’) or a musician trying to play the next note during the performance of a well-known musical piece (intention) (see Fig. 1). ‘Motion’ is conveyed by the augmenting or diminishing pitch, while intention underlies every communicative signal with a hierarchical structure. The activation of the MN system from music tasks targeted to specific structural features of musical creation and performance has not yet been monitored using electroencephalogram (EEG) analysis. Here, fluctuations of the Mu-rhythm, related to the activation of MNs, were explored using EEG recordings and spectral/bispectral analysis.

2. Background

2.1. Mirror neuron system

The MN system consists of a whole network of neurons and was initially discovered in macaque monkeys, in the ventral premotor cortex (probably the equivalent of the inferior frontal gyrus in humans) and in the anterior inferior parietal lobule [9]. These neurons are active when the monkeys perform certain tasks, but they also fire when the monkeys watch or hear someone else perform the same specific task. In humans, brain activity consistent with MNs has been found in the premotor cortex (inferior frontal cortex) and the inferior parietal cortex [10]. Cognitive functions, like imitation and the understanding of intentions, have been linked to the activation of the MN system [11–13].

As far as music is concerned, a number of studies suggest the implication of Brodmann area 44 in music cognition [2,3]. Brodmann area 44 is situated just anterior to the premotor cortex (mirror neuron system) [10]. Together with Brodmann area 45 it comprises Broca's area, a region involved in the processing of hierarchical structures that are inherent in communicative signals, like language and action [14].


Fig. 1. (A) Vectors present the underlying motion conveyed by tones ascending and descending on a music notation scheme. (B) A musician intending to play the next note during the performance of a musical piece.

Moreover, auditory features of the musical signal, which are processed primarily in the primary auditory cortex (superior temporal gyrus), are combined with structural features of the ‘motion’ information conveyed by the musical signal in the posterior inferior frontal gyrus and adjacent premotor cortex (mirror neuron system) [1,10]. Certain neuroimaging evidence suggests that frontoparietal motor-related regions, including, prominently, the posterior inferior frontal gyrus (Broca's region), as well as the posterior middle premotor cortex, were active during passive listening to music pieces by professional musicians [6,7]. Figure 2 displays the regions of the brain related to music perception.

The activation of MNs can be detected using EEG analysis. Mu-rhythm could reflect visuomotor integrative processes, and would ‘translate seeing and hearing into doing’. Mu-rhythm is alpha-range activity (8–12 Hz) that is seen over the sensorimotor cortex. The fluctuation of the Mu-rhythm during the observation of a motor action (suppression of Mu-rhythm) is highly similar to the one seen during the direct performance of the action by the individual (greater suppression of Mu-rhythm) [15]. The suppression is due to the desynchronization of the underlying cell assemblies, reflecting an increased load in the related group of neurons [16]. This motor resonance mechanism, witnessed by a Mu-rhythm modulation, is provoked by the MNs.

2.2. Music structure

Hearing provides us with sensory information arising from our environment. As with any other form of information, the human brain tends to focus on the pieces of greater interest and extract messages.


Fig. 2. Regions of the human brain related to the mirror neuron system (premotor cortex) and semantic tasks (Broca's area). Position of the electrodes (C3 and its homologue C4) over the sensorimotor cortex and an example of the acquired signals (Mu-rhythm) across ten trials during Experiment 1, along with the resulting averaged signal.

In this way, a communication port is created for the human brain, and these ‘messages’ are inherent in different qualities of sound. For example, a ‘happy’ (major) chord differs from a ‘sad’ (minor) one just in the pitch of one note (from a major third to a minor third). Pitch underlies sound perception, and it is only the structure of the human auditory system (cochlea) that allows us to have a clear view of the frequencies present in a sound. However, this is not the only quality of music that we can perceive. Seven basic elements of music can be defined: pitch, rhythm, tempo, timbre, loudness, tone, and the spatial qualities of sounds [17]. While pitch is obviously defined as the dominant frequency of a tone, another element of music, timbre, is also frequency related. The timbre of a sound depends on the different harmonics present in the tone. Moving on to the time domain, rhythm and tempo are strongly related.


Table 1. Anthropometric characteristics of the subjects, their musical skills, and the related experiments

Subject No.   Sex      Age   Musical skills            Experiment No.
1             Male     24    Musician (Intermediate)   1, 2, 3
2             Male     23    Musician (Intermediate)   1, 2
3             Male     24    Musician (Intermediate)   1, 2, 3
4             Female   23    Musician (Beginner)       1, 2, 3
5             Male     24    Musician (Beginner)       2
6             Female   22    Non-Musician              1, 2, 3
7             Male     23    Non-Musician              1, 2, 3

Furthermore, over the last centuries, a solid framework, often referred to as the harmony of music, has been advanced. The harmonic progression of music describes the sequence of chords (tones heard simultaneously) that can give meaning (communicate a message) to the listener. Cadences are one type of harmonic progression that is often used at the end of sections to settle the thought. Two common cadences are the plagal (IV to I) and the authentic (V to I). The framework of music theory and harmony is only a guideline that most composers of contemporary music do not follow. Its presence in global music, though, is so widespread that these harmonic progressions can be recognized by the common listener.

Another aspect of music that is more familiar to people is style. Style is defined partly by the instruments (the timbre) participating and partly by the rhythm and patterns followed during the performance of a musical piece. Apart from the aforementioned elements, several combinations of these elements seem to affect different groups of people, or people in different ways, thus leading to the definition of a variety of styles.

3. Material and methods

3.1. Material

3.1.1. Subjects

Seven healthy subjects participated in the experiments. They were divided into musicians and non-musicians. Table 1 presents the age, sex and musical skills of each subject. Subjects that had more than 5 years of musical training are described as intermediates, whereas the rest are described as beginners. The last column refers to which experiments the subjects participated in.

3.1.2. EEG acquisition device

EEG recordings were conducted using the g.MOBIlab portable biosignal acquisition system (4 EEG channels, bipolar; Filters: 0.5–30 Hz; Sensitivity: 100 µV; Data acquisition: A/D converter with 16-bit resolution and sampling frequency of 256 Hz; Data transfer: wireless, Bluetooth ‘Class I’ technology; meets IEC 60601-1; for research application, no medical use) (g.tec medical and electrical engineering, Guger Technologies). An example of the acquired EEG signal is shown in Fig. 2.

3.1.3. Interface

The experiments were designed and conducted with Max/MSP 4.5 software (Cycling ’74) on a PC (Pentium 3.2 GHz, RAM 1 GB). In order to precisely synchronize the g.MOBIlab device with the Max/MSP software, an external object for Max/MSP was created in C++, using the g.MOBIlab API.


Fig. 3. (A) Experiments interface in Max/MSP. (B) System configuration.

Thus, we were able to open and close the device, start the acquisition and store the acquired data in text files (.txt) through Max/MSP. No visualization of the acquired signal was available during the experiments. Separate interfaces were designed for each experiment, providing the essential auditory and visual stimulation. Figure 3(A) shows the designed interface, whereas Fig. 3(B) depicts the system configuration.

3.2. Methods

3.2.1. Experimental design

Three experiments were designed. In particular:

Experiment 1 (Intention): This experiment consisted of seven consecutive instances. During the first two instances no tone was heard (relax state). At the beginning of the third, fourth, fifth and seventh instance, a tone (A4–440 Hz) was provided to the subjects; the tone was missing in the sixth instance. The importance of the last tone relies on the fact that the subjects should conceive the true ending of the sequence of tones. The time interval between the instances was set at 2 sec and the tone duration was 400 msec. The subjects had their eyes closed. We expected the subjects to conceive the tempo and imagine the missing tone at the beginning of the sixth instance.

Experiment 2 (Intention – Harmonic Analysis): This experiment consisted of four acoustic blocks. Each block analyzed the same four-voice major chord into its four notes, starting with the sound of the chord itself. During the chord analysis, each block – apart from the first – omitted one note, different for each block, but the same for each trial. The time interval between the instances was set at 1.5 sec and the tone duration was 500 msec. The major chord chosen was G+ starting with the G4 (392 Hz) note. The subjects had their eyes closed. The subjects, musicians especially, were expected to conceive the sequence of notes and the underlying intention in order to imagine the omitted ones.

Experiment 3 (‘Motion’): This experiment consisted of six consecutive instances. During the first two instances no tone was heard (relax state). At the beginning of the following four instances, a tone of augmenting pitch (‘motion’) each time (E4–329.6 Hz, F4–349.2 Hz, G4–392 Hz and A4–440 Hz) was heard (auditory stimulus). The subjects could also see the corresponding notes on a music notation scheme on a computer screen (visual stimulus). The time interval between the instances was set at 2 sec and the tone duration was 400 msec. A series of trials was conducted by providing the subjects with auditory and visual stimuli and another series was conducted by providing the subjects with the auditory stimulus only. The subjects had their eyes open during all trials. Subjects were expected to conceive the ‘motion’ in both series.

A graphical representation of the aforementioned experiments is shown in Fig. 4.
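As an illustration, the auditory part of such a sequence (Experiment 3 is taken as the example) could be synthesized offline as in the Python sketch below; the actual stimuli were generated and timed within Max/MSP, so the sampling rate, amplitude, onset envelope and variable names here are assumptions rather than the authors' patch.

```python
# Illustrative sketch only (the original stimuli were produced in Max/MSP):
# the Experiment 3 sequence as one audio buffer, with a 400-ms pure tone at
# the onset of each 2-s instance and two silent (relax) instances first.
import numpy as np

FS = 44100                                   # assumed audio sampling rate (Hz)
INSTANCE_S = 2.0                             # interval between instances (s)
TONE_S = 0.4                                 # tone duration (s)
PITCHES_HZ = [None, None,                    # relax instances (no tone)
              329.6, 349.2, 392.0, 440.0]    # E4, F4, G4, A4 ('motion')

def pure_tone(freq_hz, dur_s, fs=FS, amp=0.5):
    """Sine tone with 10-ms raised-cosine ramps to avoid onset clicks."""
    t = np.arange(int(dur_s * fs)) / fs
    tone = amp * np.sin(2 * np.pi * freq_hz * t)
    ramp = int(0.01 * fs)
    env = np.ones_like(tone)
    env[:ramp] = 0.5 * (1 - np.cos(np.pi * np.arange(ramp) / ramp))
    env[-ramp:] = env[:ramp][::-1]
    return tone * env

def build_sequence(pitches=PITCHES_HZ, fs=FS):
    """Concatenate the instances; a tone (if any) is placed at each onset."""
    out = []
    for f in pitches:
        instance = np.zeros(int(INSTANCE_S * fs))
        if f is not None:
            tone = pure_tone(f, TONE_S, fs)
            instance[:tone.size] = tone
        out.append(instance)
    return np.concatenate(out)

stimulus = build_sequence()                  # ~12-s mono buffer for playback
```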

3.2.2. EEG recordings

The EEG recordings were conducted according to the 10/20 international system of electrode placement. The subjects were wearing the EEG cap provided with the g.MOBIlab device. One bipolar channel was used and the electrodes were placed at the C4 position (if the subject was left-handed) or at the C3 position (if the subject was right-handed) [6], where sensorimotor activity would most likely be presented [16] (see Fig. 2). The subjects sat still during all trials. The acoustic stimulus was provided to the subjects by headphones (Firstline Hi-Fi Stereo Headphones FH91) and the visual stimulation in Experiment 3 could be seen on the computer screen. A series of trials (see below) per subject and per experiment was conducted (see Fig. 2).

3.2.3. Data analysis

Off-line data analyses were carried out with MatLab 7 (Mathworks). A bandpass filter (Butterworth IIR, 6th order, lower cut-off frequency 8 Hz and upper cut-off frequency 12 Hz) was designed in order to isolate the alpha range and, at the same time, the Mu-rhythm. The acquired EEG signals per subject and per experiment were synchronized and averaged across all trials in the time domain in order to discard random phenomena, or other artefacts, and produce the underlying evoked potential for each case. The latter was then analyzed using spectral and higher-order spectral analyses, as described below.
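A minimal sketch of this preprocessing stage is given below. The original analysis was implemented in MatLab 7, so the Python/SciPy version, the zero-phase filtering choice, and the assumed text-file layout of the stored trials are illustrative only.

```python
# Sketch of the preprocessing described above (assumptions: trials stored one
# per row in a text file; zero-phase filtering; the authors' MatLab settings
# may differ in detail).
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256.0                                   # g.MOBIlab sampling rate (Hz)

def mu_bandpass(x, fs=FS, low=8.0, high=12.0, order=3):
    """Butterworth IIR band-pass isolating the Mu-rhythm (8-12 Hz).
    For btype='band', SciPy doubles the order, so order=3 gives a 6th-order filter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)                 # zero-phase forward-backward filtering

def averaged_evoked(trials, fs=FS):
    """Band-pass each synchronized trial and average across trials in time,
    discarding trial-random activity and keeping the evoked component."""
    filtered = np.vstack([mu_bandpass(np.asarray(tr, float), fs) for tr in trials])
    return filtered.mean(axis=0)

# usage with a hypothetical file produced by the Max/MSP logger:
# trials = np.loadtxt("subject1_exp1_trials.txt")    # one trial per row
# evoked = averaged_evoked(trials)
```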

3.2.3.1. Spectral analysis

The filtered signals were scaled and then segmented using a time window into several parts according to the experiment's design. The process of segmentation of the signals and the discrimination of the different states, for each experiment, is described below.

Experiment 1: Ten trials were conducted for each subject. The filtered signal from each trial was segmented using a 2 sec time window into seven parts corresponding to each of the seven instances of the experiment. Three states were distinguished: the relax state (first and second instances), the state of auditory stimulation (third, fourth, fifth and seventh instances) and the state during which the subject imagined the missing tone (sixth instance).

Experiment 2: Five trials were conducted for each subject. The filtered signal from each trial was segmented using a 1500 msec time window into 20 parts corresponding to each of the 20 instances. Three states were distinguished: the state in which the subjects listened to the chord, the state in which the subjects listened to the notes and the state during which the subject imagined the missing notes.

Experiment 3: Five trials with acoustic and visual stimulation and five trials with acoustic and no visual stimulation were conducted for each subject. The filtered signal from each trial was segmented using a 2 sec time window into six parts corresponding to each of the six instances. Two states were distinguished in both cases (acoustic and visual stimulus; acoustic stimulus and no visual stimulation): the relax state (first and second instances) and the state of auditory stimulation (third, fourth, fifth and sixth instances).


Fig. 4. Schematic representation of the experimental design: (A) Experiment 1 (Intention). (B) Experiment 2 (Intention – Harmonic Analysis). (C) Experiment 3 (‘Motion’).


The power spectral density (based on the Fast Fourier Transform) and the power of the signal were estimated for each part. The mean value and the standard deviation of the Mu-rhythm power corresponding to the different states of each experiment, for all trials, were estimated for each subject.
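The sketch below illustrates this step for Experiment 1, assuming the averaged, Mu-band-filtered signal from the previous stage; the instance-to-state mapping and the simple periodogram-based band power are stated assumptions rather than the authors' exact MatLab routine.

```python
# Illustrative spectral feature extraction: cut the averaged signal into the
# instance windows and take an FFT-based Mu-band power per window, then pool
# the windows belonging to each state (Experiment 1 layout assumed).
import numpy as np

FS = 256

def segment(signal, win_s, n_parts, fs=FS):
    """Split the signal into n_parts consecutive windows of win_s seconds."""
    win = int(win_s * fs)
    return [signal[i * win:(i + 1) * win] for i in range(n_parts)]

def mu_band_power(x, fs=FS, band=(8.0, 12.0)):
    """Mean power in the Mu band from a periodogram of one window."""
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * x.size)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    return psd[sel].mean()

# Experiment 1: instances 1-2 relax, 3-5 and 7 auditory, 6 imagined (0-based below)
STATES_EXP1 = {"relax": [0, 1], "auditory": [2, 3, 4, 6], "imagined": [5]}

def state_power(signal, states=STATES_EXP1, win_s=2.0, n_parts=7):
    """Mean and standard deviation of the Mu power for each state."""
    powers = np.array([mu_band_power(p) for p in segment(signal, win_s, n_parts)])
    return {name: (powers[idx].mean(), powers[idx].std())
            for name, idx in states.items()}
```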

3.2.3.2. Bispectral analysis

The averaged and filtered EEG data were segmented using a time window, as described in the spectral analysis section. Third-order statistical analysis was conducted and the bispectrum corresponding to each part was estimated. Definitions of the third-order statistics and the conventional bispectrum are given below [18,19].

Considering a real process {X(k)} that is third-order stationary and has zero mean, we can define its third-order moment sequence R(m,n) as follows:

R(m,n) ≡ E{X(k)X(k + m)X(k + n)},   (1)

where E{·} denotes the expectation value. Since the process {X(k)} has zero mean (i.e., E{X(k)} = 0), the third-order cumulant sequence C(m,n) becomes identical to the third-order moment sequence R(m,n). The respective bispectrum can be defined as the double Fourier transform of R(m,n), i.e.:

B(ω1, ω2) = Σ_{m=−∞}^{+∞} Σ_{n=−∞}^{+∞} R(m,n) exp{−j(ω1m + ω2n)},   (2)

for |ω1|, |ω2| ≤ π. Due to the 2π periodicity of B(ω1, ω2) in ω1 and ω2, knowledge of the bispectrum in the triangular region ω2 ≥ 0, ω1 ≥ ω2, ω1 + ω2 ≤ π is enough for a complete description of the bispectrum [18]. In general, B(ω1, ω2) is complex and a sufficient condition for its existence is that R(m,n) is absolutely summable. The symmetry properties of R(m,n) lead to a calculation of R(m,n) in an infinite triangle, bounded by the lines m ≥ 0 and m = n. Knowledge of the third-order moments at points on this triangle is enough for a complete description of the bispectrum. Due to the infinite area of this triangle, we have to choose an upper limit (L3) in order to estimate a third-order sequence of {X(k)} of finite length and, afterwards, the corresponding bispectrum. Thus, Eq. (2) can be modified as follows:

B(ω1, ω2) = Σ_{m=−∞}^{+∞} Σ_{n=−∞}^{+∞} R(m,n) w(m∆3, n∆3) exp{−j(ω1m + ω2n)},   (3)

where w(m∆3, n∆3) is a continuous window function of bounded support and ∆3 is the bandwidth, usually taken as ∆3 = 1/L3 [18]. Note that |m|, |n| ≤ L3.
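A compact sketch of this estimator, written directly from Eqs. (1)–(3), is given below; the lag limit L3, the uniform lag window, and the FFT grid size are illustrative choices and not necessarily those used in the study (which was carried out in MatLab).

```python
# Direct bispectrum estimate per Eqs. (1)-(3): biased third-order cumulant of a
# zero-mean segment, a lag window of bounded support, and a 2-D DFT. Only the
# magnitude is inspected here; the lag-centering of the array adds a linear
# phase term that does not affect |B(w1, w2)|.
import numpy as np

def third_order_cumulant(x, max_lag):
    """Biased estimate of R(m,n) = E{x(k) x(k+m) x(k+n)} for |m|, |n| <= max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                          # enforce the zero-mean assumption
    N = x.size
    R = np.zeros((2 * max_lag + 1, 2 * max_lag + 1))
    for m in range(-max_lag, max_lag + 1):
        for n in range(-max_lag, max_lag + 1):
            k = np.arange(max(0, -m, -n), min(N, N - m, N - n))
            R[m + max_lag, n + max_lag] = np.sum(x[k] * x[k + m] * x[k + n]) / N
    return R

def bispectrum(x, max_lag=32, n_freq=128):
    """Windowed 2-D DFT of the cumulant sequence, Eq. (3)."""
    R = third_order_cumulant(x, max_lag)
    w = np.ones_like(R)                       # uniform lag window (L3 = max_lag)
    B = np.fft.fftshift(np.fft.fft2(R * w, s=(n_freq, n_freq)))
    freqs = np.fft.fftshift(np.fft.fftfreq(n_freq))      # in cycles/sample
    return freqs, B

# usage: freqs, B = bispectrum(evoked_segment); plot np.abs(B) over (w1, w2)
```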

4. Results and discussion

The three features of the EEG recordings monitored in the present study are the spectrum, in terms of the signal power of each state, the transition of the bispectral frequency pairs, and the variance of the bispectral coefficients along the different instances of the experiments. Figure 5 displays, in the form of box plots, the average spectral power of the Mu-rhythm corresponding to the relax state, the state of auditory stimulation and the state during which the subject imagined the missing tone, for all subjects participating in Experiment 1. These results indicate a Mu-rhythm suppression during both the second and the third states.


Fig. 5. Experiment 1: Box plots of the estimated average spectral power at each state: Relax state (RS), Auditory stimulation (AS), state in which the subjects imagined the missing tone (Imagined tone).

Figure 6 shows an example from the bispectrum analysis of the data from the subjects participating in Experiment 1, for instances (steps) 5 and 6. According to Fig. 6, the following could be noted:

– A transition of high bispectral coefficients from frequency pairs located on the two axes towards frequency pairs located on the diagonal during the three heard tones, and

– a clear ‘echoing’ effect that appears in the transition from the third tone towards the missing one. As shown in Fig. 6, bispectral coefficients of the missing tone appear at the same frequency pairs as the ones of the last tone.

Figure 7 shows, in the form of box plots, the average spectral power of the Mu-rhythm corresponding to the chord state, the notes state and the state during which the subject imagined the missing tone, for all subjects participating in Experiment 2. As seen from Fig. 7, the average spectral power of the three states appears to be stable.

Figure 8 illustrates, in the form of box plots, the average spectral power of the Mu-rhythm corresponding to the relax state and the notes state, for all subjects participating in Experiment 3. From this figure, Mu-rhythm suppression is observed during the second state. It is noteworthy that results between musicians and non-musicians showed no statistical difference whatsoever.

In the case of the relax state versus the imagined tone during Experiment 1, Mu-rhythm suppression has been observed for all musicians. The non-musicians' responses were contradictory, as subject six showed no suppression, whereas subject seven did. Such suppression can be linked directly to mirror neuron system activation, suggesting an underlying procedure of musical intention. As far as bispectral analysis is concerned, the transition of the high bispectral coefficients from frequency pairs on the two axes towards frequency pairs located on the diagonal during the three heard tones shows a shift to non-linearity that appears at higher frequencies (i.e., from 10 to 20 Hz).


Fig. 6. Bispectral Analysis of the EEG signals from Experiment 1; only the primary area of the bispectrum is shown: Step 5 (Fifth instance) – Third heard tone; Step 6 (Sixth instance) – Missing tone; Transition of the bispectrum corresponding to the two aforementioned instances and the ‘echoing’ effect, for all subjects.

This non-linearity reveals a self-quadratic phase coupling between these harmonic pairs, implying the generation of harmonics at the sum and difference of the frequencies of the pair. However, this pattern of bispectral coefficients was very similar between the third heard tone and the missing one for all subjects, thus providing us with an ‘echoing’ effect during this transition. This suggests a continuous function, i.e., the perception of intention.

According to the box plots of Fig. 7 corresponding to Experiment 2, subjects one, two and four, who belong to the musicians' group, showed no fluctuation of the Mu-rhythm average spectral power at any state. On the contrary, in the cases of subjects three and five (also belonging to the same group), the Mu-rhythm average spectral power was higher for the third state (imagined notes). Non-musicians also displayed contradictory results: in the case of subject six the Mu-rhythm power was attenuated for the third state, whereas in the case of subject seven the equivalent spectral power was higher. Consequently, no safe conclusion can be drawn concerning the response of MNs from Experiment 2. This was also evident in the bispectrum domain.

Furthermore, according to the box plots of Fig. 8 corresponding to Experiment 3, musicians displayed Mu-rhythm attenuation in the case of the relax state versus auditory and visual stimulation. During the no-visual-stimulation trials, subjects one and two displayed a similar response to the previous trials, whereas subject four did not. The non-musicians' responses were once again contradictory in both cases.


Fig. 7. Experiment 2: Box plots of the estimated average spectral power at each state: state in which the subjects listen to the chord (Chord), state in which the subjects listen to the notes (Notes), state in which the subjects imagined the omitted notes (Omitted note).

The aforementioned results indicate that the visual stimulus (visualized as an ascending note on a notation scheme) boosted the perception of the motion conveyed by the ascending pitch of the note. In the bispectrum domain, an increase in the bispectrum values is noticed for the case of visual stimulation compared to that of no visual stimulation (see Fig. 9 for an indicative example from subject 1). Moreover, no visual stimulation has resulted in diffused bispectral values and in a gradual attenuation across the six states (steps) of Experiment 3, implying a gradual transition from non-Gaussianity and non-linearity towards the Gaussian and linear assumption. On the contrary, when the visual stimulation was employed, less diffused bispectral values were noticed and the attenuation was limited between the second and fourth state (step), exhibiting a gradual increase in the bispectral values when moving from state (step) five to six (see Fig. 9); hence, implying a shift towards a non-linear and non-Gaussian behaviour. These results, also observed in the cases of all musicians, support the Mu-rhythm modulation provoked by the MNs. Nevertheless, for the cases of non-musicians the degree of fluctuation in the intensity of the bispectrum was smaller, both without and with visual stimulation, indicating a smaller sensitivity in the Mu-rhythm modulation provoked by the MNs. This might be explained by the lack of musical experience in distinguishing the pitch differences and, hence, in perceiving an ascending motion across the six states of Experiment 3.

From the above experimental analysis it is clear that music perception can be viewed as a pattern recognition process by the brain, in analogy to the more familiar pattern recognition processes in the visual system.


Fig. 8. Experiment 3: Visual stimulation and no visual stimulation. Box plots of the estimated average spectral power at each state (Relax state, Auditory stimulation).


Fig. 9. Bispectral Analysis of the EEG signals from Experiment 3 for the case of Subject 1, without and with visual stimulation during the auditory stimulation, across the six instances (Steps); only the primary area of the bispectrum is shown. The differences in the transition from non-Gaussianity and non-linearity to Gaussian and linear assumptions are evident when comparing the states without and with visual stimulation along with the auditory one.


Our brain carries (or builds through experience) ‘templates’ against which the signals of incoming sounds are compared – if there is a match with the template that corresponds to a harmonic tone, a musical tone sensation with definite pitch is evoked. In addition, ‘motion’ and intention in music facilitate this pattern recognition and, in an analogy with the optical system, if part of the acoustical stimulus is missing, or if the stimulus is somewhat distorted, the brain is still capable of providing the correct sensation. In the case of pitch, pattern recognition theories postulate that tones which are not present may be perceptible in the same way contours that are not present may be perceived in vision [20]. MNs, as seen through the Mu-rhythm modulation, seem to support such pitch prediction (correction) and probably could even foster a kind of ‘acoustical illusion’. Such illusions are often met when we imagine music, a process of crucial relevance to any composer. The involvement of visual stimulation results in a mental image of sound from the triggering of an inverse process: a neural command from the frontal and pre-frontal cortices elicits specific neural activity that propagates all the way ‘down’ to the primary auditory cortex. The activity triggered is very similar to the one that would appear had the imagined sound actually been fed into the ears. In other words, if we imagine the ‘Ta-ta-ta-taah’ of the first bars of Beethoven's Fifth Symphony, not much happens differently in our brain than when we actually hear that music played in concert. The role of MNs in this direction seems to be central, and understanding their relation with these musical qualities could really expand the way we see knowledge scaffolding in music perception.

5. Concluding remarks

The response of MN cells to the intention and ‘motion’ involved in musical structures was studied in this paper. Three experiments were designed: two of them targeted intention and one targeted ‘motion’. Electroencephalographic recordings were conducted on seven subjects, musicians and non-musicians, and spectral and bispectral analyses were applied to the acquired data. Experimental results showed that Mu-rhythm suppression reflects an intention-related activity seen in musicians, whereas the bispectral ‘echoing’ effect supports the idea of an underlying ongoing function appearing in the transition from heard sound to imagined sound. Moreover, a Mu-rhythm modulation provoked by the MNs was linked to bispectral fluctuations, especially when visual stimulation was combined with an auditory one. Despite the fact that this research is only a first approach to the topic, the results were rather intriguing, leading to further investigation of the role of MNs in music perception. The aforementioned experiments are being extended and further experiments targeting other sonic qualities are on the way, involving larger subject groups of professional musicians and bigger datasets. Furthermore, the research will be extended to people suffering from hearing problems and people with cochlear implants. Acoustic tests concerning the perception of timbre as expressed through different cello techniques, accompanied by visual stimulation (actual performance and a real-time spectrogram), are being designed. Spatial qualities of the musical signal will also be investigated by representing the spatial ‘motion’ with vectors indicating the spatial fluctuations of sound. Moreover, experiments concerning spectromorphology and harmonic style are being considered. In order to expand the applied signal processing techniques, wavelet analysis will be implemented, in addition to spectral and bispectral analysis, by extracting a wavelet from the acquired event-related potentials. Similar analysis has been conducted in research concerning the activation of the primary motor cortex during action observation and has yielded better monitoring of the fluctuations of the Mu-rhythm power in both the time and frequency domains [21]. Moreover, the combination of wavelet with bispectrum analysis, known as the wavelet bispectrum [22], is highly prioritized, as it could provide an information representation in the time–frequency domain, capturing dynamic phenomena in the transition of the perception of the sound qualities.


Fig. 10. Updated experiments interface in Max/MSP for extending the signal acquisition and analysis capabilities.

Finally, the existing system is being extended by optimizing the synchronization between the recording device and the Max/MSP software, using the computer's clock, and by adding real-time visualization of the acquired EEG signals to an updated experiments interface (see Fig. 10).

References

[1] I. Molnar-Szakacs and K. Overy, Music and mirror neurons: from motion to ‘e’motion, Social Cognitive and Affective Neuroscience 1 (2006), 234–241.
[2] S. Koelsch, T. Gunter, A. Friederici and E. Schroger, Brain indices of music processing: “Non-musicians” are musical, Journal of Cognitive Neuroscience 12(3) (2000), 520–541.
[3] S. Koelsch, T. Gunter, D. Von Cramon, S. Zysset, G. Lohmann and A. Friederici, Bach speaks: A cortical “language network” serves the processing of music, NeuroImage 17 (2002), 956–966.
[4] B. Maess, S. Koelsch, T. Gunter and A. Friederici, Musical syntax is processed in Broca's area: a MEG study, Nature Neuroscience 4 (2001), 540–545.
[5] S. Brown, M. Martinez and L. Parsons, Music and language side by side in the brain: a PET study of the generation of melodies and sentences, European Journal of Neuroscience 23(10) (2006), 2791–2803.
[6] A. Lahav, E. Saltzman and G. Schlaug, Action representation of sound: audiomotor recognition network while listening to newly acquired sounds, The Journal of Neuroscience 27(2) (2007), 308–314.
[7] M. Bangert, T. Peschel, G. Schlaug, M. Rotte, D. Drescher, H. Hinrichs, H.J. Heinze and E. Altenmuller, Shared networks for auditory and motor processing in professional pianists: Evidence from fMRI conjunction, NeuroImage 30 (2006), 917–926.
[8] G. Westerman and E.R. Miranda, Modeling the development of mirror neurons for auditory motor integration, The Journal of New Music Research 31(4) (2002), 367–375.
[9] G. Rizzolatti and L. Craighero, The mirror neuron system, Annual Review of Neuroscience 27 (2004), 169–192.
[10] I. Logothetis and I. Milonas, Logotheti Neurology, University Studio Press, Thessaloniki, 2004.
[11] B. Calvo-Merino, J. Grezes, D. Glaser, R. Passingham and P. Haggard, Seeing or doing? Influence of visual and motor familiarity in action observation, Current Biology 16 (2006), 1–6.
[12] G. Rizzolatti, L. Fogassi and V. Gallese, Mirrors in the mind, Scientific American (November 2006), 30–37.
[13] J. Grezes, N. Costes and J. Decety, The effects of learning and intention on the neural network involved in the perception of meaningless actions, Brain 122 (1999), 1875–1887.
[14] M. Grossman, A central processor for hierarchically-structured material: evidence from Broca's aphasia, Neuropsychologia 18 (1980), 299–308.
[15] J. Pineda, L. Oberman, E. Hubbard, J. McCleery, E. Altschuler and V. Ramachandran, EEG evidence for mirror neuron dysfunction in autism spectrum disorders, Cognitive Brain Research 24(2) (2005), 190–198.
[16] S. Muthukumaraswamy, B. Johnson and N. McNair, Mu-rhythm modulation during observation of an object-directed grasp, Cognitive Brain Research 19(2) (2004), 195–201.
[17] D. Levitin, This is your brain on music: The science of a human obsession, Dutton, New York, 2006, ISBN 0-525-94969-0.
[18] L. Hadjileontiadis, Advanced signal processing of biomedical signals, Aristotle University of Thessaloniki Press, Thessaloniki, 2000.
[19] L. Hadjileontiadis and S. Panas, Higher-order statistics: a robust vehicle for diagnostic assessment and characterization of lung sounds, Technology and Healthcare 5 (1997), 359–374.
[20] N. Griffith and P. Todd, Musical Networks: Parallel Distributed Perception and Performance, MIT Press, Cambridge, MA, 1999, ISBN 0262071819.
[21] S. Muthukumaraswamy and B. Johnson, Primary motor cortex activation during action observation revealed by wavelet analysis of the EEG, Clinical Neurophysiology 115 (2004), 1760–1766.
[22] B.Ph. van Milligen, C. Hidalgo and E. Sanchez, Nonlinear phenomena and intermittency in plasma turbulence, Physical Review Letters 74(3) (1995), 395–398.
