Fundamentals of Computational Neuroscience 2e Thomas Trappenberg March 28, 2009 Chapter 10: The cognitive brain
Page 1:

Fundamentals of Computational Neuroscience 2e

Thomas Trappenberg

March 28, 2009

Chapter 10: The cognitive brain

Page 2:

Hierarchical maps and attentive vision

[Figure. A: Ventral visual pathway — LGN (thalamus) → V1 → V2 → V4 (occipital cortex) → posterior and inferior temporal cortex; receptive field size (deg) grows with eccentricity (deg), with values around 1.3, 3.2, 8.0, 20, and 50. B: Layered cortical maps — two interconnected stacks of Layers 1–4.]

Page 3:

Attention in visual search and object recognition

Visual search

Given: particular features (target object)

Function: scanning (attentional window scans the entire scene)

WHERE

Object recognition

Given: particular spatial location (target position)

Function: binding (attentional window binds features for identification)

WHAT

Gustavo Deco

Page 4:

Model

[Figure: Model architecture — LGN input from the visual field feeds V1–V4 (feature extraction with Gabor jets), which projects to IT (object recognition, the "what" path) and PP (spatial location, the "where" path). Each module contains an inhibitory pool. Top-down bias is object-specific to IT and location-specific to PP, and attention settles on a preferred locus in the visual field.]

Page 5:

Example results

[Figure: PP activity as a function of time for displays with 1, 2, or 3 items. A: "Parallel search" (target E among X distractors) — activity settles at the same rate regardless of the number of items. B: "Serial search" (target E among F distractors) — settling time grows with the number of items.]

Page 6:

The interconnecting workspace hypothesis

[Figure: Global workspace interconnecting the evaluative system (VALUE), long-term memory (PAST), the attentional system (FOCUSING), the perceptual system (PRESENT), and the motor system (FUTURE).]

Page 7:

Stroop task modelling

[Figure. A: Stroop task — the words "grey" and "black" shown in grey or black ink, under word-naming or colour-naming task instructions. B: Workspace model for the Stroop task — inputs and outputs (WORD "grey", COLOUR black, NAMING RESPONSE "black") connect to specialized processors; workspace neurons, modulated by vigilance and a reward (error) signal, produce attentional suppression of the word and attentional amplification of the colour.]

Page 8:

The anticipating brain

1. The brain can develop a model of the world, which can be used to anticipate or predict the environment.

2. The inverse of the model can be used to recognize causes by evoking internal concepts.

3. Hierarchical representations are essential to capture the richness of the world.

4. Internal concepts are learned through matching the brain's hypotheses with input from the world.

5. An agent can learn actively by testing hypotheses through actions.

6. The temporal domain is an important degree of freedom.
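Points 1, 2, and 4 can be illustrated with a minimal sketch (an invented example, not from the slides): a forward model p(sensation | cause) anticipates input, and Bayesian inversion of the same model recognizes causes from sensations. The causes, sensations, and probabilities below are all hypothetical.

```python
# Hypothetical internal concepts and a forward (generative) model of the world.
prior = {"dog": 0.5, "cat": 0.5}                 # p(cause)
likelihood = {                                    # p(sensation | cause)
    "dog": {"bark": 0.8, "meow": 0.2},
    "cat": {"bark": 0.1, "meow": 0.9},
}

def predict(cause):
    """Forward model: anticipate the most likely sensation for a cause."""
    return max(likelihood[cause], key=likelihood[cause].get)

def recognize(sensation):
    """Inverse model: infer p(cause | sensation) by Bayes' rule."""
    joint = {c: prior[c] * likelihood[c][sensation] for c in prior}
    z = sum(joint.values())
    return {c: p / z for c, p in joint.items()}

print(predict("dog"))       # -> bark
print(recognize("bark"))    # "dog" comes out as the most probable cause
```

Matching the model's predictions against actual input (point 4) would then drive updates of the prior and likelihood.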

Page 9:

[Figure: Agent-environment loop. External states s evolve under the environment dynamics p(s′ | s, a); sensation passes through the PNS to the CNS, which updates internal states c via p(c′ | c, s); the CNS selects actions via p(a | c), which act back on the environment through the PNS.]
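The loop in the figure can be sketched as code. This is a toy illustration with invented, deterministic transition rules standing in for the distributions p(s′ | s, a), p(c′ | c, s), and p(a | c); nothing here comes from the slides beyond the loop structure itself.

```python
# Toy agent-environment loop: two external states, two internal states,
# two actions. All transition rules are invented for illustration.

def env_step(s, a):
    """Environment dynamics p(s' | s, a): 'switch' flips the state."""
    if a == "switch":
        return "right" if s == "left" else "left"
    return s

def sense(c, s):
    """Internal-state update p(c' | c, s): here, perfect observation."""
    return s

def act(c):
    """Action policy p(a | c): try to reach 'right' and stay there."""
    return "stay" if c == "right" else "switch"

s, c = "left", "left"
for _ in range(3):          # run the perception-action loop a few steps
    a = act(c)              # CNS action
    s = env_step(s, a)      # external state change
    c = sense(c, s)         # CNS sensation
print(s, c)                 # -> right right
```

In a full probabilistic treatment each of these functions would sample from the corresponding conditional distribution rather than return a fixed value.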

Page 10:

Recurrent networks with hidden nodes

The Boltzmann machine:

[Figure: network of hidden nodes and visible nodes with symmetric connections.]

Energy: $H_{nm} = -\frac{1}{2} \sum_{ij} w_{ij} s_i^n s_j^m$

Probabilistic update: $p(s_i^n = +1) = \frac{1}{1 + \exp(-\beta \sum_j w_{ij} s_j^n)}$

Boltzmann-Gibbs distribution: $p(s^v; w) = \frac{1}{Z} \sum_{m \in h} \exp(-\beta H_{vm})$

Page 11:

Training the Boltzmann machine

Kullback-Leibler divergence:

$KL(p(s^v), p(s^v; w)) = \sum_{s^v} p(s^v) \log \frac{p(s^v)}{p(s^v; w)}$

$= \sum_{s^v} p(s^v) \log p(s^v) - \sum_{s^v} p(s^v) \log p(s^v; w)$

Minimizing KL is equivalent to maximizing the average log-likelihood function

$l(w) = \sum_{s^v} p(s^v) \log p(s^v; w) = \langle \log p(s^v; w) \rangle.$

Gradient descent → Boltzmann learning:

$\Delta w_{ij} = \eta \frac{\partial l}{\partial w_{ij}} = \eta \frac{\beta}{2} \left( \langle s_i s_j \rangle_{\mathrm{clamped}} - \langle s_i s_j \rangle_{\mathrm{free}} \right).$
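The learning rule above is a simple weight update once the two correlation averages are available. The sketch below applies one Boltzmann learning step with invented correlation values (in practice these come from the clamped and free-running Gibbs-sampling phases).

```python
eta, beta = 0.1, 1.0

corr_clamped = [[1.0, 0.8], [0.8, 1.0]]   # <s_i s_j> with visible nodes clamped
corr_free    = [[1.0, 0.2], [0.2, 1.0]]   # <s_i s_j> in the free-running phase
w = [[0.0, 0.0], [0.0, 0.0]]

def boltzmann_step(w):
    """dw_ij = eta * (beta/2) * (<s_i s_j>_clamped - <s_i s_j>_free)."""
    n = len(w)
    for i in range(n):
        for j in range(n):
            if i != j:    # no self-connections
                w[i][j] += eta * (beta / 2) * (corr_clamped[i][j] - corr_free[i][j])
    return w

boltzmann_step(w)
print(w[0][1])    # 0.1 * 0.5 * (0.8 - 0.2) = 0.03 (up to float rounding)
```

The rule is Hebbian in the clamped phase and anti-Hebbian in the free phase, which is why it is often called contrastive.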

Page 12:

The restricted Boltzmann machine

[Figure: Boltzmann machine vs. restricted Boltzmann machine — both have hidden and visible nodes, but the RBM has no connections within the hidden layer or within the visible layer.]

Contrastive Hebbian learning: alternating Gibbs sampling between the visible and hidden layers, t = 1, 2, 3, ..., ∞.
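Because the RBM has no within-layer connections, each half-step of the alternation can sample a whole layer in parallel. A minimal sketch, with invented layer sizes and random weights:

```python
import math
import random

random.seed(2)
beta = 1.0
n_v, n_h = 4, 2
w = [[random.uniform(-1, 1) for _ in range(n_h)] for _ in range(n_v)]

def sigma(x):
    return 1.0 / (1.0 + math.exp(-beta * x))

def sample_hidden(v):
    """p(h_j = 1 | v): all hidden nodes sampled in one parallel half-step."""
    return [1 if random.random() < sigma(sum(w[i][j] * v[i] for i in range(n_v)))
            else 0 for j in range(n_h)]

def sample_visible(h):
    """p(v_i = 1 | h): the other half-step of the alternation."""
    return [1 if random.random() < sigma(sum(w[i][j] * h[j] for j in range(n_h)))
            else 0 for i in range(n_v)]

v = [1, 0, 1, 0]          # clamp data on the visibles at t = 1
h = sample_hidden(v)
for _ in range(3):        # alternate v -> h -> v -> ... (t = 2, 3, ...)
    v = sample_visible(h)
    h = sample_hidden(v)
print(v, h)
```

Running the chain to t = ∞ gives the free-phase statistics of the learning rule on the previous page; truncating it after one alternation gives the common contrastive-divergence shortcut.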

Page 13:

[Figure: Hierarchical model — a model retina supplies image input to a stack of RBM layers; concept input enters at the top, and the top layer serves for recognition readout and stimulation.]