On Bubbles and Drifts: Continuous attractor networks and their relation to working memory, path integration, population decoding, attention, and motor functions
Thomas Trappenberg, Dalhousie University, Canada
1
On Bubbles and Drifts:Continuous attractor networks and their relation to working memory, path integration, population decoding, attention, and motor functions
Stringer, Rolls, Trappenberg, de Araujo, Self-organizing continuous attractor networks and motor functions Neural Networks 16 (2003).
[Figure: network architecture with state nodes, motor nodes, and movement selector nodes]
3
My plans for this talk
Basic CANN model
Idiothetic CANN updates (path integration)
CANN & motor functions
Limits on NMDA stabilization
4
Once upon a time ... (my CANN shortlist)
Wilson & Cowan (1973), Grossberg (1973), Amari (1977) … Sompolinsky & Hansel (1996), Zhang (1997) … Stringer et al. (2002)
Willshaw & von der Malsburg (1976)
Droulez & Berthoz (1988)
Redish, Touretzky, Skaggs, etc.
5
Basic CANN: It’s just a ‘Hopfield’ net …
[Figure: recurrent architecture with external input I_ext, output rates r_out, and synaptic weights w]
Recurrent architecture, synaptic weights
Nodes can be scrambled!
6
In mathematical terms …
Updating network states (network dynamics)
Gain function
Weight kernel
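The equations behind these three labels are images in the original slides and did not survive extraction. A standard formulation of Amari-style CANN dynamics that these labels conventionally refer to (symbols assumed here, not taken from the slide) is:

```latex
% Updating network states (network dynamics):
\tau \frac{\partial u(x,t)}{\partial t} = -u(x,t)
  + \int w(x-x')\, r(x',t)\, dx' + I^{\mathrm{ext}}(x,t)

% Gain function, e.g. sigmoidal:
r(x,t) = g\big(u(x,t)\big), \qquad g(u) = \frac{1}{1 + e^{-\beta u}}

% Weight kernel (local excitation, global inhibition):
w(x - x') = A\, e^{-(x-x')^2 / 2\sigma^2} - C
```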
7
Network can form bubbles of persistent activity (in Oxford English: activity packets)
[Figure: node activity over time (axes: Time [t] vs. Node index); an external stimulus evokes an activity packet that persists as the end state]
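The bubble formation shown on this slide can be reproduced with a minimal rate-based simulation. This is a sketch with illustrative parameter values (network size, kernel amplitudes, gain steepness are my assumptions, not the talk's):

```python
import numpy as np

N, tau, dt = 100, 1.0, 0.1
x = np.arange(N)

# Distances on a ring of N nodes
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, N - d)

# Weight kernel: local Gaussian excitation plus global inhibition
sigma = 10.0
w = (20.0 * np.exp(-d**2 / (2 * sigma**2)) - 6.0) / N

def g(u, beta=5.0, theta=1.0):
    """Sigmoidal gain function with threshold theta."""
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))

# External stimulus centred on node 50, applied only early on
stim = 5.0 * np.exp(-(x - 50)**2 / (2 * 5.0**2))

u = np.zeros(N)
for t in range(500):
    I = stim if t < 100 else 0.0          # stimulus removed after t = 100
    u += dt / tau * (-u + w @ g(u) + I)   # Euler step of the field dynamics

bump = g(u)  # a persistent activity packet remains centred near node 50
```

After the stimulus is switched off, the recurrent excitation sustains the packet while the global inhibition keeps it from spreading, which is the persistent "bubble" of the slide.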
8
Space is represented with activity packets in the hippocampal system
From Samsonovich & McNaughton, Path integration and cognitive mapping in a continuous attractor neural network model, J. Neurosci. 17 (1997)
9
Various gain functions are used
[Figure: end states of the network for different gain functions]
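The gain functions compared on this slide are not reproduced in the transcript; three choices commonly used in CANN models (a sketch of typical options, not necessarily the talk's exact set) are the step function of Amari's original analysis, a threshold-linear function, and a sigmoid:

```python
import numpy as np

def step(u, theta=0.0):
    """Binary (Heaviside) gain, as in Amari's analytical model."""
    return (u > theta).astype(float)

def threshold_linear(u, theta=0.0):
    """Rectified-linear gain: zero below threshold, linear above."""
    return np.maximum(u - theta, 0.0)

def sigmoid(u, beta=1.0, theta=0.0):
    """Smooth saturating gain with slope beta and threshold theta."""
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))
```

The choice mainly affects how sharply the activity packet's edges are defined and whether the end-state amplitude saturates.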
10
Superior colliculus integrates exogenous and endogenous inputs
[Figure: areas projecting to the superior colliculus (SC): CN, SNpr, Thal, SEF, FEF, LIP, RF, Cerebellum]
11
Superior Colliculus is a CANN
Trappenberg, Dorris, Klein & Munoz, A model of saccade initiation based on the competitive integration of exogenous and endogenous inputs, J. Cog. Neuro. 13 (2001)
12
Weights describe the effective interaction in Superior Colliculus
Trappenberg, Dorris, Klein & Munoz, A model of saccade initiation based on the competitive integration of exogenous and endogenous inputs, J. Cog. Neuro. 13 (2001)
13
There are phase transitions in the weight-parameter space
14
CANNs can be trained with Hebb
Hebb: δw_ij ∝ r_i r_j (weight change proportional to pre- and postsynaptic activity)
Training pattern: a Gaussian activity packet centred on each feature value
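The Hebbian training of the recurrent weights can be sketched as follows, assuming (as in the self-organizing CANN papers cited earlier) one Gaussian training pattern per feature value on a ring; the learning rate and pattern width are illustrative:

```python
import numpy as np

N = 100
x = np.arange(N)
sigma = 10.0   # width of the Gaussian training packets
eta = 0.1      # learning rate

W = np.zeros((N, N))
for c in range(N):
    # Training pattern: Gaussian activity packet centred on feature value c
    d = np.minimum(np.abs(x - c), N - np.abs(x - c))  # ring distance
    r = np.exp(-d**2 / (2 * sigma**2))
    # Hebb rule: dw_ij proportional to r_i * r_j
    W += eta * np.outer(r, r)

# Because every location is trained equally, the learned weights depend
# (approximately) only on the distance |i - j|: a translation-invariant kernel.
```

Training on packets at all locations is what produces the distance-dependent kernel that gives the network its continuum of attractors.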
15
Normalization is important for a convergent method
• Random initial states
• Weight normalization
[Figure: evolution of the weights w(x,50) over training time, and the final weight matrix w(x,y)]
16
Gradient-descent learning is also possible (Kechen Zhang)
Gradient descent with regularization = Hebb + weight decay
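The correspondence stated on this slide can be sketched as follows (notation assumed, not taken from the slide): make the training patterns $r^\mu$ fixed points by minimizing a squared error with an L2 penalty,

```latex
E = \sum_\mu \left\| r^\mu - W r^\mu \right\|^2 + \lambda \left\| W \right\|^2 ,
\qquad
\Delta W \propto \sum_\mu \left( r^\mu - W r^\mu \right) (r^\mu)^{\mathsf{T}} - \lambda W .
```

The driving term is Hebbian (an outer product of activities), and the regularizer contributes the weight-decay term $-\lambda W$.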
17
CANNs have a continuum of point attractors
Point attractors and basin of attraction
Line of point attractors
Can be mixed: Rolls, Stringer, Trappenberg, A unified model of spatial and episodic memory, Proceedings B of the Royal Society 269:1087-1093 (2002)
18
CANNs work with spiking neurons
Xiao-Jing Wang, Trends in Neurosci. 24 (2001)
19
Shutting off also works in the rate model
[Figure: node activity over time showing the activity packet being shut off]
20
CANNs (integrators) are stiff
21
… and can drift and jump
Trappenberg, Dynamic cooperation and competition in a network of spiking neurons, ICONIP'98
22
Neuroscience applications of CANNs
Persistent activity (memory) and winner-takes-all (competition)
• Cortical network (e.g. Wilson & Cowan, Sompolinsky, Grossberg)
• Working memory (e.g. Compte, Wang, Brunel, Amit (?), etc)
• Oculomotor programming (e.g. Kopecz & Schoener, Trappenberg et al.)