Page 1:

On Bubbles and Drifts: Continuous attractor networks and their relation to working memory, path integration, population decoding, attention, and motor functions

Thomas Trappenberg

Dalhousie University, Canada

Page 2:

CANNs can implement motor functions

Stringer, Rolls, Trappenberg & de Araujo, Self-organizing continuous attractor networks and motor functions, Neural Networks 16 (2003).

[Network diagram: state nodes, motor nodes, and movement selector nodes]

Page 3:

My plans for this talk

Basic CANN model

Idiothetic CANN updates (path integration)

CANN & motor functions

Limits on NMDA stabilization

Page 4:

Once upon a time ... (my CANN shortlist)

Wilson & Cowan (1973) Grossberg (1973) Amari (1977) … Sompolinsky & Hansel (1996) Zhang (1997) … Stringer et al. (2002)

Willshaw & von der Malsburg (1976)

Droulez & Berthoz (1988)

Redish, Touretzky, Skaggs, etc.

Page 5:

Basic CANN: It’s just a `Hopfield’ net …

[Diagram: recurrent architecture with external input I^ext, synaptic weights w, nodes x, and output rates r^out]

Nodes can be scrambled!

Page 6:

In mathematical terms …

Updating network states (network dynamics)

Gain function

Weight kernel
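The equations themselves appear only as images on the slide; a minimal sketch of the standard form, consistent with the model summary on the final slide (the shifted-Gaussian weight kernel is an assumption, one common choice), is:

    % network dynamics: leaky integration of recurrent and external input
    \tau \frac{dh_i(t)}{dt} = -h_i(t) + \sum_j w_{ij}\, r_j(t) + I_i^{ext}(t)

    % gain function: sigmoidal mapping from activation to firing rate
    r_i(t) = \frac{1}{1 + e^{-2\beta (h_i(t) - \theta)}}

    % weight kernel: local excitation minus global inhibition ("Mexican hat"-like)
    w_{ij} = A\, e^{-(x_i - x_j)^2 / 2\sigma^2} - w^{inh}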

Page 7:

Network can form bubbles of persistent activity (in Oxford English: activity packets)

[Figure: node activity (node index vs. time) during an external stimulus and after its removal; the end state is a persistent activity packet]
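As an illustration of the bubble formation shown in the figure, here is a minimal NumPy sketch (not the original simulation; all parameter values are assumptions chosen only to put the network in a bubble-forming regime): a transient stimulus is applied at node 50 and the recurrent interactions sustain a localized packet after it is removed.

    import numpy as np

    N = 100                                     # nodes arranged on a ring
    x = np.arange(N)
    d = np.abs(x[:, None] - x[None, :])
    d = np.minimum(d, N - d)                    # wrap-around (ring) distances

    w = np.exp(-d**2 / (2 * 5.0**2)) - 0.3      # local excitation, global inhibition

    tau, dt, beta, theta = 10.0, 1.0, 5.0, 0.5

    def gain(h):
        """Sigmoidal gain function mapping activation to firing rate."""
        return 1.0 / (1.0 + np.exp(-beta * (h - theta)))

    h = np.zeros(N)
    stimulus = 5.0 * np.exp(-d[50]**2 / (2 * 5.0**2))   # transient input centred on node 50

    for t in range(500):
        I_ext = stimulus if t < 100 else 0.0    # stimulus on, then switched off
        h += dt / tau * (-h + w @ gain(h) + I_ext)

    print("activity packet persists, centred near node", np.argmax(gain(h)))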

Page 8:

Space is represented with activity packets in the hippocampal system

From Samsonovich & McNaughton, Path integration and cognitive mapping in a continuous attractor neural network model, J. Neurosci. 17 (1997)

Page 9:

Various gain functions are used

[Figure: end states of the network for the different gain functions]
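Typical choices (the specific functions plotted on the slide are not legible in this transcript) include, for example:

    % sigmoidal (logistic) gain
    r_i = \frac{1}{1 + e^{-2\beta (h_i - \theta)}}

    % threshold-linear (rectified) gain
    r_i = \alpha\, [h_i - \theta]_+

    % binary (step) gain
    r_i = \Theta(h_i - \theta)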

Page 10:

Superior colliculus integrates exogenous and endogenous inputs

[Diagram: CN, SNpr, Thal, SEF, FEF, LIP, SC, RF, and cerebellum]

Page 11:

Superior Colliculus is a CANN

Trappenberg, Dorris, Klein & Munoz, A model of saccade initiation based on the competitive integration of exogenous and endogenous inputs, J. Cog. Neuro. 13 (2001)

Page 12:

Weights describe the effective interaction in the superior colliculus

Trappenberg, Dorris, Klein & Munoz, A model of saccade initiation based on the competitive integration of exogenous and endogenous inputs, J. Cog. Neuro. 13 (2001)

Page 13:

There are phase transitions in the weight-parameter space

Page 14:

CANNs can be trained with Hebb

Hebb:

Training pattern:
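Written out (the equations are images on the slide; the Hebbian rule below follows the form given on the final slide, and the Gaussian training pattern is an assumption based on the related papers):

    % Hebbian weight update between nodes i and j
    \delta w_{ij} = k\, r_i\, r_j

    % training pattern: a Gaussian activity bump imposed at location x_0
    r_i = e^{-(x_i - x_0)^2 / 2\sigma^2}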

Page 15:

Normalization is important for a convergent method

• Random initial states
• Weight normalization
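A minimal sketch of such a training procedure (assumed details: a Gaussian bump is swept over all locations, and each row of the weight matrix is renormalized after every Hebbian update):

    import numpy as np

    N = 100
    x = np.arange(N)
    d = np.abs(x[:, None] - x[None, :])
    d = np.minimum(d, N - d)                         # ring distances

    rng = np.random.default_rng(0)
    w = rng.normal(0.0, 0.1, size=(N, N))            # random initial weights

    k, sigma = 0.1, 5.0
    for epoch in range(20):                          # repeat the sweep over training locations
        for x0 in range(N):
            r = np.exp(-d[x0]**2 / (2 * sigma**2))   # Gaussian training pattern centred at x0
            w += k * np.outer(r, r)                  # Hebbian update
            w /= np.linalg.norm(w, axis=1, keepdims=True)   # weight normalization (per row)

    # each row of w converges toward a translation-invariant excitatory profile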

[Figure: weight profile w(x,50) over training time, and the learned weight matrix w(x,y)]

Page 16:

Gradient-descent learning is also possible (Kechen Zhang)

Gradient descent with regularization = Hebb + weight decay
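Schematically, the point is that the regularization term turns into weight decay; with learning rate \epsilon and regularization strength \lambda (both symbols are my notation, not the slide's), the update takes the form

    \Delta w_{ij} = \epsilon \left( r_i\, r_j - \lambda\, w_{ij} \right)

i.e. a Hebbian term plus a decay term.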

Page 17:

CANNs have a continuum of point attractors

Point attractors and basin of attraction

Line of point attractors

Can be mixed: Rolls, Stringer & Trappenberg, A unified model of spatial and episodic memory, Proceedings of the Royal Society B 269:1087-1093 (2002)

Page 18:

CANNs work with spiking neurons

Xiao-Jing Wang, Trends in Neurosci. 24 (2001)

Page 19:

Shutting off also works in the rate model

[Figure: node activity over time (node index vs. time)]

Page 20:

CANNs (integrators) are stiff

Page 21:

… and can drift and jump

Trappenberg, Dynamic cooperation and competition in a network of spiking neurons, ICONIP'98

Page 22:

Neuroscience applications of CANNs

Persistent activity (memory) and winner-takes-all (competition)

• Cortical network (e.g. Wilson & Cowan, Sompolinsky, Grossberg)

• Working memory (e.g. Compte, Wang, Brunel, Amit (?), etc)

• Oculomotor programming (e.g. Kopecz & Schoener, Trappenberg et al.)

• Attention (e.g. Sompolinsky, Olshausen, Salinas & Abbott (?), etc)

• Population decoding (e.g. Wu et al., Pouget, Zhang, Deneve, etc.)

• SOM (e.g. Willshaw & von der Malsburg)

• Place and head direction cells (e.g. Zhang, Redish, Touretzky, Samsonovich, McNaughton, Skaggs, Stringer et al.)

• Motor control (Stringer et al.)

[Diagram labels: basic CANN; PI (path integration)]

Page 23:

A modified CANN solves path integration
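A toy sketch of this idea (not the self-organizing model of the papers cited here; the asymmetric kernel and all parameters are assumptions): an idiothetic velocity signal gates an asymmetric weight component, which pushes the packet around the ring while the signal is on and leaves it parked when the signal stops.

    import numpy as np

    N = 100
    x = np.arange(N)
    d = x[:, None] - x[None, :]
    d = (d + N // 2) % N - N // 2                    # signed ring distances

    w_sym = np.exp(-d**2 / (2 * 5.0**2)) - 0.3       # symmetric kernel: holds the packet
    w_asym = (np.exp(-(d - 2)**2 / (2 * 5.0**2))
              - np.exp(-(d + 2)**2 / (2 * 5.0**2)))  # antisymmetric kernel: shifts the packet

    tau, dt, beta, theta = 10.0, 1.0, 5.0, 0.5

    def gain(h):
        return 1.0 / (1.0 + np.exp(-beta * (h - theta)))

    h = 5.0 * np.exp(-d[50]**2 / (2 * 5.0**2))       # start with a packet at node 50

    for t in range(1000):
        v = 0.3 if t < 500 else 0.0                  # idiothetic signal: move, then stop
        h += dt / tau * (-h + (w_sym + v * w_asym) @ gain(h))

    print("packet parked near node", np.argmax(gain(h)))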

Page 24:

CANNs can implement motor functions

Stringer, Rolls, Trappenberg & de Araujo, Self-organizing continuous attractor networks and motor functions, Neural Networks 16 (2003).

[Network diagram: state nodes, motor nodes, and movement selector nodes]

Page 25:

... learning motor sequences (e.g. speaking a word)

[Figure: movement selector cells, motor cells, and state cells]

Experiment 1

Page 26:

… from noisy examples …

State cells: learning from noisy examples

Experiment 2

Page 27:

… and reaching from different initial states

Stringer, Rolls, Trappenberg & de Araujo, Self-organizing continuous attractor networks and motor function, Neural Networks 16 (2003).

Experiment 3

Page 28:

Drift is caused by asymmetries

NMDA stabilization

Page 29:

CANN can support multiple packets

Stringer, Rolls & Trappenberg, Self-organising continuous attractor networks with multiple activity packets, and the representation of space, Neural Networks 17 (2004)

Page 30:

How many activity packets can be stable?

Trappenberg, Why is our working memory capacity so large?, Neural Information Processing - Letters and Reviews, Vol. 1 (2003)

Page 31:

Stabilization can be too strong

Trappenberg & Standage, Multi-packet regions in stabilized continuous attractor networks, submitted to CNS'04

Page 32:

Conclusion

CANNs are widespread in neuroscience models (brain)

Short term memory, feature selectivity (WTA)

`Path integration' is an elegant mechanism to generate dynamic sequences (self-organized)

Page 33:

With thanks to

Cognitive Neuroscience, Oxford Univ.: Edmund Rolls, Simon Stringer, Ivan de Araujo

Psychology, Dalhousie Univ.: Ray Klein

Physiology, Queen's Univ.: Doug Munoz, Mike Dorris

Computer Science, Dalhousie Univ.: Dominic Standage

Page 34:

CANN can discover dimensionality

Page 35:

CANN with adaptive input strength explains express saccades

Page 36:

CANNs are great for population decoding (fast pattern-matching implementation)
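One way to make this concrete (a minimal sketch under assumed parameters, not the implementation referenced on the slide): clamp a noisy population response as external input, let the network relax to a clean packet, and read the decoded value off the packet position.

    import numpy as np

    N = 100
    x = np.arange(N)
    d = np.abs(x[:, None] - x[None, :])
    d = np.minimum(d, N - d)

    w = np.exp(-d**2 / (2 * 5.0**2)) - 0.3           # standard CANN kernel
    tau, dt, beta, theta = 10.0, 1.0, 5.0, 0.5

    def gain(h):
        return 1.0 / (1.0 + np.exp(-beta * (h - theta)))

    rng = np.random.default_rng(1)
    true_value = 63
    noisy_pop = np.exp(-d[true_value]**2 / (2 * 8.0**2)) + 0.5 * rng.normal(size=N)

    h = np.zeros(N)
    for t in range(300):
        I_ext = 2.0 * noisy_pop if t < 100 else 0.0  # clamp the noisy pattern, then release
        h += dt / tau * (-h + w @ gain(h) + I_ext)

    r = gain(h)
    phase = np.angle(np.sum(r * np.exp(2j * np.pi * x / N)))   # population-vector readout
    estimate = (phase / (2 * np.pi) * N) % N
    print("true value:", true_value, "decoded estimate:", round(estimate, 1))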

Page 37:

John Lisman’s hippocampus

Page 38:

The model equations (hd: head-direction cells; c, ac: clockwise and anticlockwise rotation cells):

Continuous dynamics (leaky integrator):

    \tau \frac{dh_i^{hd}(t)}{dt} = -h_i^{hd}(t)
        + \frac{\phi_0}{C^{hd}} \sum_j \left( w_{ij} - w^{inh} \right) r_j^{hd}(t)
        + I_i^{v}
        + \frac{\phi_1}{C^{hd \times c}} \sum_j w_{ij}^{c}\, r_j^{hd}\, r^{c}
        + \frac{\phi_1}{C^{hd \times ac}} \sum_j w_{ij}^{ac}\, r_j^{hd}\, r^{ac}

Gain function:

    r_i^{hd}(t) = \frac{1}{1 + e^{-2\beta \left( h_i^{hd}(t) - \theta \right)}}

NMDA-style stabilization:

    r_i \leftarrow \begin{cases} 1 & \text{if } r_i > 0.5 \\ r_i & \text{elsewhere} \end{cases}

Hebbian learning:

    \delta w_{ij} = k\, r_i\, r_j
    \delta w_{ij}^{c} = k\, r_i^{hd}\, r_j^{hd}\, r^{c}
    \delta w_{ij}^{ac} = k\, r_i^{hd}\, r_j^{hd}\, r^{ac}

where

    h_i : activity of node i
    r_i : firing rate
    w_{ij} : synaptic efficacy matrix
    w^{inh} : global inhibition
    I_i^{v} : visual input
    \tau : time constant
    \phi_0, \phi_1 : scaling factors
    C : number of connections per node
    \beta : slope
    \theta : threshold