Tutorial: Plasticity Revisited - Motivating New Algorithms Based On Recent Neuroscience Research
Tsvi Achler MD/PhD
Approximate Outline and References for Tutorial
Department of Computer Science
University of Illinois at Urbana-Champaign, Urbana, IL 61801, U.S.A.
Outline
1. Plasticity is observed in many forms. We review experiments and controversies.
• Intrinsic 'membrane plasticity'
• Synaptic
• Homeostatic 'feedback plasticity'
• System: in combination, membrane and feedback can imitate synaptic
2. What does this mean for NN algorithms?
[Diagram: forms of plasticity: Intrinsic, Synaptic, Homeostatic, 'Systems']
Outline: NN Algorithms
Common computational Issues
• Explosion in connectivity
• Explosion in training
• How can nature solve these problems with the plasticity mechanisms outlined?
[Diagram: algorithm families: Synaptic Plasticity, Lateral Inhibition, Feedback Inhibition]
1. Plasticity
Intrinsic ‘Membrane’ Plasticity
• Ion channels responsible for activity, spikes
• ‘Plastic’ ion channels found in membrane
• Voltage-sensitive channel types (Ca++, Na+, K+)
• Plasticity independent of synapse plasticity
Review: Daoudal, G. and D. Debanne (2003). "Long-Term Plasticity of Intrinsic Excitability: Learning Rules and Mechanisms." Learn. Mem. 10: 456-465.
Synaptic Plasticity Hypothesis
• Bulk of studies
• Synapse changes with activation
• Motivated by Hebb 1949
• Supported by Long Term Potentiation / Depression (LTP/LTD) experiments
Review: Malenka, R. C. and M. F. Bear (2004). "LTP and LTD: an embarrassment of riches." Neuron 44(1): 5-21.
LTP/LTD Experiment Protocol
• Establish ‘pre-synaptic’ cell
• Establish ‘post-synaptic’ cell
• Raise pre-synaptic activity to an amplitude A50 at which the post-synaptic cell fires "50%"
• Induction: high frequency high voltage spike train on both pre & post electrodes
• Plasticity: any changes when A50 is applied
[Figure: pre-synaptic and post-synaptic electrodes in brain tissue; stimulation at amplitude A50 gives "50%" post-synaptic firing]
Plasticity: change in post with A50
• LTP : increased activity with A50
• LTD : decreased activity with A50
• Can last minutes, hours, or days, limited by how long the recording is viable
Strongest Evidence
Systems w/minimal feedback:
• Motor, Musculature & tetanic stimulation
• Sensory/muscle junction of the Aplysia gill-siphon reflex
• Early Development: Retina → Ocular Dominance Columns
Variable Evidence
Cortex, Thalamus, Sensory Systems & Hippocampus
• Basic mechanisms are still controversial after 60 years and 13,000 papers in PubMed
• It is difficult to establish/control when LTP or LTD occurs
LTP vs LTD Criteria Are Variable
• Pre-Post spike timing: (Bi & Poo 1998; Markram et al. 1997)
– Pre-synaptic spike before post-synaptic spike: LTP
– Post-synaptic spike before pre-synaptic spike: LTD
• First spike in burst most important (Froemke & Dan 2002)
• Last spike most important (Wang et al. 2005)
• Frequency most important: high frequency favors LTP (Sjöström et al. 2001; Tzounopoulos et al. 2004)
• Spikes are not necessary (Golding et al. 2002; Lisman & Spruston 2005)
Many factors affect LTP & LTD
• Voltage-sensitive channels, e.g. NMDA
• Cell signaling channels, e.g. via Ca++
• Protein dependent components
• Fast/slow
• Synaptic tagging
Feedback Inhibition: Output Equations

[Figure: output node ya receives input x1 only; output node yb receives inputs x1 and x2; the feedback onto the inputs is Q1 = ya + yb and Q2 = yb]

Feedback:                 Qb(t) = Σj Yj(t), summed over all output nodes j connected to input b
Input after feedback:     Ib(t) = Xb / Qb(t)   (shunting inhibition)
Output update:            Ya(t + dt) = ( Ya(t) / na ) · Σi Ii(t), summed over all inputs i of node a

where Ya = output activity, Xb = raw input activity, Ib = input after feedback, Qb = feedback, and na = number of input connections of Ya.

Repeat until steady state: no oscillations, no chaos.
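To make the iteration concrete, here is a minimal Python sketch of these update equations. The function name, the NumPy implementation, and the small epsilon guard against division by zero are my own choices, not part of the original tutorial; the connectivity is the unweighted binary matrix described on the following slides.

```python
import numpy as np

def regulatory_feedback(W, x, steps=500, eps=1e-12):
    """Iterate the feedback-inhibition equations above toward steady state.

    W[a, b] = 1 if output node a uses input b (binary connectivity, no weights).
    x[b]    = raw input activity Xb (non-negative real values).
    """
    W = np.asarray(W, dtype=float)
    x = np.asarray(x, dtype=float)
    n = W.sum(axis=1)               # na: number of input connections of each output
    y = np.ones(W.shape[0])         # initially every output node is active
    for _ in range(steps):
        q = W.T @ y                 # Qb: feedback from all outputs that use input b
        i = x / np.maximum(q, eps)  # Ib: input activity after shunting inhibition
        y = (y / n) * (W @ i)       # Ya <- (Ya / na) * sum of its shunted inputs
    return y

# Two-node example from the slides: y1 ('P') uses x1 only; y2 ('R') uses x1 and x2.
W = [[1, 0],
     [1, 1]]
print(regulatory_feedback(W, [1, 1]))   # approaches [0, 1]: 'R' alone
print(regulatory_feedback(W, [2, 1]))   # stays at   [1, 1]: 'P' and 'R' together
```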
Feedback Inhibition: Simple Connectivity

[Figure: output nodes Y1-Y4 connect directly to their input nodes I1-I4 over raw inputs x1-x4; weighted connections W are labeled as the source of connectivity problems and training problems]

• A new node only connects to its inputs
• All links have the same strength
• Inputs have positive real values indicating intensity
Feedback Inhibition Allows Modular Combinations

[Figure: output y1 ('P') connects to input x1 only (pattern 10); output y2 ('R') connects to inputs x1 and x2 (pattern 11)]

Feedback Inhibition Interprets Composite Patterns and Supports Non-Binary Inputs

Input x1 simultaneously supports both outputs. The network behaves as if there is an inhibitory connection between x2 and y1, yet there is no direct connection between x2 and y1.

Network configuration steady states:

  Inputs (x1, x2)    Outputs (y1, y2)    Interpretation
  1, 0               1, 0                'P'
  1, 1               0, 1                'R'
  2, 2               0, 2                '2 Rs'
  2, 1               1, 1                'P & R'

Steady-state solution:

  If x1 ≥ x2:   y1 = x1 - x2,   y2 = x2
  If x1 ≤ x2:   y1 = 0,         y2 = (x1 + x2)/2
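A quick consistency check (my own, using the update equations above): in this network y1 uses only x1 (n1 = 1) while y2 uses x1 and x2 (n2 = 2), so Q1 = y1 + y2 and Q2 = y2. Substituting the x1 ≥ x2 solution y1 = x1 - x2, y2 = x2 gives Q1 = x1, hence I1 = x1/Q1 = 1 and y1 is unchanged; likewise Q2 = x2, hence I2 = 1 and y2 updates to (x2/2)(1 + 1) = x2. Both outputs reproduce themselves, confirming that the table values are a steady state of the update equations; the x1 ≤ x2 case checks out the same way.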
Feedback Inhibition Algorithm: How it Works (Iterative Evaluation)

[Figure: outputs Y1 and Y2 over inputs x1 and x2; activity flows forward from the input nodes I1 and I2 to the outputs, and back from the outputs to inhibit the inputs; active = 1, inactive = 0]

• Initially both outputs become active.
• I1 gets twice as much inhibition as I2, because both Y1 and Y2 feed back onto it.
• This affects Y1 more than Y2.
• This separation continues iteratively, until the most encompassing representation predominates: the input 11 matches pattern 11 ('R', node Y2) rather than pattern 10 (node Y1).

[Graph of dynamics: activity of Y1 and Y2 over simulation time T; Y1 decays toward 0 while Y2 rises toward 1, reaching steady state]
Demonstration: Applying Learned Information to New Scenarios
• Nonlinear: mathematical analysis is difficult, so behavior is demonstrated via examples
• Teach patterns separately
• Test novel pattern combinations
• Requires decomposition of composite
• Letter patterns are used for intuition
• Learn A, B, C, ..., Z separately; teach single patterns only

[Figure: each letter is a node defined by a binary feature vector, e.g. A = 01001..., B = 11000..., C = 01011..., D = 10101..., E = 11011...; 26 nodes in total]
Nodes are modular combinations of features. This defines the network; nothing is changed or re-learned further.
Comparison networks are trained and tested with the same patterns:
– Neural Networks (NN)*
* Waikato Environment for Knowledge Analysis (WEKA) repository tool, for the most recent and best algorithms
Tests: Increasingly Complex
• 26 patterns presented one at a time
  – All methods recognize 100%
• Choose 2 letters, present simultaneously
  – Either take the union (logical 'or') of the features, or add the features
• Choose 4 letters, present simultaneously
  – Either add or 'or' the features
  – Include repeats in the add case (e.g. 'A+A+A+A')

[Figure: composite inputs presented to the networks, e.g. union A|B = 01001... or 11000... = 11001..., and addition A+B = 01001... + 11000... = 12001...]

Combinations tested: 325 (two letters), 14,950 (four letters), 456,976 (four letters including repeats).
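A small sketch of how such composite test inputs could be built; the truncated 5-feature vectors below are the fragments shown on the slides, standing in for the full letter feature vectors:

```python
import numpy as np
from itertools import combinations

# Truncated letter vectors from the slides (stand-ins for the full feature vectors).
letters = {
    'A': [0, 1, 0, 0, 1],
    'B': [1, 1, 0, 0, 0],
    'C': [0, 1, 0, 1, 1],
    'D': [1, 0, 1, 0, 1],
    'E': [1, 1, 0, 1, 1],
}

def union(*names):
    """Logical-'or' composite: a feature is present if any chosen letter has it."""
    return np.maximum.reduce([np.array(letters[k]) for k in names])

def addition(*names):
    """Additive composite: feature intensities of the chosen letters sum up."""
    return np.sum([np.array(letters[k]) for k in names], axis=0)

print(union('A', 'B'))      # [1 1 0 0 1] -> A|B = 11001...
print(addition('A', 'B'))   # [1 2 0 0 1] -> A+B = 12001...
# With all 26 letters there are 325 unordered pairs and 14,950 unordered quadruples.
print(len(list(combinations(range(26), 2))), len(list(combinations(range(26), 4))))
```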
Two Patterns Simultaneously

[Chart: % of combinations vs letters correctly classified (0/2, 1/2, 2/2), comparing feedback inhibition and lateral inhibition over 325 combinations]

• Train 26 nodes
• Test with 2 patterns
• Do the 2 top nodes match?
Four-pattern union: e.g. A | C | D | E = 01001... or 01011... or 10101... or 11011... = 11111... is presented to the network.
Simultaneous Patterns: Union of Four Patterns

[Chart: % of combinations vs letters correctly classified (0/4 through 4/4), comparing feedback inhibition and lateral inhibition over 14,950 combinations]

• Same 26 nodes
• Test with 4 patterns
• Do the 4 top nodes match?
Union of Five Patterns

[Chart: % of combinations vs letters correctly classified (0/5 through 5/5), comparing feedback inhibition and lateral inhibition over 65,780 combinations]

• Same 26 nodes
• Test with 5 patterns
• Do the 5 top nodes match?
Pattern addition: e.g. A + C + D + E = 01001... + 01011... + 10101... + 11011... = 23124... is presented to the network.
Addition of Four Patterns

Addition improves feedback inhibition performance further.

[Chart: % of combinations vs letters correctly classified (0/4 through 4/4); comparison methods shown include lateral inhibition, synaptic plasticity, and pre-synaptic inhibition; 14,950 combinations]

• Same 26 nodes
• Test with 4 patterns
• Do the 4 top nodes match?
Addition of Eight Patterns

[Chart: % of combinations vs letters correctly classified (0/8 through 8/8), comparing feedback inhibition and lateral inhibition over 1,562,275 combinations]

• Same 26 nodes
• Test with 8 patterns
• Do the 8 top nodes match?
With Addition, the Feedback Algorithm Can Count

• Repeated patterns are reflected by the value of the corresponding nodes
• Example: A + B + B + C = 01001... + 11000... + 11000... + 01011... = 24012... yields node values A = 1, B = 2, C = 1, and D through Z = 0
• 100% correct over 456,976 combinations
Tested on Random Patterns
• 50 randomly generated patterns
• From 512 features
• 4 presented at a time
• 6,250,000 combinations (including repeats)
• 100% correct including count
• Computer starts getting slow at this scale
The vector A+B = 12001... represents 'A' and 'B' together; the vector A+C+D+E = 23124... represents 'A', 'C', 'D', and 'E' together.
Insight: What if Conventional Algorithms are Trained for this Task?

• Teach pairs: 325 combinations
• Teach triples: 2,600 combinations
• Teach quadruples: 14,950 combinations
• Training complexity increases combinatorially

[Figure: conventional network with weights w11 through w43 connecting inputs x1-x4 to outputs Y1-Y3; with 26 letters, training on all combinations is not practical]

Furthermore, ABCD can be misinterpreted as AB & CD, or as ABC & D.
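A quick check (my own) of how the number of training combinations grows with the number of simultaneous letters:

```python
from math import comb

# Distinct unordered k-letter combinations from the 26-letter alphabet.
for k in (2, 3, 4, 8):
    print(k, comb(26, k))   # 2: 325   3: 2600   4: 14950   8: 1562275
```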
Insight: Training Difficulty Given Simultaneous Patterns

This is known as the 'Superposition Catastrophe' (Rosenblatt 1962; Rachkovskij & Kussul 2001). Feedback inhibition inference seems to avoid this problem.
Simultaneous Representations Cause the Binding Problem

Chunking features: computer algorithms have similar problems with simpler representations.

[Figure: outputs y1 'Wheels', y2 'Barbell', y3 'Car Chassis' built from inputs x1, x2, x3]

Given a composite scene, all of these patterns are matched unless the network is explicitly trained otherwise. However, it is a binding error to call this a barbell.
Binding Comparison

[Chart: vector activity (0 to 1) of y1 'Wheels', y2 'Barbell', y3 'Car Chassis' under feedback inhibition vs lateral inhibition]

Binding: Network-Wide Solution

  Inputs (x1, x2, x3)    Outputs (y1, y2, y3)    Interpretation
  1, 0, 0                1, 0, 0                 Wheels
  1, 1, 0                0, 1, 0                 Barbell
  1, 1, 1                1, 0, 1                 Car (wheels + car chassis), not barbell
Network Under Dynamic Control
Recognition inseparable from attention
Feedback: an automatic way to access inputs
‘Symbolic’ control via bias
Symbolic Effect of Bias

Is the barbell present? Interested in y2: bias y2 = 0.15.

  Inputs (x1, x2, x3)    Outputs (y1, y2, y3)    Interpretation
  1, 1, 1                0.02, 0.98, 0.71        Barbell
Summary
• Feedback inhibition combined with intrinsic plasticity generates a ‘systems’ plasticity that looks like synaptic plasticity
• Feedback inhibition gives algorithms more flexibility with simultaneous patterns
• Brain processing and learning are still unclear; a paradigm shift is likely needed
Acknowledgements
Eyal Amir
Cyrus Omar, Dervis Vural, Vivek Srikumar
Intelligence Community Postdoc Program & National Geospatial-Intelligence Agency