Self-organizing incremental neural network and its application
F. Shen (1), O. Hasegawa (2)
(1) National Key Laboratory for Novel Software Technology, Nanjing University
(2) Imaging Science and Engineering Lab, Tokyo Institute of Technology
June 14, 2009
F. Shen, O. Hasegawa Self-organizing incremental neural network and its application
Contents of this tutorial
1 What is SOINN
2 Why SOINN
3 Detailed algorithm of SOINN
4 SOINN for machine learning
5 SOINN for associative memory
6 References
What is SOINN
SOINN: Self-organizing incremental neural network
Represent the topological structure of the input data
Realize online incremental learning
Why SOINN
Background: Networks for topology representation
SOM (Self-Organizing Map): the structure and size of the network must be predefined
NG (Neural Gas): the network size must be predefined
GNG (Growing Neural Gas): the network size must be predefined; a constant learning rate leads to non-stationary results
Background: Networks for incremental learning
Incremental learning: learning new knowledge without destroying previously learned knowledge (the stability-plasticity dilemma)
ART (Adaptive Resonance Theory): needs a user-defined threshold
Multilayer perceptrons: learning new knowledge destroys old knowledge
Sub-network methods: need plenty of storage
Characteristics of SOINN
Neurons self-organize, with no predefined network structure or size
Adaptively finds a suitable number of neurons for the network
Realizes online incremental learning without any prior conditions
Finds typical prototypes for large-scale data sets
Robust to noise
Detailed algorithm of SOINN
Structure: Two-layer competitive network
First layer: competitive learning on the input data
Second layer: competitive learning on the output of the first layer
Output: the topology structure and the weight vectors of the second layer
Training flowchart of SOINN
Adaptively updated threshold
Between-class insertion
Update node weights
Within-class insertion
Remove noise nodes
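The flowchart above can be read as one training step per input. The sketch below is a deliberately simplified single-layer version (all names are illustrative, not from the original): it keeps only the winner search, the adaptive-threshold test for between-class insertion, edge creation, and the winner's weight update; within-class insertion and noise-node removal, which the full algorithm runs periodically, are omitted.

```python
from math import dist, inf

def soinn_step(xi, nodes, edges, thresholds, t):
    """One simplified SOINN input presentation, following the flowchart above.

    nodes:      dict node id -> weight vector (tuple of floats)
    edges:      set of frozenset node-id pairs (the learned topology)
    thresholds: dict node id -> similarity threshold T_i (defaults to +inf,
                matching the initialization of a new node)
    t:          presentation counter, used for the learning rate 1/t
    """
    if len(nodes) < 2:                      # bootstrap: first two inputs become nodes
        nodes[len(nodes)] = xi
        return
    ranked = sorted(nodes, key=lambda i: dist(xi, nodes[i]))
    s1, s2 = ranked[0], ranked[1]           # winner and second winner
    if dist(xi, nodes[s1]) > thresholds.get(s1, inf) or \
       dist(xi, nodes[s2]) > thresholds.get(s2, inf):
        nodes[max(nodes) + 1] = xi          # between-class insertion: xi starts a new class
        return
    edges.add(frozenset((s1, s2)))          # connect winner and second winner
    e1 = 1.0 / t                            # winner's learning rate
    nodes[s1] = tuple(w + e1 * (x - w) for w, x in zip(nodes[s1], xi))
```

With default (infinite) thresholds every input after the first two adapts the winner; once a node's threshold is set, a far-away input triggers between-class insertion instead.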
First layer: adaptively updating threshold Ti
Basic idea: within-class distance ≤ T ≤ between-class distance
1 Initialize: Ti = +∞ when node i is a new node.
2 When i is the winner or second winner, update Ti as follows:
If i has neighbors, Ti is the maximum distance between i and all of its neighbors:
Ti = max_{c ∈ Ni} ||Wi − Wc|| (1)
If i has no neighbors, Ti is the minimum distance between i and all other nodes in network A:
Ti = min_{c ∈ A\{i}} ||Wi − Wc|| (2)
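The update rule above translates directly to code. A minimal sketch (the dict-based `weights`/`neighbors` representation is an assumption of this example, not part of the original algorithm description):

```python
from math import dist, inf

def update_threshold(i, weights, neighbors):
    """Adaptive similarity threshold T_i for node i (Eqs. 1 and 2).

    weights:   dict node id -> weight vector (tuple of floats)
    neighbors: dict node id -> set of neighboring node ids
    """
    if neighbors.get(i):
        # Eq. (1): maximum distance between i and its topological neighbors
        return max(dist(weights[i], weights[c]) for c in neighbors[i])
    others = [c for c in weights if c != i]
    if not others:
        return inf  # a new node in an otherwise empty network keeps T_i = +inf
    # Eq. (2): minimum distance between i and every other node in network A
    return min(dist(weights[i], weights[c]) for c in others)
```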
Second layer: constant threshold Tc
Basic idea 1: within-class distance ≤ T ≤ between-class distance
Basic idea 2: we already have some knowledge of the input data from the results of the first layer.
Within-class distance of a class C:
dw = (1/NC) Σ_{(i,j) ∈ C} ||Wi − Wj|| (3)
Between-class distance of two classes Ci and Cj:
db(Ci, Cj) = min_{i ∈ Ci, j ∈ Cj} ||Wi − Wj|| (4)
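The two distances above can be sketched as follows. Note an assumption: NC in Eq. (3) is read here as the number of node pairs being averaged over, which makes dw a mean pairwise distance; the slide leaves NC's exact definition implicit.

```python
from itertools import combinations
from math import dist

def within_class_distance(cluster):
    """Eq. (3): mean pairwise distance d_w over all node pairs in one cluster,
    assuming N_C counts the pairs (see the note above)."""
    pairs = list(combinations(cluster, 2))
    return sum(dist(a, b) for a, b in pairs) / len(pairs)

def between_class_distance(ci, cj):
    """Eq. (4): minimum distance d_b between clusters Ci and Cj
    (single-linkage distance)."""
    return min(dist(a, b) for a in ci for b in cj)
```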
Second layer: constant threshold Tc (continued)
1 Set Tc as the minimum between-class distance:
Tc = db(Ci1, Cj1) = min_{k,l = 1,...,Q, k ≠ l} db(Ck, Cl) (5)
2 If Tc is less than the within-class distance dw, set Tc as the next smallest between-class distance:
Tc = db(Ci2, Cj2) = min_{k,l = 1,...,Q, k ≠ l, k ≠ i1, l ≠ j1} db(Ck, Cl) (6)
3 Repeat step 2 to update Tc until Tc is greater than dw.
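Steps 1-3 above amount to picking the smallest between-class distance that still exceeds dw. A minimal sketch (the fallback when no candidate exceeds dw is this example's own assumption; the slides do not specify that case):

```python
from itertools import combinations
from math import dist

def constant_threshold(clusters, d_w):
    """Steps 1-3 above: T_c is the smallest between-class distance
    (Eq. 5, then Eq. 6, ...) that exceeds the within-class distance d_w."""
    def d_b(ci, cj):  # Eq. (4): single-linkage distance between two clusters
        return min(dist(a, b) for a in ci for b in cj)

    candidates = sorted(d_b(ci, cj) for ci, cj in combinations(clusters, 2))
    for d in candidates:
        if d > d_w:
            return d
    return candidates[-1]  # assumption: if nothing exceeds d_w, keep the largest
```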
Updating learning rates ε1(t) and ε2(t)
Update of the weight vectors:
ΔWs1 = ε1(t)(ξ − Ws1) (7)
ΔWi = ε2(t)(ξ − Wi), ∀i ∈ Ns1 (8)
After the size of the network becomes stable, fine-tune the network.
Stochastic approximation: a number of adaptation steps with a strength ε(t) decaying slowly, but not too slowly, i.e., Σ_{t=1..∞} ε(t) = ∞ and Σ_{t=1..∞} ε²(t) < ∞.
The harmonic series satisfies these conditions:
ε1(t) = 1/t, ε2(t) = 1/(100t) (9)
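Eqs. (7)-(9) above can be sketched as a single adaptation step (the dict-based state and the function name are this example's assumptions):

```python
def adapt_weights(xi, s1, weights, neighbors, t):
    """Eqs. (7)-(9): pull the winner s1 toward the input xi with rate
    e1(t) = 1/t, and each of its neighbors with rate e2(t) = 1/(100 t)."""
    e1, e2 = 1.0 / t, 1.0 / (100.0 * t)
    # Eq. (7): W_s1 += e1 * (xi - W_s1)
    weights[s1] = tuple(w + e1 * (x - w) for w, x in zip(weights[s1], xi))
    # Eq. (8): W_i += e2 * (xi - W_i) for every neighbor i of s1
    for i in neighbors.get(s1, ()):
        weights[i] = tuple(w + e2 * (x - w) for w, x in zip(weights[i], xi))
```

Because Σ 1/t diverges while Σ 1/t² converges, the node weights can keep moving as long as needed yet settle down asymptotically.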
F. Shen, O. Hasegawa Self-organizing incremental neural network and its application
Page 56
Single-layer SOINN
For topology representation, the first layer is enough.
Within-class insertion occasionally happens in the first layer.
Subclass and density information is used to judge whether a connection is needed.
Page 62
Artificial data set: topology representation
Stationary and non-stationary environments
Stationary: all training data obey the same distribution.
Non-stationary: the next training sample may obey a different distribution from the previous one.
(Figures: original data; stationary result; non-stationary result)
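The two training regimes can be made concrete with a toy data generator. A minimal sketch, assuming two well-separated Gaussian classes; the distributions and names are illustrative, not taken from the experiments.

```python
import random

def stationary_stream(n, classes):
    """Stationary: every sample is drawn from the same fixed mixture,
    so all classes are seen throughout training."""
    return [random.gauss(*random.choice(classes)) for _ in range(n)]

def nonstationary_stream(n, classes):
    """Non-stationary: the source distribution changes over time, so a
    sample may obey a different distribution from the previous one
    (here, the classes are presented one after another)."""
    per_phase = n // len(classes)
    out = []
    for mu, sigma in classes:
        out.extend(random.gauss(mu, sigma) for _ in range(per_phase))
    return out
```

An incremental learner must handle the second stream without forgetting the classes seen in earlier phases.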
Page 69
Artificial data set: topology representation (continued)
(Figures: original data; two-layer SOINN result; single-layer SOINN result)
Conclusion of the experiments: SOINN is able to
Represent the topological structure of the input data.
Realize incremental learning.
Automatically learn the number of nodes, remove noise, etc.
Page 77
1 What is SOINN
2 Why SOINN
3 Detail algorithm of SOINN
4 SOINN for machine learning
5 SOINN for associative memory
6 References
Page 78
Some objectives of unsupervised learning
Automatically learn the number of classes in the input data
Cluster with no a priori knowledge
Topology representation
Realize real-time incremental learning
Separate classes whose overlapping areas have low density
Page 84
SOINN for unsupervised learning: if two nodes are connected by a path, they belong to the same class
1 Run SOINN on the input data; output the topology representation of the nodes.
2 Initialize all nodes as unclassified.
3 Randomly choose one unclassified node i from the node set A. Mark node i as classified and label it as class Ci.
4 Search A to find all unclassified nodes that are connected to node i by a path. Mark these nodes as classified and label them with the same class as node i.
5 Go to Step 3 and continue the classification process until all nodes are classified.
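The steps above amount to connected-component labeling on the SOINN graph. A minimal sketch, where the random choice in Step 3 is replaced by iteration order (which yields the same partition); the names are illustrative.

```python
from collections import deque

def label_classes(nodes, edges):
    """Assign a class id to every node: nodes joined by a path of edges
    receive the same id (Steps 2-5 above, via breadth-first search)."""
    adj = {i: set() for i in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    label = {}          # node -> class id
    next_id = 0
    for i in nodes:
        if i in label:  # already classified
            continue
        label[i] = next_id
        queue = deque([i])
        while queue:    # flood-fill the path-connected component
            u = queue.popleft()
            for v in adj[u]:
                if v not in label:
                    label[v] = next_id
                    queue.append(v)
        next_id += 1
    return label
```

The number of distinct labels returned is the number of classes SOINN reports.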
Page 90
Artificial data set: 5 classes with 10% noise
(Figures: original data; clustering result)
Conclusion of the experiments
Automatically reports the number of classes.
Clusters data of different shapes and distributions perfectly.
Finds typical prototypes; incremental learning; noise removal; etc.
Page 96
Face recognition: AT&T face data set
Experiment results
Automatically reports that there are 10 classes.
Prototypes of every class are reported.
With such prototypes, the recognition ratio (1-NN rule) is 90%.
Page 101
Prototype-based classifiers: based on the 1-NN or k-NN rule
Nearest Neighbor Classifier (NNC): all training data as prototypes
Nearest Mean Classifier (NMC): the mean of each class as the prototype
k-means classifier (KMC), Learning Vector Quantization (LVQ), and others: predefine the number of prototypes for every class
Main difficulties
1 How to find enough prototypes without overfitting
2 How to realize incremental learning
Incremental learning of new data within one class (non-stationary, or concept drift);
Incremental learning of new classes.
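The 1-NN rule shared by these classifiers is easy to sketch. Below, NMC prototypes are computed as class means, while an NNC simply uses the labeled training set itself as the prototype list; this is illustrative code under those assumptions, not tied to any particular library.

```python
def nearest_prototype(x, prototypes):
    """1-NN rule: return the label of the prototype closest to x
    (squared Euclidean distance)."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    best_label, _ = min(((lab, dist2(x, p)) for lab, p in prototypes),
                        key=lambda pair: pair[1])
    return best_label

def nmc_prototypes(data):
    """Nearest Mean Classifier: one prototype per class, the class mean."""
    sums, counts = {}, {}
    for lab, p in data:
        counts[lab] = counts.get(lab, 0) + 1
        prev = sums.get(lab, [0.0] * len(p))
        sums[lab] = [s + a for s, a in zip(prev, p)]
    return [(lab, [s / counts[lab] for s in vec])
            for lab, vec in sums.items()]
```

An NNC is `nearest_prototype(x, training_data)` directly; SOINN-based classifiers sit between these two extremes, learning the number of prototypes per class from the data.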
F. Shen, O. Hasegawa Self-organizing incremental neural network and its application
Page 102
ContentsWhat is SOINN
Why SOINNDetail algorithm of SOINN
SOINN for machine learningSOINN for associative memory
References
Unsupervised learningSupervised learningSemi-supervised learningActive learning
Prototype-based classifier: based on 1-NN or k-NN rule
Nearest Neighbor Classifier (NNC): all training data asprototypesNearest Mean Classifier (NMC): mean of each class asprototypesk-means classifier (KMC), Learning Vector Quantization(LVQ), and others: predefine number of prototypes for everyclass.
Main difficulty
1 How to find enough prototypes without overfitting2 How to realize Incremental learning
Incremental of new data inside one class (non-stationary orconcept drift);Incremental of new classes.
F. Shen, O. Hasegawa Self-organizing incremental neural network and its application
Page 103
ContentsWhat is SOINN
Why SOINNDetail algorithm of SOINN
SOINN for machine learningSOINN for associative memory
References
Unsupervised learningSupervised learningSemi-supervised learningActive learning
Prototype-based classifier: based on 1-NN or k-NN rule
Nearest Neighbor Classifier (NNC): all training data asprototypesNearest Mean Classifier (NMC): mean of each class asprototypesk-means classifier (KMC), Learning Vector Quantization(LVQ), and others: predefine number of prototypes for everyclass.
Main difficulty
1 How to find enough prototypes without overfitting2 How to realize Incremental learning
Incremental of new data inside one class (non-stationary orconcept drift);Incremental of new classes.
F. Shen, O. Hasegawa Self-organizing incremental neural network and its application
SOINN for supervised learning: Targets
Automatically learn the number of prototypes needed to represent every class
Only the prototypes needed to determine the decision boundary are retained
Realize both types of incremental learning
Robust to noise
Adjusted SOINN Classifier (ASC)
SOINN learns k for k-means.
Noise reduction removes noisy prototypes.
Center cleaning removes prototypes that are not useful for the decision boundary.
ASC: noise-reduction & center-cleaning
Noise-reduction
If the label of a node differs from the majority vote of the labels of its k nearest neighbors, the node is considered an outlier and removed.
Center-cleaning
If a prototype of class i is never the nearest prototype for samples of other classes, remove it: it lies far from the decision boundary and does not affect classification.
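Under these two rules, the cleaning stage of ASC might be sketched as below (an illustrative reading of the slide, not the published implementation; helper names and the k value are hypothetical):

```python
import numpy as np

def noise_reduction(protos, labels, k=2):
    """Drop prototypes whose label loses the majority vote of
    their k nearest neighbouring prototypes (outlier removal)."""
    keep = []
    for i, p in enumerate(protos):
        d = np.linalg.norm(protos - p, axis=1)
        nbrs = np.argsort(d)[1:k + 1]            # skip the node itself
        if labels[i] == np.argmax(np.bincount(labels[nbrs])):
            keep.append(i)
    return protos[keep], labels[keep]

def center_cleaning(protos, labels, samples, sample_labels):
    """Keep only prototypes that are ever the nearest foreign-class
    prototype for some training sample, i.e. those near the boundary."""
    used = set()
    for x, y in zip(samples, sample_labels):
        foreign = np.where(labels != y)[0]       # prototypes of other classes
        d = np.linalg.norm(protos[foreign] - x, axis=1)
        used.add(int(foreign[np.argmin(d)]))
    idx = sorted(used)
    return protos[idx], labels[idx]
```

On a 1-D toy set with two classes, center-cleaning keeps only the prototype of each class closest to the other class, which matches the boundary-focused intent described above.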
Experiment results: artificial data (I)
[Figures: original data, SOINN results, ASC results]
Test results of ASC
No. of prototypes = 6; Recognition ratio = 100%.
Experiment results: artificial data (II)
[Figures: original data, SOINN results, ASC results]
Test results of ASC
No. of prototypes = 86; Recognition ratio = 98%.
Experiment results: artificial data (III)
[Figures: original data, SOINN results, ASC results]
Test results of ASC
No. of prototypes = 87; Recognition ratio = 97.8%.
Experiment results: optdigits
ASC with different parameter sets (a_d, λ); each entry is the average and standard deviation over 10 training runs

Parameter set {a_d, λ}   (50, 50)    (25, 25)    (10, 10)
Recognition ratio (%)    97.7 ± 0.2  97.4 ± 0.2  97.0 ± 0.2
No. of prototypes        377 ± 12    258 ± 7     112 ± 7
Compression ratio (%)    9.9 ± 0.3   6.8 ± 0.2   2.9 ± 0.2

Comparison with SVM and 1-NN
LibSVM: 1197 support vectors; recognition ratio = 96.6%.
1-NN: best classifier (98%), but uses all 3823 training samples as prototypes.
Experiment results: UCI repository data sets
Comparison results of ASC and other classifiers: recognition ratio
Data set         ASC (a_d, λ)  NSC (σ²_max)  KMC (M)     NNC (k)     LVQ (M)
Iris             97.4 ± 0.86   96.3 ± 0.4    96.2 ± 0.8  96.7 ± 0.6  96.1 ± 0.6
Breast cancer    97.4 ± 0.38   97.2 ± 0.2    95.9 ± 0.3  97.0 ± 0.2  96.3 ± 0.4
Ionosphere       90.4 ± 0.64   91.9 ± 0.8    87.4 ± 0.6  86.1 ± 0.7  86.4 ± 0.8
Glass            73.5 ± 1.6    70.2 ± 1.5    68.8 ± 1.1  72.3 ± 1.2  68.3 ± 2.0
Liver disorders  62.6 ± 0.83   62.9 ± 2.3    59.3 ± 2.3  67.3 ± 1.6  66.3 ± 1.9
Pima Indians     72.0 ± 0.63   68.6 ± 1.6    68.7 ± 0.9  74.7 ± 0.7  73.5 ± 0.9
Wine             82.6 ± 1.55   75.3 ± 1.7    71.9 ± 1.9  73.9 ± 1.9  72.3 ± 1.5
Average          82.3 ± 0.93   80.4 ± 1.2    78.3 ± 1.1  81.1 ± 0.99 79.9 ± 1.2

On average, ASC has the best recognition performance.
Experiment results: UCI repository data sets (continued)
Comparison results of ASC and other classifiers: compression ratio (%), with the best parameter values in parentheses

Data set         ASC (a_d*, λ*)  NSC (σ²_max*)  KMC (M*)  NNC (k*)  LVQ (M*)
Iris             5.2 (6, 6)      7.3 (0.25)     8.0 (4)   100 (14)  15 (22)
Breast cancer    1.4 (8, 8)      1.8 (35.0)     0.29 (1)  100 (5)   5.9 (40)
Ionosphere       3.4 (15, 15)    31 (1.25)      4.0 (7)   100 (2)   6.8 (24)
Glass            13.7 (15, 15)   97 (0.005)     17 (6)    100 (1)   45 (97)
Liver disorders  4.6 (6, 6)      4.9 (600)      11 (19)   100 (14)  8.4 (29)
Pima Indians     0.6 (6, 6)      1.7 (2600)     1.0 (4)   100 (17)  3.4 (26)
Wine             3.2 (6, 6)      96 (4.0)       29 (17)   100 (1)   32 (57)
Average          4.6             34.2           10.0      100       16.6

On average, ASC has the best (lowest) compression ratio.
Requirements of semi-supervised learning
Labeled instances are difficult, expensive, or time-consuming to obtain.
How can a system use a large amount of unlabeled data, together with limited labeled data, to build good classifiers?
New data are continually added to an already huge database.
How can a system learn new knowledge without forgetting previously learned knowledge?
SOINN used for semi-supervised learning
1 SOINN: represents the topology of the data and realizes incremental learning;
2 Labeled data: label the winner nodes;
3 Division of a cluster

Condition of division

R_{c-1} ≤ R_c and R_c > R_{c+1}    (10)

R_c = Σ_{a ∈ N_c} dis(w_a, w_c)    (11)

where c−1 is the former node along the path and c+1 ranges over the unlabeled neighbors.
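Equations (10) and (11) can be read as a local-maximum test on the node radius R_c; a minimal sketch under that reading (hypothetical helper names, Euclidean distance assumed for dis):

```python
import numpy as np

def radius(w_c, neighbor_weights):
    """R_c (Eq. 11): sum of distances from node c to its topological neighbours."""
    w_c = np.asarray(w_c)
    return float(sum(np.linalg.norm(np.asarray(w_a) - w_c)
                     for w_a in neighbor_weights))

def should_divide(r_prev, r_c, r_next):
    """Division condition (Eq. 10): R_c is a local maximum along the path
    walked from a labeled node into its unlabeled neighbourhood."""
    return r_prev <= r_c and r_c > r_next
```

A node whose radius exceeds both its predecessor's and its unlabeled successors' marks a sparse region, so the cluster is divided there.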
Experiment: original data
5%, 15%, or 40% class overlap
500 training samples, 5,000 validation samples, and 5,000 test samples
labeled samples: 10% and 20% of the training data
light blue: unlabeled data; other colors: labeled data
dashed line: ideal decision boundary
Page 164
Experiment results
Classes are separated with only a few labeled samples.
On UCI data sets, the method works better than other typical methods.
Page 167
SOINN used for active learning
Target: actively ask for the labels of a few samples so that all classes can be labeled.
Idea:
1 Use SOINN to learn the topological structure of the input data.
2 Actively label the vertex nodes of every class.
3 Use the vertex nodes to label all nodes.
4 Actively label the nodes that lie in overlapping areas.
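Step 3 above, spreading the queried labels from the actively labeled vertex nodes to every node they can reach over the SOINN topology graph, can be sketched as a breadth-first label propagation. The adjacency-dictionary representation and function names are illustrative assumptions:

```python
from collections import deque

def propagate_labels(adjacency, seed_labels):
    """Spread labels from actively labeled (vertex) nodes to all reachable
    nodes over the topology graph via BFS.  `adjacency` maps a node to its
    neighbor list; `seed_labels` maps a labeled node to its queried class."""
    labels = dict(seed_labels)
    queue = deque(seed_labels)
    while queue:
        node = queue.popleft()
        for nb in adjacency[node]:
            if nb not in labels:          # first label to reach a node wins
                labels[nb] = labels[node]
                queue.append(nb)
    return labels

# Two connected components, one actively labeled seed per class.
adjacency = {0: [1], 1: [0, 2], 2: [1], 3: [4], 4: [3]}
labels = propagate_labels(adjacency, {0: "A", 3: "B"})
print(labels)  # nodes 0-2 get "A", nodes 3-4 get "B"
```

Because labels only travel along topology edges, each connected component (cluster) ends up uniformly labeled from a single query, which is why only the vertex nodes and the overlap-area nodes need to be asked.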
Page 174
Experiment: artificial data set under a stationary environment
Original data: four classes in all, with 10% noise.
Results: under a stationary environment, 10 teacher vectors are requested.
Page 177
Experiment: artificial data set under a non-stationary environment
16 teacher vectors are requested.
Page 179
Background · SOINN-AM · Experiments · General Associative Memory
1 What is SOINN
2 Why SOINN
3 Detail algorithm of SOINN
4 SOINN for machine learning
5 SOINN for associative memory
6 References
Page 180
Background: typical associative memory systems
Distributed learning associative memory:
Hopfield network: the most famous such network, for auto-associative memory.
Bidirectional associative memory (BAM): for hetero-associative memory.
Competitive learning associative memory:
KFMAM: Kohonen feature map associative memory.
Difficulties
Forgetting previously learned knowledge when learning new knowledge incrementally.
Storage limitation.
Memorizing real-valued data.
Many-to-many association.
Page 191
Objectives of SOINN-AM
Incremental learning of memory pairs.
Robustness to noisy data.
Handling of real-valued data.
Many-to-many association.
Page 197
Architecture of SOINN-AM
Page 198
Algorithms of SOINN-AM
Basic idea of the memory phase
1 Combine the key vector and the associate vector into one input vector.
2 Use SOINN to learn this input data.
Basic idea of the recall phase
1 Using the key part of the nodes, find the winner node for the key vector; let d be the distance.
2 If d ≤ ε, output the associative part of the winner as the recall result.
3 If d > ε, report "unknown" for the key vector.
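The recall phase above can be sketched as follows. The list-of-pairs node representation, the `eps` threshold name, and the use of Euclidean distance are illustrative assumptions:

```python
import numpy as np

def recall(nodes, key, eps):
    """Recall-phase sketch: each learned node is a (key_part, assoc_part)
    pair.  Find the winner by distance on the key part only; return its
    associative part if the distance is within eps, else 'unknown'."""
    dists = [np.linalg.norm(k - key) for k, _ in nodes]
    winner = int(np.argmin(dists))
    if dists[winner] <= eps:
        return nodes[winner][1]
    return "unknown"

# Two memorized key/associate pairs; the first query is close to a key,
# the second falls outside the threshold of every stored key.
nodes = [(np.array([0.0, 0.0]), "cat"), (np.array([1.0, 1.0]), "dog")]
print(recall(nodes, np.array([0.1, 0.0]), eps=0.5))   # "cat"
print(recall(nodes, np.array([5.0, 5.0]), eps=0.5))   # "unknown"
```

Reporting "unknown" for far-away keys is what lets the memory reject queries it never learned instead of forcing a spurious association.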
Page 206
Original data
Binary data
Real-valued data
Page 207
Comparison with typical AM systems
Page 208
Robustness to noise
Page 209
Many-to-many association testing
SOINN-AM recalls all patterns perfectly.
Page 210
Architecture and basic idea of GAM
Input layer: the key vector and the associate vector.
Memory layer: memorizes patterns with their classes.
Associate layer: builds associations between classes.
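A minimal sketch of this three-layer idea, assuming a nearest-prototype memory layer and a dictionary of directed class links for the associate layer; the class name, methods, and data layout are illustrative, not the paper's API:

```python
import numpy as np

class GAMSketch:
    """Toy GAM-style memory: classify an input against memory-layer
    prototypes, then follow associate-layer links between classes."""

    def __init__(self):
        self.prototypes = {}   # memory layer: class name -> prototype vector
        self.links = {}        # associate layer: class -> associated classes

    def memorize(self, cls, vector):
        self.prototypes[cls] = np.asarray(vector, dtype=float)

    def associate(self, key_cls, assoc_cls):
        self.links.setdefault(key_cls, set()).add(assoc_cls)

    def recall(self, vector):
        # nearest prototype decides the class of the input pattern,
        # then the associate layer yields the linked classes
        cls = min(self.prototypes,
                  key=lambda c: np.linalg.norm(self.prototypes[c] - vector))
        return cls, sorted(self.links.get(cls, ()))

gam = GAMSketch()
gam.memorize("smoke", [1.0, 0.0])
gam.memorize("fire", [0.0, 1.0])
gam.associate("smoke", "fire")
print(gam.recall([0.9, 0.1]))   # ('smoke', ['fire'])
```

Because the links live between classes rather than between individual patterns, one class can point to several others, which is how class-level association supports many-to-many recall.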
Page 214
Features of GAM
Memorizes classes: not just individual patterns.
Real-valued data: not limited to binary data.
Many-to-many association: not limited to one-to-one association.
Robust to noisy data.
Memorizes and recalls temporal sequences.
Incremental learning: static patterns or temporal sequences.
Page 222
References about SOINN
SOINN for unsupervised learning:
Furao Shen and Osamu Hasegawa, "An Incremental Network for On-line Unsupervised Classification and Topology Learning", Neural Networks, Vol.19, No.1, pp.90-106, (2005)
Furao Shen, Tomotaka Ogura and Osamu Hasegawa, "An enhanced self-organizing incremental neural network for online unsupervised learning", Neural Networks, Vol.20, No.8, pp.893-903, (2007)
SOINN for supervised learning:
Furao Shen and Osamu Hasegawa, "A Fast Nearest Neighbor Classifier Based on Self-organizing Incremental Neural Network", Neural Networks, Vol.21, No.10, pp.1537-1547, (2008)
Page 225
References about SOINN
SOINN for semi-supervised and active learning:
Youki Kamiya, Toshiaki Ishii, Furao Shen and Osamu Hasegawa, "An Online Semi-Supervised Clustering Algorithm Based on a Self-organizing Incremental Neural Network", IJCNN 2007, Orlando, FL, USA, August 2007
Furao Shen, Keisuke Sakurai, Youki Kamiya and Osamu Hasegawa, "An Online Semi-supervised Active Learning Algorithm with Self-organizing Incremental Neural Network", IJCNN 2007, Orlando, FL, USA, August 2007
SOINN for associative memory:
Akihito Sudo, Akihiro Sato and Osamu Hasegawa, "Associative Memory for Online Learning in Noisy Environments Using Self-organizing Incremental Neural Network", IEEE Transactions on Neural Networks, (2009), in press
Page 228
References about SOINN
Download SOINN papers and programs:
http://www.isl.titech.ac.jp/~hasegawalab/soinn.html