An Introduction to Complex-Valued RecurrentCorrelation Neural Networks
Marcos Eduardo Valle
Department of Applied MathematicsInstitute of Mathematics, Statistics, and Scientific Computing
University of Campinas - Brazil
July 11, 2014
Marcos Eduardo Valle (Brazil) CV-RCNNs July 11, 2014 1 / 23
for some (fixed) continuous nondecreasing function f : [−n, n] → R.

Theorem (Convergence of the CV-RCNNs). The sequence produced by a CV-RCNN always converges.
Examples:

The correlation CV-RCNN is obtained by considering
f_c(x) = x/n.

The exponential CV-RCNN is obtained by considering
f_e(x) = e^(αx/n), α > 0.

The high-order CV-RCNN is obtained by considering
f_h(x) = (1 + x/n)^q, where q > 1 is an integer.

The potential-function CV-RCNN is obtained by considering
f_p(x) = 1/(1 − x/n)^L, L ≥ 1.
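For concreteness, the four excitation functions above can be written down directly in code. This is a sketch: the function names and the choice of passing the vector length n as an argument are mine, and the default parameter values match the comparison slide later in the talk.

```python
import math

def f_c(x, n):
    """Correlation: f_c(x) = x / n."""
    return x / n

def f_e(x, n, alpha=10.0):
    """Exponential: f_e(x) = exp(alpha * x / n), alpha > 0."""
    return math.exp(alpha * x / n)

def f_h(x, n, q=10):
    """High-order: f_h(x) = (1 + x/n) ** q, with integer q > 1."""
    return (1.0 + x / n) ** q

def f_p(x, n, L=10):
    """Potential-function: f_p(x) = 1 / (1 - x/n) ** L, L >= 1."""
    return 1.0 / (1.0 - x / n) ** L

# All four are continuous and nondecreasing in x on [-n, n); f_p diverges as
# x -> n, which concentrates the recall on the best-matching stored vector.
n = 100
print(f_c(50, n))  # 0.5
```

Note how strongly the four functions differ near x = n: f_c grows linearly while f_e, f_h, and f_p sharply emphasize memories that are highly correlated with the current state.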
Experiments: Exponential CV-RCNN
[Figure: output error versus input error of the exponential CV-RCNN for α = 1, 3, 5, 10, 20, and 30.]
Experiments: High-order CV-RCNN
[Figure: output error versus input error of the high-order CV-RCNN for q = 2, 3, 5, 10, 20, and 30.]
Experiments: Potential-function CV-RCNN
[Figure: output error versus input error of the potential-function CV-RCNN for L = 1, 3, 5, 10, 20, and 30.]
Comparison between the four CV-RCNNs
Parameters: q = 10, L = 10, and α = 10.
[Figure: output error versus input error of the correlation, high-order, potential-function, and exponential CV-RCNNs.]
General remarks:
The high-order and exponential CV-RCNNs failed to perfectly recall a fundamental vector for small values of the parameters q and α, i.e., for q, α ≤ 5.
The error correction capability of the high-order and exponential CV-RCNNs has a similar dependence on the parameters.
The vector recalled by the potential-function CV-RCNN is always very similar to the desired fundamental vector u1.
The correlation CV-RCNN failed to retrieve the fundamental vector u1 in many trials.
The high-order, potential-function, and exponential CV-RCNNs, besides apparently achieving perfect recall of undistorted vectors, exhibited similar error correction capability for large values of their parameters, i.e., for q, L, α ≥ 10.
In the following, we focus on the exponential CV-RCNN.
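The recall dynamics of the exponential CV-RCNN can be sketched as follows. The update mirrors the standard recurrent-correlation scheme of Chiueh and Goodman carried over to complex vectors; taking the real part of the Hermitian inner product and renormalizing each component onto the unit circle are my assumptions for this sketch, not definitions taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def exp_cvrcnn_recall(z, U, alpha=10.0, max_iter=100):
    """Sketch of an exponential CV-RCNN recall loop (not the paper's exact
    definition).  U holds the p stored unit-modulus vectors as rows; z is
    the (possibly corrupted) complex input of length n."""
    n = U.shape[1]
    for _ in range(max_iter):
        # excitation of each memory: f_e applied to the real part of the
        # correlation between the current state and each stored vector
        w = np.exp(alpha * np.real(U.conj() @ z) / n)
        a = w @ U                      # weighted superposition of memories
        z_next = a / np.abs(a)         # project back onto the unit circle
        if np.allclose(z_next, z):
            break                      # stationary state reached
        z = z_next
    return z

# store p random phase vectors of length n
n, p = 64, 8
U = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(p, n)))

# corrupt a few components of u^1 and try to recall it
z0 = U[0].copy()
z0[:5] = np.exp(1j * rng.uniform(0, 2 * np.pi, 5))
z_star = exp_cvrcnn_recall(z0, U)
print(np.max(np.abs(z_star - U[0])))  # small: u^1 is recovered
```

With α = 10 the weight of the best-matching memory dominates the superposition exponentially, which is one way to read why the exponential network tolerates large input errors in the experiments above.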
On the Effect of the Activation Function
Let us compare the effect of the three activation functions σ, csgn, and csgm in the exponential CV-RCNN with α = 10.
For p = 12, the three variations of the exponential CV-RCNN always recalled one of the fundamental vectors.
We then increased the number of fundamental vectors considerably, from 12 to 4096.
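For reference, here is one common way two of these activations on the complex unit circle are defined: csgn quantizes the phase onto the K-th roots of unity, and σ normalizes a nonzero complex number to unit modulus. The exact conventions used in the talk (sector offsets for csgn, and the definition of csgm, which is omitted here) are not in the transcript, so treat this as an assumption.

```python
import cmath
import math

def csgn(z, K):
    """Complex signum: map z to the K-th root of unity whose phase sector
    contains arg(z).  One common convention; the talk's may differ."""
    theta = cmath.phase(z) % (2 * math.pi)   # phase in [0, 2*pi)
    k = int(theta * K / (2 * math.pi)) % K   # index of the phase sector
    return cmath.exp(2j * math.pi * k / K)

def sigma(z):
    """Continuous activation: normalize z onto the unit circle."""
    return z / abs(z) if z != 0 else complex(1.0)

# csgn discretizes the phase, sigma keeps it continuous
print(csgn(1 + 1j, 4))   # quantized to a 4th root of unity
print(abs(sigma(3 + 4j)))  # 1.0
```

The parameter K (the number of phase states) controls the trade-off between the resolution of the stored phase values and the noise tolerance of each sector.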
Probability of Successful Recall
[Figure: recall probability versus epsilon for the csgm, csgn, and σ activation functions.]
Recall Phase of the Exponential CV-RCNN
Number of times the exponential CV-RCNN recalls the stored vector closest to the input vector with respect to the Euclidean distance.
[Figure: recall frequency for p = 12 and p = 4096 fundamental vectors, with α = 10, 20, and 30.]
The storage capacity of the exponential CV-RCNN appears to scale exponentially with the length n of the vectors.
Concluding Remarks
We generalized the bipolar RCNNs using complex numbers.
A CV-RCNN is characterized by a continuous nondecreasing function f.
The sequence produced by a CV-RCNN always converges to a stationary state.
Preliminary computational experiments revealed that the storage capacity of the exponential CV-RCNN scales exponentially with the length of the stored vectors.

A detailed account of CV-RCNNs has been submitted for publication in IEEE TNNLS.
Thank you!