Conditions for noise reduction and stable encoding of spatial structure by cortical neural networks

Boris S. Gutkin¹ and Charles E. Smith²

¹Center for Neural Basis of Cognition and Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15217, email: [email protected]
²Biomathematics Programme, Department of Statistics, North Carolina State University, Raleigh, NC 27695, email: [email protected]

1. Introduction.

Several works have suggested that cortical sensory networks are robust encoders that reduce uncertainty in the stimuli by reducing noise (Douglas et al. 1995) while accentuating the spatial structure of the stimulus (Shamma 1989). In a recent report, Amit and Brunel (1997) suggest that the stimulus-encoding ability of the cortex (e.g. the prefrontal cortex coding with attractors) takes place in the context of high spontaneous cortical activity. The authors further suggest that, given a relative balance between the excitatory and inhibitory inputs impinging on a single neuron, the stochastic background is stable. In this work we address the issue of robust encoding of spatially structured stimuli by recurrent "cortical" neural networks. To do so we develop mathematical conditions for the stability of such networks when the spatially structured stimulus is corrupted by additive noise.
Here the diagonal elements are the variances and the off-diagonal elements are the spatial covariances between the different units. Now if the eigenvalues λ_j are all less than 1, the second term tends to zero, leaving the stationary covariance.
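The covariance recursion above can be checked numerically. The sketch below is our illustration, not the paper's code; the connection matrix and the noise covariance are arbitrary assumptions, chosen only so that all eigenvalues of W lie inside the unit circle:

```python
import numpy as np

# Build an arbitrary connection matrix and rescale it so its spectral
# radius is 0.9, i.e. all eigenvalues have modulus less than 1.
rng = np.random.default_rng(0)
N = 8
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

Q = 0.5 * np.eye(N)  # covariance of the additive input noise (sigma^2 I)

# Iterate the covariance recursion V_{k+1} = W V_k W^T + Q.
V = np.zeros((N, N))
for _ in range(500):
    V = W @ V @ W.T + Q

# Because the eigenvalues of W lie inside the unit circle, V converges to
# the fixed point of the discrete Lyapunov equation V = W V W^T + Q.
residual = np.max(np.abs(V - (W @ V @ W.T + Q)))
print(residual)  # effectively zero: the covariance has reached stationarity
```

With a spectral radius above 1 the same iteration diverges, which is precisely the instability that the eigenvalue condition rules out.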
We now look at a particular example of system (4.1) that is consistent with models of sensory cortical networks exhibiting on-center excitation and lateral inhibition. Let identical connection patterns be associated with each unit in the network, and let these patterns be symmetric about the center unit; then the system under consideration is homogeneous and symmetric, with the connection matrix becoming a circulant. The eigenvectors and eigenvalues can be expressed as:
(4.9) \quad \mathbf{v}_j = \left[\, 1,\; e^{2\pi i j/N},\; \dots,\; e^{2\pi i (N-1)j/N} \,\right]^{T}, \qquad \lambda_j = \sum_{s=0}^{N-1} w_s\, e^{2\pi i j s/N}
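As a numerical check of the circulant eigendecomposition, the DFT formula for λ_j can be compared against a direct eigenvalue computation. This sketch is ours; the symmetric on-center/off-surround weight pattern w is an arbitrary assumption:

```python
import numpy as np

# First row of the circulant: on-center excitation, off-center inhibition,
# symmetric about the center unit (w[s] == w[N-s]).
w = np.array([0.5, -0.1, -0.05, 0.0, 0.0, 0.0, -0.05, -0.1])
N = len(w)
W = np.array([[w[(j - i) % N] for j in range(N)] for i in range(N)])

# Eigenvalues from (4.9): lambda_j = sum_s w_s exp(2*pi*i*j*s/N).
lam_dft = np.array([sum(w[s] * np.exp(2j * np.pi * j * s / N) for s in range(N))
                    for j in range(N)])
lam_np = np.linalg.eigvals(W)  # direct computation for comparison

# For a symmetric pattern the eigenvalues are real, and the two methods agree.
print(np.allclose(np.sort(lam_dft.real), np.sort(np.asarray(lam_np).real)))  # True
```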
The mean steady state can now be explicitly calculated according to
(4.10) \quad \bar{x}_k = \sum_{j=0}^{N-1} \frac{\mu_j}{1-\lambda_j}\, e^{2\pi i j k/N}, \qquad \text{where} \quad \mu_j = \frac{1}{N} \sum_{k=0}^{N-1} p_k\, e^{-2\pi i j k/N}
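The Fourier expression (4.10) can be verified against a direct linear solve of the fixed-point equation (I − W) x̄ = p. This is our sketch; the weight row and the cosine-shaped input are illustrative assumptions:

```python
import numpy as np

w = np.array([0.4, -0.1, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1])  # symmetric circulant row
N = len(w)
W = np.array([[w[(j - i) % N] for j in range(N)] for i in range(N)])
p = np.cos(2 * np.pi * np.arange(N) / N)  # spatially structured, time-invariant input

lam = np.fft.fft(w)     # circulant eigenvalues (real here, since w is symmetric)
mu = np.fft.fft(p) / N  # spatial Fourier coefficients of the input, as in (4.10)

# Mean steady state from (4.10): x_k = sum_j mu_j / (1 - lambda_j) e^{2 pi i j k / N}.
x_fourier = (N * np.fft.ifft(mu / (1 - lam))).real

# Direct solve of the fixed-point equation (I - W) x = p.
x_direct = np.linalg.solve(np.eye(N) - W, p)

print(np.allclose(x_fourier, x_direct))  # True
```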
The variance matrix for such a network can be readily expressed in terms of the elements of the eigenvectors and the corresponding eigenvalues. In fact, since the eigenvectors are orthonormal, if we further restrict the eigenvalues to be positive, the stationary variances are less than the variance of the input. That is,
(4.11) \quad \frac{\sigma^2}{2}\,(1-\lambda_s) < \sigma^2 \qquad \forall\, s = 1, \dots, N
and term-by-term the variance matrix obeys the inequality
(4.12) \quad [V_x(\infty)]_{jj} < \sigma_p^2

Thus we have the following result.
Theorem 3.
Let p be a time-invariant, spatially structured input perturbed at each node of network (4.1) by stationary, delta-correlated, zero-mean noise. Let the following condition hold for the homogeneous system (4.1):
(i) All eigenvalues of the connection matrix W are positive and less than 1.
Then the network tends to statistical stationarity. That is:
a. (4.1) has a stationary mean state given by (4.10).
b. The covariance tends to stationarity.
c. The steady-state variances (4.11) are less than the variance of the input data.
We now show that the condition above is consistent with near balance of inhibition and excitation in a cortical network. Namely, let us consider a connection matrix W where the total sum of on-center excitation and off-center inhibition for each unit is some positive number close to zero and the lateral connection patterns are sparse and weak¹. We know from matrix theory that for symmetric circulant or Toeplitz matrices the eigenvalues are given by

\lambda_j = \sum_{s=0}^{N-1} w_s \cos(2\pi j s/N)
so the eigenvalues are bounded above by the sum of the weights. The Gershgorin theorem² states that for a matrix that is diagonally dominant (consistent with a diffusely connected network with relatively strong within-column excitation and weak, sparse pericolumnar inhibition) the eigenvalues obey
¹ This is consistent with connectivity observed in anatomical studies of a number of cortical regions (Lund, DeFelipe, Levitt, Mason, Jones, Thompson).
² Theorem (Gershgorin, or Spectral Radius for circulant matrices) (Swartz, 1973): For a diagonally dominant circulant matrix, the eigenvalues are contained in a circle of radius \sum_{j \neq i} |a_{ji}|, the sum of the absolute values of the off-diagonal elements, centered on |a_{jj}|, the absolute value of the diagonal element.
|w_{ii}| - \sum_{j \neq i} |w_{ij}| \;\le\; \max_j \lambda_j \;\le\; |w_{ii}| + \sum_{j \neq i} |w_{ij}|
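The Gershgorin bounds can be illustrated numerically. The sketch below is ours; the diagonally dominant on-center/off-surround weight row is an arbitrary assumption, and a small tolerance absorbs floating-point round-off when an eigenvalue sits exactly on the boundary of the disc:

```python
import numpy as np

# Diagonally dominant circulant row: |w_0| exceeds the summed magnitude
# of the lateral (off-diagonal) weights.
w = np.array([0.6, -0.1, -0.05, 0.0, 0.0, 0.0, -0.05, -0.1])
N = len(w)
W = np.array([[w[(j - i) % N] for j in range(N)] for i in range(N)])

off = np.sum(np.abs(w[1:]))  # sum of off-diagonal magnitudes per row
lo, hi = abs(w[0]) - off, abs(w[0]) + off

eigs = np.real(np.linalg.eigvals(W))  # real, since W is symmetric
tol = 1e-9                            # round-off allowance at the disc boundary

# Every eigenvalue lies in the Gershgorin interval [lo, hi].
print(np.all(eigs >= lo - tol) and np.all(eigs <= hi + tol))  # True
```

With these weights the interval is [0.3, 0.9], so all eigenvalues are positive and below 1, which is exactly condition (i) of Theorem 3.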
Now let us take a cortical network with excitatory self-connection and inhibitory lateral connectivity. When such a network is near a balance of excitation and inhibition we have \sum_s w_s \approx 0. Let us further assume that the inhibition is slightly weaker than the excitation (see footnote 1), so that the sum of the weights is positive; then the conditions for Theorem 3 hold as long as \sum_s |w_s| < 1. Interestingly, this implies a network that damps any inputs as opposed to amplifying them, a property that has recently been proposed for the primary somatosensory cortical circuits (Pinto '97).
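The damping claim follows from the max-norm bound ||Wx||_∞ ≤ (Σ_s |w_s|) ||x||_∞, since every row of the circulant has the same absolute sum. A small sketch (ours; the near-balanced weight row with Σ w_s = 0.2 > 0 and Σ|w_s| = 0.8 < 1 is an illustrative assumption):

```python
import numpy as np

# Near-balanced row: self-excitation 0.5, total lateral inhibition 0.3, so
# the weights sum to 0.2 > 0 while the absolute weights sum to 0.8 < 1.
w = np.array([0.5, -0.1, -0.05, 0.0, 0.0, 0.0, -0.05, -0.1])
N = len(w)
W = np.array([[w[(j - i) % N] for j in range(N)] for i in range(N)])

gain = np.sum(np.abs(w))  # max-norm of the circulant: each row's |.|-sum

rng = np.random.default_rng(1)
x = rng.normal(size=N)    # an arbitrary input pattern

# The network damps the input rather than amplifying it.
print(np.max(np.abs(W @ x)) <= gain * np.max(np.abs(x)))  # True
```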
5. Discussion.
In this work we develop analytical conditions that guarantee stable encoding of spatial structure in the presence of random perturbations by a recurrent neural network. A condition on the max-norm of the connection matrix and the derivative of the sigmoid guarantees B.I.B.O. statistical stability for a time-invariant input to a general network. For stimuli that fall into the linear region of the sigmoid, we show that a condition on the eigenvalues of the connection matrix results in a stable steady state in the mean and covariance. Such an assumption about the amplitude of the input is not at all unreasonable, especially in view of recent results by Bell et al. (1995) suggesting that optimal information transfer occurs in the linear region of the sigmoid. For the case of a homogeneous symmetric neural network, of which lateral inhibitory cortical networks are an example, we provide expressions for the mean steady state, the stationary covariance and the eigenvalues. We also present a condition that guarantees that the network will reduce the variance of the input upon reaching stationarity. Interestingly, the general condition of Section 3 assures the linear results for the symmetric, homogeneous network. Numerical simulations have supported the analytical results.
We must point out a caveat: the conditions on the eigenvalues of the connection matrix and the condition in Section 3 are only sufficient, and we have been able to observe stable encoding by networks that do not satisfy Theorem 2 or 3. A fruitful future direction would be to study the loss of stability as parametrized by some function of the connection weights and to construct the conditions necessary for stable encoding of spatial features by such recurrent neural networks. Furthermore, in this paper we do not address stability with respect to noise in the temporal structure of a stimulus, but only the stability of spatial structure in a temporally invariant stimulus; we recognise that much more work needs to be done on the noise-reducing properties of recurrent networks vis-à-vis temporally structured stimuli.
REFERENCES
Bell, A.J., Mainen, Z.F., Tsodyks, M., Sejnowski, T.J. (1995). What are the causes of cortical variability? Presented at Neural Dynamics Workshop, Prague.
Buhmann, J., Schulten, K. (1987). Influence of Noise on the Function of a