Stochastic dynamics of attractor networks with quantal synaptic
noise
Paul C. Bressloff
Department of Mathematical Sciences
Loughborough University
Leics.
LE11 3TU, U.K.
We develop a nonlinear stochastic model of an attractor neural network that explicitly incorporates details concerning quantal synaptic noise. Our approach is based on a discrete-time leaky-integrator model that preserves the pulse-like nature of neuronal firing patterns, thus allowing detailed modelling of synaptic processes. The basic system of equations is
$$
V_i(m+1) = \lambda V_i(m) + \sum_{j} w_{ij}(m)\, a_j(m) + h_i ,
$$
where $V_i(m)$ is the membrane potential of the $i$th neuron at time $m$, $a_j(m) \in \{0,1\}$ indicates whether neuron $j$ fires at time $m$, $h_i$ is a fixed external input and $\lambda$ is a decay rate. The connection weights satisfy
$$
w_{ij}(m) = \epsilon_{ij}\, q_{ij}\, N_{ij}(m),
$$
where $N_{ij}(m)$ is the number of vesicles released at time $m$, $q_{ij}$ is the size of a vesicle and $\epsilon_{ij}$ is the postsynaptic efficacy. Synaptic noise is incorporated by taking $N_{ij}(m)$ to be a random variable generated according to a binomial distribution. Fluctuations in the quantal size $q_{ij}$ can also occur.
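As an illustrative sketch (all parameter values, the network size, and the threshold firing rule below are assumptions for demonstration, not taken from the text), the discrete-time leaky-integrator dynamics with binomially distributed vesicle release can be simulated as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustrative only).
N = 3                              # number of neurons
lam = 0.8                          # decay rate
h = np.full(N, 0.1)                # fixed external inputs
kappa = 0.5                        # assumed firing threshold
eps = rng.uniform(-1, 1, (N, N))   # postsynaptic efficacies
q = 0.05                           # quantal (vesicle) size
n_sites = 10                       # release sites per synapse (binomial n)
p_rel = 0.5                        # release probability (binomial p)

V = np.zeros(N)                    # membrane potentials at m = 0
for m in range(1000):
    a = (V > kappa).astype(float)               # pulse-like firing states
    # Quantal noise: vesicle counts drawn from a binomial distribution.
    n_ves = rng.binomial(n_sites, p_rel, (N, N))
    w = eps * q * n_ves                         # stochastic weights
    V = lam * V + w @ a + h                     # leaky-integrator update
```

Since the decay rate is below one, each update is a contraction plus bounded noise, so the trajectory remains bounded while the weights fluctuate from step to step.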
We show how the dynamics can be formulated in terms of a random iterated function system. Conditions are derived for which the limiting behaviour of the system is described by an invariant probability measure; such a measure typically has a multifractal structure [1]. We then consider a continuous-time master equation obtained from the above model by taking the discrete time intervals to be drawn from a Poisson process. Introducing a global scaling parameter, we perform a small-fluctuation expansion [2] to approximate the master equation by a Fokker-Planck equation. We use this to estimate hopping rates between fixed points of the underlying attractor network and to study stochastic resonance effects [3].
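A one-neuron caricature makes the random iterated function system concrete (the parameter values below are illustrative assumptions, not from the text): with a single release site, each time step applies one of two affine contractions chosen at random, and a histogram of the orbit approximates the invariant probability measure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative one-neuron map: V -> lam*V + h + q*n_ves*a, with n_ves a
# Bernoulli variable, so each step picks one of two affine contractions.
lam, h, q, kappa = 0.4, 0.05, 0.3, 0.0

samples = []
V = 0.0
for m in range(200_000):
    a = 1.0 if V > kappa else 0.0
    n_ves = rng.binomial(1, 0.5)      # single release site, p = 0.5
    V = lam * V + h + q * n_ves * a
    if m > 1000:                      # discard the transient
        samples.append(V)

# The empirical histogram approximates the invariant measure; because
# lam < 1 both maps are contractions, so the orbit converges in
# distribution, and the limiting measure is supported on a Cantor-like set.
hist, edges = np.histogram(samples, bins=64, density=True)
```

With these numbers the two maps have fixed points at roughly 0.083 and 0.583, so the orbit settles onto an attractor between them while the histogram reveals the fragmented structure of the invariant measure.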
Finally, we briefly discuss the possible role of synaptic noise in network learning. In particular, we show how presynaptic and postsynaptic contributions to the synaptic weights have different dynamical roles and lead to differences in the temporal behaviour of variances during learning.