Journal of the Physical Society of Japan
Online ISSN : 1347-4073
Print ISSN : 0031-9015
ISSN-L : 0031-9015
Theory of Recurrent Neural Network with Common Synaptic Inputs
Masaki Kawamura, Michiko Yamana, Masato Okada

2005 Volume 74 Issue 11 Pages 2961-2965

Abstract
We discuss the effects of common synaptic inputs in a recurrent neural network. Because of these common inputs, the correlations between neural inputs cannot be ignored, and the network therefore exhibits sample dependence. Networks of this type do not have a well-defined thermodynamic limit, and self-averaging breaks down, so a suitable theory must be developed without relying on either property. While the effects of common synaptic inputs have been analyzed in layered neural networks, analyzing them in recurrent neural networks had been difficult because of the feedback connections. We investigated a sequential associative memory model as an example of a recurrent network and succeeded in deriving a macroscopic dynamical description in the form of a recurrence relation for a probability density function.
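To make the setting concrete, the following is a minimal finite-N sketch of a sequential associative memory model of the kind the abstract refers to, using the standard asymmetric Hebbian couplings (this is an illustrative textbook formulation, not the authors' exact model; the sizes N and P are arbitrary choices). At finite N, the sample dependence discussed in the abstract shows up as run-to-run variation of the recall overlaps across pattern realizations.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 400, 5  # neurons and patterns in the stored sequence (illustrative sizes)

# Random binary patterns xi^1, ..., xi^P forming a cyclic sequence.
xi = rng.choice([-1, 1], size=(P, N))

# Asymmetric Hebbian couplings J_ij = (1/N) sum_mu xi_i^{mu+1} xi_j^mu,
# which map the state xi^mu toward the next pattern xi^{mu+1} (indices mod P).
J = (xi[(np.arange(P) + 1) % P].T @ xi) / N

# Start near xi^1 (flip ~10% of the bits) and iterate the deterministic dynamics.
x = xi[0] * np.where(rng.random(N) < 0.1, -1, 1)
overlaps = []
for t in range(P):
    x = np.where(J @ x >= 0, 1, -1)           # sign dynamics, breaking ties to +1
    overlaps.append(xi[(t + 1) % P] @ x / N)  # overlap with the expected next pattern

print(overlaps)  # each overlap should be close to 1 when sequence recall succeeds
```

At this small loading rate (P/N ≈ 0.01) the network retrieves the sequence almost perfectly; repeating the run with different pattern samples at modest N illustrates the sample-to-sample fluctuations that the paper's probability-density description is designed to capture.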

© The Physical Society of Japan 2005