Abstract
We propose a learning method for recurrent neural networks with dynamics. The core of the method is to keep the complexity of the network dynamics in the vicinity of the edge of chaos. To investigate the properties of the dynamics effectively and explicitly, we introduce novel stochastic parameters defined as combinations of the standard parameters, such as individual connection strengths and thresholds, and then reveal relations between the complexity of the dynamics and these stochastic parameters. Based on these relations, and also according to a global error measure, the core part of the method changes the standard parameters. Some examples suggest that the method is practical for temporal supervised learning tasks and, therefore, that dynamics at the edge of chaos are effective for learning in recurrent networks.
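To make the overall idea concrete, the following is a minimal, hypothetical Python sketch, not the algorithm described in the paper: it combines an error-driven update of a linear readout with a term that steers a crude complexity estimate (a finite-difference largest Lyapunov exponent) toward zero, i.e. toward the edge of chaos. All names, network sizes, and learning rates are assumptions made for illustration.

```python
# Hypothetical sketch (not the paper's algorithm): a toy recurrent network whose
# recurrent weights are adjusted both to reduce an output error and to keep a
# crude complexity estimate (a finite-difference largest Lyapunov exponent)
# near zero, i.e. near the edge of chaos. All constants are assumed values.
import numpy as np

rng = np.random.default_rng(0)
N = 20                                           # number of recurrent units
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))    # recurrent weights ("standard parameters")
w_out = rng.normal(0.0, 0.1, N)                  # linear readout weights

def run(W, x0, T):
    """Iterate the autonomous tanh dynamics and return the state trajectory."""
    x, states = x0.copy(), []
    for _ in range(T):
        x = np.tanh(W @ x)
        states.append(x)
    return np.array(states)

def lyapunov_estimate(W, x0, T=200, eps=1e-6):
    """Estimate the largest Lyapunov exponent from the divergence of two nearby trajectories."""
    a = run(W, x0, T)
    b = run(W, x0 + eps * rng.normal(size=x0.shape), T)
    d = np.linalg.norm(a - b, axis=1) + 1e-12
    return float(np.mean(np.diff(np.log(d))))

# Toy temporal task: reproduce a delayed copy of a sine input through the readout.
T = 300
u = np.sin(0.1 * np.arange(T))
target = np.roll(u, 5)
eta_err, eta_edge = 1e-3, 1e-2                   # learning rates (assumed values)

for epoch in range(30):
    x, mse = np.zeros(N), 0.0
    for t in range(T):
        x = np.tanh(W @ x + u[t] * np.ones(N) / N)   # driven network state
        e = target[t] - w_out @ x
        w_out += eta_err * e * x                     # error-driven readout update
        mse += e ** 2 / T
    lam = lyapunov_estimate(W, rng.normal(size=N))
    W *= 1.0 - eta_edge * np.sign(lam)               # shrink W if chaotic, expand if too ordered
    print(f"epoch {epoch:2d}  mse {mse:.4f}  lyapunov {lam:+.3f}")
```

In this sketch the edge of chaos is targeted only by a global rescaling of W; the stochastic parameters of the paper, defined as combinations of the standard parameters, are not modeled here.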