Abstract
Many learning algorithms have been proposed for layered neural networks. Recently, Kimura et al. improved the back-propagation algorithm, based on an extended Kalman filter, by accounting for the mutual correction among the weights and the bias directly connected to each unit. On the other hand, Rumelhart et al. introduced inertia terms into the back-propagation algorithm in order to accelerate learning. In this paper, we introduce such inertia terms into the algorithm proposed by Kimura et al. and examine their effect.
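For reference, the inertia term of Rumelhart et al. augments each weight update with a fraction of the previous update. A sketch of the standard rule, in a common notation (the symbols $\eta$, $\alpha$, and $E$ are assumptions, since the abstract defines no notation):

$$\Delta w_{ij}(t) = -\eta\,\frac{\partial E}{\partial w_{ij}} + \alpha\,\Delta w_{ij}(t-1),$$

where $\eta$ is the learning rate, $\alpha$ the inertia (momentum) coefficient, and $E$ the error function; the second term carries momentum from the preceding step and can smooth and accelerate convergence.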