Abstract
Recently, many learning algorithms for layered neural networks have been proposed. Among them, Watanabe et al. proposed a back-propagation algorithm via the Extended Kalman Filter, in which the learning rate is made time-varying by the filter. However, Watanabe et al. did not exploit the filter fully, because they treated the weights and biases as independent variables. In this paper, we assume that the weights and biases connected to the same unit have mutual correlations, and propose an improved back-propagation algorithm via the Extended Kalman Filter. We then compare the performance of the proposed algorithm with that of Watanabe et al.'s by solving the XOR and parity problems. Furthermore, we apply the proposed learning algorithm to the classification of the iris data.