IEEJ Transactions on Electronics, Information and Systems
Online ISSN : 1348-8155
Print ISSN : 0385-4221
ISSN-L : 0385-4221
<Neural Network, Fuzzy and Chaos Systems>
Fast Back Propagation Learning Using Optimization of Learning Rate for Pulsed Neural Networks
Kenji Yamamoto, Seiichi Koakutsu, Takashi Okamoto, Hironori Hirata

2008 Volume 128 Issue 7 Pages 1137-1142

Abstract
Neural networks (NN) are widely applied to information processing because of their nonlinear processing capability. Digital hardware implementation of NN is an effective way to build NN systems that operate in real time and reach a much wider range of applications. However, implementing analogue NN in digital hardware is difficult because of restrictions on circuit resources, such as circuit scale, placement, and wiring. A technique that uses a pulsed neuron model instead of an analogue neuron model has been proposed to overcome this problem, and its effectiveness has been confirmed. Back propagation (BP) learning has been proposed to construct Pulsed Neural Networks (PNN). However, BP learning for PNN takes much more time than the learning of analogue NN, so a method to speed up BP learning of PNN is necessary. In this paper, we propose a fast BP learning method that uses optimization of the learning rate for PNN. In the proposed method, the learning rate is optimized at every learning epoch so as to accelerate the learning. To evaluate the proposed method, we apply it to pattern recognition problems such as XOR, 3-bit parity, and digit recognition. The results of computational experiments indicate the validity of the proposed method.
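
The core idea, optimizing the learning rate anew at every learning epoch, can be illustrated with a minimal sketch. This is not the paper's algorithm: it uses an ordinary sigmoid (analogue) network in place of a pulsed neuron model, and the per-epoch optimization is a simple search over a hypothetical list of candidate rates, picking the one that yields the lowest loss after a trial update along the BP gradient. The XOR task is one of the benchmarks mentioned in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR data (one of the benchmark problems mentioned above)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 2-2-1 sigmoid network (stand-in for the pulsed neuron model)
W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1)); b2 = np.zeros(1)
params = (W1, b1, W2, b2)

def forward(params):
    W1, b1, W2, b2 = params
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    return h, y

def loss(params):
    _, y = forward(params)
    return 0.5 * np.mean((y - T) ** 2)

def gradients(params):
    W1, b1, W2, b2 = params
    h, y = forward(params)
    # Standard BP for mean-squared error with sigmoid units
    delta2 = (y - T) * y * (1 - y) / len(X)
    delta1 = (delta2 @ W2.T) * h * (1 - h)
    return (X.T @ delta1, delta1.sum(0), h.T @ delta2, delta2.sum(0))

# Hypothetical candidate learning rates searched at each epoch
candidates = [0.01, 0.1, 0.5, 1.0, 2.0, 5.0]

for epoch in range(5000):
    grads = gradients(params)
    # Optimize the learning rate for this epoch: try each candidate
    # and keep the rate whose trial update gives the smallest loss.
    best_eta, best_loss = candidates[0], np.inf
    for eta in candidates:
        trial = tuple(p - eta * g for p, g in zip(params, grads))
        trial_loss = loss(trial)
        if trial_loss < best_loss:
            best_eta, best_loss = eta, trial_loss
    # Apply the update with the selected learning rate
    params = tuple(p - best_eta * g for p, g in zip(params, grads))

print("final loss:", loss(params))
print("outputs:", forward(params)[1].ravel().round(3))
```

Because the trial updates reuse the gradient already computed for the epoch, the extra cost per epoch is a handful of forward passes, which is the kind of trade-off a per-epoch learning-rate optimization makes in exchange for fewer total epochs.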
© 2008 by the Institute of Electrical Engineers of Japan