IEEJ Transactions on Electronics, Information and Systems
Online ISSN : 1348-8155
Print ISSN : 0385-4221
ISSN-L : 0385-4221
<Softcomputing, Learning>
A Hardware Implementation of Back Propagation for Block-based Pulsed Neural Networks
Kenta Hagio, Seiichi Koakutsu, Takashi Okamoto
JOURNAL FREE ACCESS

2016 Volume 136 Issue 8 Pages 1230-1236

Abstract

Evolvable Hardware (EHW) is reconfigurable hardware that can adapt to unknown new environments. EHW can be implemented by combining learning networks, such as Neural Networks (NNs), with programmable devices, such as Field Programmable Gate Arrays (FPGAs). As one line of EHW research, Block-Based Neural Networks (BBNNs) have been proposed. BBNNs have simplified network structures and have attracted attention for their ease of hardware implementation. In particular, Block-Based Pulsed Neural Networks (BBPNNs), which adopt a pulsed neuron model instead of an analogue neuron model in BBNNs, have been proposed to solve the problem that BBNNs use many multiplier circuits and therefore require large-scale hardware resources for implementation. In addition, applying Back Propagation (BP), a common learning algorithm for NNs, to BBPNNs has been proposed. In this paper, we propose two approximation methods to reduce the hardware resources required to apply BP to BBPNNs. In the proposed methods, we approximate the input values and the derivative values of the activation function in BP. Results of computational experiments indicate the validity of the proposed methods.
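The abstract does not state the approximations themselves, so the sketch below is only an illustrative guess at the general idea, not the authors' method: one common way to make BP hardware-friendly is to round the input and activation-derivative values to the nearest power of two, so that the weight-update multiplications reduce to bit shifts. The names `quantize_pow2` and `min_exp` are hypothetical, not from the paper.

```python
import math

def sigmoid(x):
    """Analogue activation function often approximated by pulsed neuron models."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """Derivative of the sigmoid, needed by the BP weight update."""
    s = sigmoid(x)
    return s * (1.0 - s)

def quantize_pow2(v, min_exp=-8):
    """Round v to the nearest signed power of two (hypothetical helper).

    In hardware, multiplying by a power of two is just a bit shift, so
    quantizing an operand this way removes the need for a full multiplier.
    min_exp clamps very small magnitudes to avoid huge shift amounts.
    """
    if v == 0.0:
        return 0.0
    sign = 1.0 if v > 0.0 else -1.0
    exp = max(round(math.log2(abs(v))), min_exp)
    return sign * 2.0 ** exp

# Quantize both the input and the derivative before the BP update, so the
# product (input * derivative * error) becomes shifts instead of multiplies.
x = 0.3                                    # example net input to a neuron
q_input = quantize_pow2(x)                 # -> 0.25 (i.e. 2**-2)
q_deriv = quantize_pow2(sigmoid_derivative(x))
```

Under this (assumed) scheme, the accuracy cost of rounding each operand to a power of two is what the paper's computational experiments would need to validate.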

© 2016 by the Institute of Electrical Engineers of Japan