Transactions of the Society of Instrument and Control Engineers
Online ISSN : 1883-8189
Print ISSN : 0453-4654
ISSN-L : 0453-4654
On the Capability of the Higher-Order Back-Propagation Network
Ken TANAKA, Masayuki YAMAMURA, Shigenobu KOBAYASHI

1992 Volume 28 Issue 1 Pages 125-134

Abstract
Multi-layered neural networks are known to perform some kind of information processing through pattern translation, but the actual form of that processing has remained unknown. This makes it difficult to evaluate objectively the structure a network acquires and the effect of learning. In this paper, we regard the internal process of a neural network as the extraction of logical relations from the input patterns. First, we propose a generalized multi-layered neural network that includes higher-order connections. After formulating the network in a general form, we derive the extended Back-Propagation learning algorithm for it. Secondly, we present a theorem stating that the network model can realize any logical function within an arbitrarily small mean squared error, and we confirm the meaning of the theorem through a computer simulation of the 4-bit odd-parity problem. Finally, we show the results of applying the model to some typical examples. The simulation of the XOR problem, one of the most common benchmarks for evaluating the ability of a network, shows that if the network has a structure suited to acquiring the conjunctive relation in the input patterns, the convergence of learning is accelerated. The simulation of the Encoder problem shows that if the structure of the network is not suited to the problem, no such acceleration appears. Through these experiments, we confirm that the proposed model can make the internal function of the network clear and improve its performance remarkably.
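As a rough illustration of the idea behind higher-order connections, the following is a minimal sketch (an assumed implementation, not the authors' code): a single sigmoid unit is given the two first-order inputs plus one second-order (conjunctive) term x1*x2. With this product term, XOR becomes linearly separable in the augmented input space, so ordinary gradient descent converges without any hidden layer.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def features(x1, x2):
    # first-order inputs, one higher-order (conjunctive) term, and a bias input
    return [x1, x2, x1 * x2, 1.0]

# the four XOR patterns with their targets
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

random.seed(0)
w = [random.uniform(-0.5, 0.5) for _ in range(4)]
lr = 1.0

for epoch in range(5000):
    for (x1, x2), t in data:
        f = features(x1, x2)
        y = sigmoid(sum(wi * fi for wi, fi in zip(w, f)))
        # gradient of the squared error passed through the sigmoid derivative
        delta = (y - t) * y * (1.0 - y)
        w = [wi - lr * delta * fi for wi, fi in zip(w, f)]

predictions = [
    round(sigmoid(sum(wi * fi for wi, fi in zip(w, features(x1, x2)))))
    for (x1, x2), _ in data
]
print(predictions)  # targets are [0, 1, 1, 0]
```

The product feature plays the role the abstract attributes to a structure "suited to acquiring the conjunctive relation": it is precisely the term that makes XOR solvable by a single unit, which is why such higher-order connections can accelerate convergence on this problem.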
© The Society of Instrument and Control Engineers (SICE)