Abstract
In this paper, we propose a new network element model for neural networks, which we call the “weighted-sum-based sine element,” and derive a learning algorithm for it based on the back-propagation algorithm for multilayer networks. The weighted-sum-based sine element takes as its input value the inner product of an input pattern vector and its weight vector, and uses an affine transformation of the sine function as its output function. The proposed “weighted-sum-based sine network” improves both learning speed and convergence rate because its output function has no saturated regions, which are a major cause of the slow learning of back-propagation with standard sigmoid elements. Moreover, the sinuous shape of the output function enables the network to learn more complex mappings. We demonstrate the advantages of the proposed network on N-bit parity problems, a function approximation problem, and the two-spirals problem. Experimental results indicate that the weighted-sum-based sine network consistently outperforms the conventional sigmoid network in both learning speed and convergence rate.
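To make the element concrete, the following is a minimal NumPy sketch of a single weighted-sum-based sine element trained by gradient descent. The specific affine form f(u) = (sin(u) + 1)/2, the learning rate, and the 2-bit parity demo are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Assumed affine-of-sine output function: f(u) = (sin(u) + 1) / 2.
# Like the sigmoid it maps the weighted sum into [0, 1], but it has
# no saturated regions where the gradient stays near zero.
def sine_act(u):
    return 0.5 * (np.sin(u) + 1.0)

def sine_act_deriv(u):
    # d/du [(sin(u) + 1) / 2] = cos(u) / 2
    return 0.5 * np.cos(u)

# One back-propagation step for a single sine element on one pattern
# (squared-error loss; names are illustrative, not from the paper).
def train_step(w, b, x, target, lr=0.1):
    u = np.dot(w, x) + b            # inner product of weights and input
    y = sine_act(u)                 # element output
    err = y - target                # dE/dy for E = 0.5 * (y - target)^2
    delta = err * sine_act_deriv(u)
    w = w - lr * delta * x          # gradient-descent updates
    b = b - lr * delta
    return w, b, 0.5 * err ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w, b = rng.normal(scale=0.5, size=2), 0.0
    # 2-bit parity (XOR): a single sine element can represent it exactly,
    # e.g. w = (pi, pi), b = -pi/2; whether gradient descent reaches such
    # a solution depends on the initialization and learning rate.
    patterns = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    for _ in range(2000):
        for x, t in patterns:
            w, b, loss = train_step(w, b, np.asarray(x, float), t)
    for x, t in patterns:
        print(x, t, round(float(sine_act(np.dot(w, x) + b)), 3))
```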