The Brain & Neural Networks
Online ISSN : 1883-0455
Print ISSN : 1340-766X
ISSN-L : 1340-766X
Volume 4, Issue 1
Displaying 1-9 of 9 articles from this issue
  • Yasuhiro Sagawa, Koji Kurata
    1997 Volume 4 Issue 1 Pages 3-9
    Published: March 05, 1997
    Released on J-STAGE: December 13, 2010
    JOURNAL FREE ACCESS
A neural network for function approximation is analyzed theoretically. The structure and information processing of this network are similar to those of the forward-only counterpropagation network, but the learning rule is improved: the hidden layer is trained as a self-organizing map instead of by winner-take-all. For the case where both the target function and the network have one input and one output, the parameters of the learning rule are derived theoretically so that the function approximation attains the least square error, and the theoretical results are verified by computer simulations.
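The learning rule in the abstract above can be illustrated with a minimal one-input, one-output sketch (not the authors' code; the network size, learning-rate schedule, and the sine target are illustrative assumptions): a counterpropagation-style network whose hidden layer is trained as a self-organizing map, with each hidden unit's output weight updated by a simple winner-based LMS rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sketch: SOM-trained hidden layer + LMS output layer,
# approximating a 1-D target function. All sizes/rates are arbitrary.
n_hidden = 20
w_in = np.linspace(0.0, 1.0, n_hidden)   # hidden-layer input weights
w_out = np.zeros(n_hidden)               # output weights

def target(x):
    return np.sin(2 * np.pi * x)         # example target function

n_steps = 5000
for t in range(n_steps):
    x = rng.random()
    y = target(x)
    # SOM update: the winner AND its neighbours move toward the input
    # (this is the improvement over plain winner-take-all).
    win = int(np.argmin(np.abs(w_in - x)))
    sigma = max(2.0 * (1 - t / n_steps), 0.5)      # shrinking neighbourhood
    h = np.exp(-((np.arange(n_hidden) - win) ** 2) / (2 * sigma ** 2))
    lr = 0.1 * (1 - t / n_steps) + 0.01            # decaying learning rate
    w_in += lr * h * (x - w_in)
    # Output layer: LMS update of the winning unit's output weight,
    # which converges toward the mean target value in its cell.
    w_out[win] += 0.1 * (y - w_out[win])

# Piecewise-constant approximation: read out the winner's output weight.
xs = np.linspace(0.0, 1.0, 101)
pred = np.array([w_out[np.argmin(np.abs(w_in - x))] for x in xs])
err = float(np.mean((pred - target(xs)) ** 2))
```

With 20 hidden units the piecewise-constant readout tracks the sine target closely; the paper's contribution is choosing the learning-rule parameters so that this kind of approximation minimizes the square error.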
  • Natsuhiro Ichinose, Kazuyuki Aihara, Yoichi Okabe
    1997 Volume 4 Issue 1 Pages 10-17
    Published: March 05, 1997
    Released on J-STAGE: December 13, 2010
    JOURNAL FREE ACCESS
Correlation coding and the adaptation of propagation and synaptic delays in pulse neural networks are discussed. It is shown that, under a delay-adaptation method, the temporal correlation structure induced by external signals comes to be represented in the delays. Such adaptive networks can separate the correlation structures in a pulse train on which several trains are superimposed. By introducing a topological mapping, the correlation structures of two external chaotic systems are reconstructed on the network; furthermore, both can be extracted by correlation coding.
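The core of the delay-adaptation idea in the abstract above can be sketched in a toy form (a hypothetical illustration, not the authors' model): an adjustable delay on one input line is nudged toward coincidence of the two arrival times, so the learned delay comes to encode the fixed propagation delay between two copies of the same pulse train.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy: a unit receives the same Poisson pulse train on two
# lines; line 2 carries it with an unknown propagation delay d_true.
# The adjustable delay d_hat on line 1 is updated to reduce the
# arrival-time mismatch, so d_hat comes to represent the temporal
# correlation structure of the inputs.
d_true = 3.0        # unknown propagation delay (e.g. in ms)
d_hat = 0.0         # adjustable delay, learned online
eta = 0.05          # adaptation rate (arbitrary)

for _ in range(2000):
    t = rng.exponential(10.0)      # pulse emission time on line 1
    arrival1 = t + d_hat           # after the adjustable delay
    arrival2 = t + d_true          # delayed copy on line 2
    # Coincidence-seeking update: shift d_hat toward synchrony.
    d_hat += eta * (arrival2 - arrival1)
```

After adaptation `d_hat` matches `d_true`, which is the single-delay analogue of the paper's claim that correlation structure is represented in the delays.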
  • Taichi Hayasaka, Katsuyuki Hagiwara, Naohiro Toda, Shiro Usui
    1997 Volume 4 Issue 1 Pages 18-26
    Published: March 05, 1997
    Released on J-STAGE: December 13, 2010
    JOURNAL FREE ACCESS
One of the most important features of three-layered neural networks is the adaptability of their basis functions. In this paper, to focus on this adaptability in the context of regression or curve fitting, we restrict our attention to function representations in which the basis functions are modified according to associated discrete parameters. For such representations, we derive the expectations of the least square error and the prediction square error with respect to the distribution of the sample set using extreme value theory, under the assumptions that the given samples form an independent Gaussian noise sequence and that the basis functions satisfy an appropriate orthonormality condition.
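The extreme-value-theory ingredient of the abstract above can be illustrated with a standard fact about Gaussian maxima (an illustration only, not a result from the paper, and the specific n, sigma, and trial count are arbitrary): the expected maximum of n i.i.d. N(0, sigma^2) variables grows like sigma * sqrt(2 ln n), which is how selecting the best-fitting basis function among many candidates inflates the apparent fit under pure noise.

```python
import numpy as np

rng = np.random.default_rng(2)

# Monte Carlo check of the classical extreme-value asymptotic
# E[max of n iid N(0, sigma^2)] ~ sigma * sqrt(2 ln n).
n, sigma, trials = 1000, 1.0, 2000
maxima = rng.normal(0.0, sigma, size=(trials, n)).max(axis=1)
empirical = float(maxima.mean())                 # ~3.2 for n = 1000
asymptotic = float(sigma * np.sqrt(2 * np.log(n)))  # ~3.7 (leading order)
```

The leading-order asymptotic overshoots somewhat at finite n (lower-order correction terms matter), which is exactly the kind of refinement an extreme-value analysis of the expected least square and prediction errors must handle.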