A neural network for function approximation is analyzed theoretically. The structure and information processing of this network are similar to those of the forward-only counterpropagation network, but the learning rule is improved: the hidden layer is trained with the self-organizing map rule instead of winner-take-all. For the case of a target function and a network with one input and one output, the parameters of the learning rule are derived theoretically so that the function approximation attains the least squared error, and the theoretical results are verified by computer simulations.
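The abstract does not give the update equations, so the following is only a minimal sketch of the kind of network it describes: a forward-only counterpropagation network with one input and one output, whose hidden (codebook) layer is trained by a self-organizing map rule rather than winner-take-all. The target function, unit count, learning rate, and neighborhood schedule are all illustrative assumptions, not the paper's derived optimal parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target function with one input and one output.
f = lambda x: np.sin(2 * np.pi * x)

H = 20                      # number of hidden (codebook) units -- an assumption
w = np.sort(rng.random(H))  # input-side weights: a 1-D SOM codebook on [0, 1]
v = np.zeros(H)             # output-side (outstar) weights

def neighborhood(winner, sigma):
    """Gaussian neighborhood over the 1-D unit index lattice."""
    idx = np.arange(H)
    return np.exp(-(idx - winner) ** 2 / (2 * sigma ** 2))

T = 5000
eta = 0.1
for t in range(T):
    x = rng.random()
    y = f(x)
    c = np.argmin(np.abs(w - x))        # best-matching hidden unit
    sigma = max(0.5, 3 * (1 - t / T))   # shrinking neighborhood (assumed schedule)
    h = neighborhood(c, sigma)
    w += eta * h * (x - w)              # SOM update instead of winner-take-all
    v += eta * h * (y - v)              # outstar update of the output weights

# Recall: the output weight of the winning hidden unit approximates f(x).
xs = np.linspace(0, 1, 200)
pred = np.array([v[np.argmin(np.abs(w - x))] for x in xs])
err = np.mean((pred - f(xs)) ** 2)
```

With a winner-take-all hidden layer only the winner's weights would move; the SOM neighborhood additionally drags nearby units, which spreads the codebook over the input range and smooths the piecewise-constant output.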
Correlation coding and the adaptation of propagation and synaptic delays in pulse neural networks are discussed. It is shown that, under a proposed delay-adaptation method, the temporal correlation structure induced by external signals is represented by the delays. The adaptive network can separate correlation structures in a pulse train on which several trains are superimposed. By introducing a topological mapping, the correlation structures of two external chaotic systems are reconstructed on the network; furthermore, both of them can be extracted by the correlation coding.
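The delay-adaptation rule itself is not stated in the abstract; the toy below only illustrates the underlying idea that a learnable delay can come to represent a fixed temporal correlation between two pulse trains. The spike statistics, the nearest-spike coincidence rule, and the learning rate are all assumptions for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two pulse trains: train 2 is a copy of train 1 shifted by a fixed lag (ms).
true_lag = 7.0
gaps = rng.uniform(30.0, 70.0, 20)   # inter-spike intervals (sparse, assumed)
spikes1 = np.cumsum(gaps)
spikes2 = spikes1 + true_lag

d = 0.0       # adaptable propagation delay on line 1
eta = 0.1     # assumed adaptation rate
for _ in range(20):
    for t1 in spikes1:
        arrival = t1 + d                                  # delayed arrival
        t2 = spikes2[np.argmin(np.abs(spikes2 - arrival))]  # nearest coincidence
        d += eta * (t2 - arrival)   # shift the delay toward exact coincidence

# After adaptation the delay encodes the temporal correlation: d ~ true_lag.
```

In this sketch the delay converges because the lag is small relative to the inter-spike intervals, so the nearest spike on line 2 is always the correlated partner; separating several superimposed trains, as in the paper, would require multiple delay lines.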
One of the most important features of three-layered neural networks is the adaptability of their basis functions. In this paper, to focus on this adaptability in the context of regression or curve fitting, we restrict our attention to function representations in which the basis functions are modified according to associated discrete parameters. For such representations, we derive the expectations of the least squared error and the prediction squared error with respect to the distribution of the sample set using extreme value theory, provided that the given samples form an independent Gaussian noise sequence and the basis functions satisfy an appropriate orthonormality condition.
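The abstract's setting can be illustrated numerically, though the sketch below is not the paper's derivation. Assume the samples are pure independent Gaussian noise and the candidate basis functions form orthonormal vectors indexed by a discrete parameter k; fitting with the best single basis function then removes the largest of K independent chi-squared projections from the noise energy, which is exactly where extreme value theory enters. The sizes n, K, and trial count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n, K = 100, 50   # sample size and number of discrete basis candidates (assumed)

# Candidate basis vectors satisfying the orthonormality condition
# phi_j . phi_k = delta_jk (here: orthonormal columns from a QR decomposition).
Phi, _ = np.linalg.qr(rng.standard_normal((n, K)))

trials = 2000
gains = np.empty(trials)
for t in range(trials):
    y = rng.standard_normal(n)   # independent Gaussian noise sequence
    a = Phi.T @ y                # least-squares coefficient for each candidate
    # With an adaptable basis, the chosen k minimizes the residual
    # ||y - a_k phi_k||^2 = ||y||^2 - a_k^2, i.e. maximizes a_k^2.
    gains[t] = np.max(a ** 2)

# A fixed basis function removes E[a_k^2] = 1 from the noise energy on average;
# the best-of-K adaptable basis removes on the order of 2*log(K), so the
# training (least squared) error is optimistically biased downward.
mean_gain = np.mean(gains)
```

The gap between `mean_gain` and 1 is the extreme-value effect the paper quantifies: the expected least squared error of the adaptive fit is smaller, and the expected prediction error larger, than the fixed-basis values.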