Abstract
A neural network is expected to have not only the ability to interpolate non-training data but also noise resistance, meaning that its outputs remain close to the desired signals under perturbations of the standard input data. To obtain the interpolating ability, we previously formulated an optimization problem that minimizes the absolute values of the synaptic weights subject to satisfaction conditions requiring that the output errors for the training input data be less than permissible levels.
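A minimal sketch of this formulation, with the training pairs written as $(x_k, d_k)$, the network output as $y(x; w)$, and the permissible error levels as $\varepsilon_k$ (these symbols are introduced here for illustration and are not taken from the original text):

$$
\min_{w} \sum_{i} |w_i| \quad \text{subject to} \quad \bigl| y(x_k; w) - d_k \bigr| \le \varepsilon_k, \qquad k = 1, \dots, K.
$$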
In this paper, the same satisfaction principle is adopted to obtain noise resistance as well: the satisfaction conditions of the formulated problem require that the output errors for arbitrary perturbations of the inputs within a certain range remain below the permissible levels. To solve this problem, a "Relaxation Learning Method" is proposed. The learning starts by solving a relaxed problem constrained by only a few satisfaction conditions, corresponding to specified input data with some noise, and new satisfaction conditions for other input data are added to the constraint set iteratively. The learning terminates when the number of satisfaction conditions necessary and sufficient to acquire the noise resistance has been generated.
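The following is a hedged sketch of such a relaxation loop, not the authors' implementation; the helpers `solve_constrained`, `forward`, and `candidate_perturbations` are hypothetical placeholders for the constrained weight-minimization solver, the network's forward pass, and a generator of perturbed inputs within the permitted range.

```python
def relaxation_learning(train, candidate_perturbations, solve_constrained,
                        forward, eps, max_rounds=100):
    """Illustrative relaxation loop (assumed interfaces, not the paper's code).

    train                   : list of (x, d) training input/desired-output pairs
    candidate_perturbations : callable returning perturbed (x, d) pairs to test,
                              given the current weights
    solve_constrained       : solver returning weights w that minimize sum(|w_i|)
                              subject to |forward(x, w) - d| <= eps for all
                              (x, d) in the current constraint set
    eps                     : permissible output-error level
    """
    # Start from a relaxed problem: only the plain training pairs as constraints.
    constraints = list(train)
    w = solve_constrained(constraints, eps)

    for _ in range(max_rounds):
        # Find perturbed inputs whose output error exceeds the permissible level.
        violated = [(x, d) for (x, d) in candidate_perturbations(w)
                    if abs(forward(x, w) - d) > eps]
        if not violated:
            # All satisfaction conditions hold: noise resistance is acquired.
            return w
        # Add the new satisfaction conditions and re-solve the enlarged problem.
        constraints.extend(violated)
        w = solve_constrained(constraints, eps)
    return w
```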
Application to a simple pattern recognition problem demonstrates that the proposed learning method effectively achieves interpolating ability together with noise resistance.