Abstract
The back propagation method for multi-layered neural network models is used for pattern recognition, prediction, and other tasks in many fields. However, the gradient methods that are often used to identify the values of model parameters have a significant drawback: they can derive only locally optimal solutions. A method for improving local optimal solutions of constrained nonlinear programming problems, named the "Modal Trimming Method," has been proposed by the authors. It has turned out that, for a wide range of problems, the method has a high probability of deriving globally optimal solutions starting from suboptimal ones. This feature has been shown to arise because the renewal of solutions based on the extended Newton-Raphson method induces chaotic behavior of the solutions. In this paper, the modal trimming method is adopted to derive globally optimal solutions with the back propagation method. The approach is applied to an example pattern recognition problem, and its validity and effectiveness are demonstrated.
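For orientation only, the following is a minimal sketch of a modal-trimming-style search, assuming its core renewal step can be caricatured as a least-norm Newton-Raphson update on the single equation f(x) = f* - delta, where f* is the value at the current local optimum. The names (modal_trimming_sketch, delta, etc.) are illustrative, not the authors' implementation, and constraints are omitted.

```python
import numpy as np

def modal_trimming_sketch(f, grad_f, x0, delta=0.1,
                          max_outer=20, max_inner=200, tol=1e-8):
    """Illustrative sketch (not the authors' formulation).

    Starting from a local optimum x0, repeatedly look for a point whose
    objective is lower than the current best by `delta`, by treating
    f(x) = f_best - delta as one equation in many unknowns and applying
    a Newton-Raphson-type renewal of the solution.
    """
    x_best = np.asarray(x0, dtype=float)
    f_best = f(x_best)
    for _ in range(max_outer):
        target = f_best - delta          # aim strictly below the current optimum
        x = x_best.copy()
        found = False
        for _ in range(max_inner):
            r = f(x) - target            # residual of the equation f(x) = target
            if abs(r) < tol:
                found = True
                break
            g = grad_f(x)
            gnorm2 = float(g @ g)
            if gnorm2 < 1e-12:           # stationary point: no Newton direction
                break
            x = x - (r / gnorm2) * g     # least-norm Newton-Raphson step for one equation
        if not found:
            break                        # no better solution located; stop
        x_best, f_best = x, f(x)         # accept the improved solution and repeat
    return x_best, f_best
```

Because a single equation in many unknowns is underdetermined, the update above takes the minimum-norm Newton step; repeated renewals of this kind can jump between basins of attraction, which is the kind of irregular (chaotic) solution behavior the abstract refers to.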