Abstract
Stochastic methods have been proposed to derive globally optimal solutions of nonlinear programming problems with arbitrary objective functions and constraints; in such methods, constraints have been treated indirectly through penalty methods. In this paper, a method that treats constraints directly is proposed to improve local optimal solutions of nonlinear programming problems; it is named the modal trimming method. The method combines a gradient method for searching for local optimal solutions with an extended Newton-Raphson method, based on the Moore-Penrose generalized inverse of the Jacobian matrix, for finding the initial feasible solutions used by the gradient method. To avoid being trapped at already fathomed local optimal solutions, a strategy of transforming the objective function is adopted. Some features of the modal trimming method are investigated qualitatively, and the method is applied to small-scale optimization problems with strongly nonlinear objective functions and constraints to examine its validity and effectiveness.
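As a rough illustration of the feasibility-search component mentioned above (not the paper's own formulation, whose details appear in the body of the paper), the following minimal Python sketch applies a Newton-Raphson-type iteration with the Moore-Penrose pseudoinverse of the constraint Jacobian to locate a point satisfying equality constraints g(x) = 0. The function names and the example constraint are hypothetical.

```python
import numpy as np

def find_feasible_point(g, jacobian, x0, tol=1e-10, max_iter=100):
    """Seek a point with g(x) = 0 via the iteration x <- x - J^+ g(x),
    where J^+ is the Moore-Penrose pseudoinverse of the Jacobian.
    Illustrative sketch only; not the paper's exact algorithm."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = np.atleast_1d(g(x))          # constraint residuals
        if np.linalg.norm(r) < tol:      # feasible within tolerance
            return x
        J = np.atleast_2d(jacobian(x))   # Jacobian of the constraints at x
        x = x - np.linalg.pinv(J) @ r    # minimum-norm Newton-type step
    return x  # best iterate found within the iteration budget

# Example (hypothetical): find a point on the unit circle from (2, 0.5)
g = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0])
jac = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]]])
x_feasible = find_feasible_point(g, jac, np.array([2.0, 0.5]))
```

Because the pseudoinverse yields the minimum-norm solution of the underdetermined linearized system, such an iteration can handle more variables than constraints, which is the situation in which a feasible starting point for a gradient method is typically sought.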