A new optimization method that can globally optimize even a multimodal function is proposed. Non-linear programming is an effective way to optimize a convex function, but not a multimodal one. To overcome this deficiency, a numerical differentiation scheme with variable increment sizes was applied to the BFGS method, a popular non-linear programming method. A multimodal test function was designed to examine global convergence. During optimization, the increments, which were large at the start, were reduced as the process approached the global minimum. The process was found to work well even for a multimodal function when the initial increment size is properly selected.
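The idea can be sketched in a few lines of Python. This is only an illustrative toy, not the paper's implementation: the test function (a Rastrigin-like curve), the starting point, the schedule constants, and the use of plain gradient descent in place of BFGS are all assumptions made for brevity. What it preserves is the core mechanism: a central-difference gradient whose increment starts large, so the finite difference averages over the short-period local wiggles, and is then reduced stage by stage as the search closes in on the global minimum.

```python
import math

def f(x):
    # Multimodal test function (an assumed Rastrigin-like stand-in for
    # the paper's test function): global minimum f(0) = 0, with local
    # minima spaced roughly one unit apart.
    return x * x - 10.0 * math.cos(2.0 * math.pi * x) + 10.0

def fd_grad(func, x, h):
    # Central-difference derivative with increment h.  For large h the
    # difference quotient smooths out the oscillatory term, exposing the
    # underlying convex bowl; for small h it approaches the true gradient.
    return (func(x + h) - func(x - h)) / (2.0 * h)

def minimize_multimodal(func, x0, h0=2.0, shrink=0.5,
                        stages=12, steps=200, lr=0.004):
    # Descent driven by the finite-difference gradient.  The increment
    # starts at h0 and is halved each stage, mimicking the abstract's
    # "large at the start, reduced near the global minimum" schedule.
    # Plain gradient descent stands in for BFGS here; the step size lr
    # and stage counts are illustrative choices, not tuned values.
    x, h = x0, h0
    for _ in range(stages):
        for _ in range(steps):
            x -= lr * fd_grad(func, x, h)
        h *= shrink
    return x

# Start well outside the global basin, next to a local minimum near x = 4.
x_star = minimize_multimodal(f, x0=4.3)
```

With a small fixed increment from the start, the same descent would stall in the local minimum near the starting point; the large initial increment is what lets the early stages ignore the wiggles, which matches the abstract's finding that the initial increment size must be chosen properly.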