Host: The Japanese Society for Artificial Intelligence
Name: The 35th Annual Conference of the Japanese Society for Artificial Intelligence
Number: 35
Location: [in Japanese]
Date: June 08, 2021 - June 11, 2021
In this study, we provide a convergence rate analysis of a continuous black-box optimization algorithm, the (1+1)-Evolution Strategy (ES), on general convex quadratic functions, where the convergence rate is the rate of decrease of the distance to the optimal point in each iteration. We show that an upper bound on the convergence rate is described by the ratio of the smallest eigenvalue of the Hessian matrix to the sum of all eigenvalues. To the best of our knowledge, this is the first study suggesting that the convergence rate of the (1+1)-ES on a general convex quadratic function is affected not only by the condition number of the Hessian but also by the distribution of its eigenvalues. Furthermore, we show that a lower bound on the convergence rate on the same function class is described by the inverse of the dimension of the search space, which agrees with previous studies on a subclass of convex quadratic functions.
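The abstract's setting can be illustrated with a minimal simulation: a standard (1+1)-ES with 1/5-success-rule step-size adaptation minimizing a convex quadratic, measuring the empirical per-iteration decrease rate of the distance to the optimum. This sketch is not the paper's analysis; the eigenvalue spectrum, step-size constants, and iteration budget below are illustrative assumptions.

```python
import numpy as np

def one_plus_one_es(hessian_eigvals, x0, sigma0=1.0, iters=2000, seed=0):
    """Minimal (1+1)-ES with 1/5-success-rule step-size adaptation on
    f(x) = 0.5 * x^T D x, where D is diagonal with the given eigenvalues
    (a convex quadratic expressed in the eigenbasis of its Hessian).

    Returns the history of distances ||x_t|| to the optimum at the origin.
    """
    rng = np.random.default_rng(seed)
    d = np.asarray(hessian_eigvals, dtype=float)
    f = lambda x: 0.5 * np.sum(d * x * x)
    x = np.array(x0, dtype=float)
    n, sigma, fx = x.size, sigma0, f(np.array(x0, dtype=float))
    dists = [np.linalg.norm(x)]
    for _ in range(iters):
        y = x + sigma * rng.standard_normal(n)   # one Gaussian offspring
        fy = f(y)
        if fy <= fx:                             # elitist selection
            x, fx = y, fy
            sigma *= np.exp(0.8 / n)             # expand on success
        else:
            sigma *= np.exp(-0.2 / n)            # shrink on failure
        # equilibrium success probability of this update is 1/5
        dists.append(np.linalg.norm(x))
    return np.array(dists)

# Hypothetical spectrum: condition number 10, eigenvalues spread uniformly.
n = 20
eigvals = np.linspace(1.0, 10.0, n)
dists = one_plus_one_es(eigvals, x0=np.ones(n))

# Empirical convergence rate: geometric-mean decrease of ||x_t|| per iteration.
rate = (dists[-1] / dists[0]) ** (1.0 / (len(dists) - 1))
print(f"empirical per-iteration convergence rate: {rate:.4f}")
```

Varying the spectrum while keeping the condition number fixed (e.g. clustering most eigenvalues near the largest one) changes the measured rate, which is the qualitative effect the abstract attributes to the eigenvalue distribution rather than the condition number alone.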