Proceedings of the 38th Annual Conference of JSAI (2024)
Online ISSN: 2758-7347
Session ID: 1F3-GS-1-04
Convergence rate analysis of the (1+1)-Evolution Strategy on locally strongly convex functions with Lipschitz continuous gradient
*Daiki MORINAGA, Youhei AKIMOTO
Abstract

Evolution Strategy (ES) is one of the promising classes of algorithms for Black-Box Optimization (BBO), in which an algorithm queries only the objective function value. Despite its practical success, theoretical analysis of continuous BBO algorithms is still underdeveloped. In this study, the worst-case and best-case convergence rates of the (1+1)-ES are derived on $L$-strongly convex and $U$-Lipschitz smooth functions and their monotone transformations. It is proved that the order of these rates is proportional to $1/d$, where $d$ is the dimension of the search space, and that in the worst case the convergence rate is proportional to $L/U$. These results show that the convergence rate of the (1+1)-ES is competitive with those of other derivative-free optimization algorithms that exploit $U$ as a known constant.
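For readers unfamiliar with the algorithm under analysis, the following is a minimal sketch of a (1+1)-ES with success-based step-size adaptation (the common 1/5-success rule), written in Python. The constants, function names, and the quadratic test function are illustrative assumptions for this sketch and are not taken from the paper; the paper's analyzed variant and constants may differ.

import numpy as np

def one_plus_one_es(f, x0, sigma0=1.0, budget=10_000,
                    c_inc=1.1, c_dec=1.1 ** (-0.25)):
    # (1+1)-ES: one parent, one offspring per iteration.
    # Step size grows on success and shrinks on failure (1/5-success rule),
    # which keeps the success probability near 1/5 on smooth convex problems.
    x = np.asarray(x0, dtype=float)
    sigma = sigma0
    fx = f(x)
    for _ in range(budget):
        y = x + sigma * np.random.randn(len(x))  # sample one Gaussian offspring
        fy = f(y)
        if fy <= fx:                             # success: accept offspring, enlarge step
            x, fx = y, fy
            sigma *= c_inc
        else:                                    # failure: keep parent, shrink step
            sigma *= c_dec
    return x, fx

# Illustrative test function: a strongly convex quadratic whose Hessian
# eigenvalues lie in [L, U] = [1, 10], so it is L-strongly convex and
# U-Lipschitz smooth. Only f-values are used (black-box setting).
d = 20
A = np.diag(np.linspace(1.0, 10.0, d))
quad = lambda x: 0.5 * x @ A @ x
x_best, f_best = one_plus_one_es(quad, x0=np.ones(d))

Because the algorithm compares only function values, its behavior is invariant under strictly monotone transformations of the objective, which is why the abstract's rates extend to such transformations as well.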

© 2024 The Japanese Society for Artificial Intelligence