JSAI2024

Presentation information

General Session


[1F3-GS-1] Fundamental AI, theory: algorithm

Tue. May 28, 2024 1:00 PM - 2:40 PM Room F (Temporary room 4)

Chair: Takayuki Katsuki (IBM)

2:00 PM - 2:20 PM

[1F3-GS-1-04] Convergence rate analysis of the (1+1)-Evolution Strategy on locally strongly convex functions with Lipschitz continuous gradient

〇Daiki Morinaga (1), Youhei Akimoto (1, 2) (1. University of Tsukuba, 2. RIKEN AIP)

Keywords: Black-box optimization, Evolutionary computation, Evolution Strategy, Convergence Rate

The Evolution Strategy (ES) is a promising class of algorithms for Black-Box Optimization (BBO), in which an algorithm queries only objective function values.
Despite its practical success, theoretical analysis of continuous BBO algorithms is still underdeveloped.
In this study, the worst-case and best-case convergence rates of the (1+1)-ES are derived on $L$-strongly convex functions with $U$-Lipschitz continuous gradient and their monotone transformations.
It is proved that both rates are proportional to $1/d$, where $d$ is the dimension of the search space, and that in the worst case the convergence rate is proportional to $L/U$.
These results show that the convergence rate of the (1+1)-ES is competitive with those of other derivative-free optimization algorithms that exploit $U$ as a known constant.
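For illustration, below is a minimal Python sketch of a (1+1)-ES with the common 1/5-success-rule step-size adaptation, run on a convex quadratic whose Hessian spectrum lies in [L, U], i.e., an L-strongly convex function with U-Lipschitz continuous gradient. The test function, adaptation constants, and budget are illustrative assumptions and not necessarily the exact variant or setting analyzed in the paper.

import numpy as np

def quadratic(x, L=1.0, U=4.0):
    # Convex quadratic whose Hessian eigenvalues lie in [L, U]:
    # L-strongly convex with U-Lipschitz continuous gradient.
    scales = np.linspace(L, U, len(x))
    return 0.5 * float(np.sum(scales * x ** 2))

def one_plus_one_es(f, x0, sigma0=1.0, budget=5000, seed=0):
    # (1+1)-ES with 1/5-success-rule step-size adaptation (illustrative constants).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = len(x)
    sigma, fx = sigma0, f(x)
    up, down = np.exp(0.8 / d), np.exp(-0.2 / d)  # stationary success rate ~ 1/5
    for _ in range(budget):
        y = x + sigma * rng.standard_normal(d)    # Gaussian mutation
        fy = f(y)
        if fy <= fx:                              # elitist (plus) selection
            x, fx = y, fy
            sigma *= up                           # expand step size on success
        else:
            sigma *= down                         # shrink step size on failure
    return x, fx

x_best, f_best = one_plus_one_es(quadratic, x0=np.ones(10))
print(f_best)  # decreases geometrically; per the abstract, the rate is of order 1/d

On this quadratic the algorithm converges linearly on a log scale, which is the behavior whose rate the paper quantifies in terms of d and L/U.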
