
Mirror Natural Evolution Strategies (1910.11490v1)

Published 25 Oct 2019 in math.OC and cs.LG

Abstract: Evolution Strategies such as CMA-ES (covariance matrix adaptation evolution strategy) and NES (natural evolution strategy) have been widely used in machine learning applications, where an objective function is optimized without using its derivatives. However, the convergence behavior of these algorithms has not been carefully studied. In particular, there is no rigorous analysis of the convergence of the estimated covariance matrix, and it is unclear how the estimated covariance matrix helps the algorithm converge. The relationship between Evolution Strategies and derivative-free optimization algorithms is also not clear. In this paper, we propose a new algorithm closely related to NES, which we call MiNES (mirror descent natural evolution strategy), for which we can establish rigorous convergence results. We show that the estimated covariance matrix of MiNES converges to the inverse of the Hessian matrix of the objective function at a sublinear rate. Moreover, we show that some derivative-free optimization algorithms are special cases of MiNES. Our empirical studies demonstrate that MiNES is a query-efficient optimization algorithm competitive with classical algorithms including NES and CMA-ES.
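
The MiNES update itself is not reproduced on this page. As a rough sketch of the derivative-free, query-based setting the abstract describes, the Python snippet below implements a plain NES-style search-gradient loop on a toy objective. All names and hyperparameters here (nes_sketch, sphere, lr_mu, sigma, pop) are illustrative assumptions, not the authors' algorithm, and the covariance adaptation that MiNES analyzes is deliberately omitted.

```python
import numpy as np

def sphere(x):
    """Toy objective f(x) = ||x||^2, accessed only through function queries."""
    return float(np.dot(x, x))

def nes_sketch(f, dim=5, iters=300, pop=20, lr_mu=0.1, sigma=0.3, seed=0):
    """Minimal NES-style loop: sample perturbations around a mean, estimate
    a search gradient from fitness-weighted perturbations, and move the mean.
    (No covariance/Hessian estimation, unlike MiNES.)"""
    rng = np.random.default_rng(seed)
    mu = rng.standard_normal(dim)
    for _ in range(iters):
        eps = rng.standard_normal((pop, dim))              # perturbation directions
        fitness = np.array([f(mu + sigma * e) for e in eps])
        # Standardizing the fitness values stabilizes the gradient estimate.
        weights = (fitness - fitness.mean()) / (fitness.std() + 1e-8)
        grad = (weights[:, None] * eps).sum(axis=0) / (pop * sigma)
        mu -= lr_mu * grad                                  # descend the estimated gradient
    return mu

if __name__ == "__main__":
    x_star = nes_sketch(sphere)
    print("final objective value:", sphere(x_star))
```

This sketch only illustrates the query-efficiency trade-off the abstract refers to: each iteration costs `pop` function evaluations and uses no derivatives.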

Citations (2)
