Convergence of a first-order consensus-based global optimization algorithm

Published 18 Oct 2019 in math.OC and math.DS | (1910.08239v1)

Abstract: Global optimization of a non-convex objective function often arises in large-scale machine-learning and artificial-intelligence applications. Recently, consensus-based optimization (CBO) methods have been introduced as a class of gradient-free optimization methods. In this paper, we provide a convergence analysis for the first-order CBO method of [C-J-L-Z]. Prior to the current work, the convergence study for CBO methods was carried out on the corresponding mean-field limit, a Fokker-Planck equation, which does not imply convergence of the CBO method per se. Based on a consensus estimate performed directly on the first-order CBO model, we provide a convergence analysis of the first-order CBO method of [C-J-L-Z] without resorting to the corresponding mean-field model. Our convergence analysis consists of two steps. In the first step, we show that the CBO model reaches a global consensus asymptotically for any initial data; in the second step, we provide a sufficient condition on the system parameters (which is dimension independent) and the initial data that guarantees the converged consensus state lies in a small neighborhood of the global minimum almost surely.
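
The abstract does not spell out the particle update rule, so the sketch below shows a generic CBO iteration under standard assumptions: each particle drifts toward a Boltzmann-weighted consensus point and receives noise scaled by its distance to that point, discretized with an Euler-Maruyama step. The parameter names (lam, sigma, beta, dt) and the test objective are illustrative choices, not the exact first-order scheme or constants analyzed in [C-J-L-Z].

```python
import numpy as np

def cbo_step(X, f, lam=1.0, sigma=0.7, beta=30.0, dt=0.01, rng=None):
    """One Euler-Maruyama step of a generic consensus-based optimization (CBO)
    update. X has shape (N, d): N particles in dimension d. Illustrative sketch,
    not the exact first-order scheme of the paper."""
    rng = np.random.default_rng() if rng is None else rng
    values = f(X)                                        # objective at each particle
    weights = np.exp(-beta * (values - values.min()))    # Boltzmann weights (shifted for stability)
    x_star = weights @ X / weights.sum()                 # weighted consensus point
    drift = -lam * (X - x_star) * dt                     # relax particles toward the consensus point
    noise = (sigma * np.linalg.norm(X - x_star, axis=1, keepdims=True)
             * np.sqrt(dt) * rng.standard_normal(X.shape))  # diffusion scaled by distance to consensus
    return X + drift + noise

# Usage: minimize a non-convex test function with a small particle ensemble.
if __name__ == "__main__":
    f = lambda X: np.sum(X**2, axis=1) + 2.0 * np.sin(3.0 * X).sum(axis=1)
    X = np.random.default_rng(0).uniform(-3, 3, size=(100, 2))
    for _ in range(2000):
        X = cbo_step(X, f)
    print("consensus point ~", X.mean(axis=0))
```

In the paper's two-step analysis, the drift term is what drives all particles to a common consensus state, while the condition on the system parameters and initial data ensures that this state lands in a small neighborhood of the global minimizer almost surely.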