Asymptotic proximal point methods: finding the global minima with linear convergence for a class of multiple minima problems (2004.02210v4)

Published 5 Apr 2020 in math.OC, cs.NA, and math.NA

Abstract: We propose and analyze asymptotic proximal point (APP) methods to find the global minimizer for a class of nonconvex, nonsmooth, or even discontinuous multiple-minima functions. The method is based on an asymptotic representation of nonconvex proximal points, so it can find the global minimizer without being trapped at saddle points, local minima, or even discontinuities. Our main result shows that the method enjoys global linear convergence for this class of functions. Furthermore, the method is derivative-free and its per-iteration cost, i.e., the number of function evaluations, is bounded, so it has a complexity bound of $\mathcal{O}(\log\frac{1}{\epsilon})$ for finding a point whose gap from the global minimizer is less than $\epsilon>0$. Numerical experiments and comparisons in dimensions from $2$ to $500$ demonstrate the benefits of the method.
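To make the underlying idea concrete, here is a minimal 1-D sketch of a classical proximal point iteration, with each proximal subproblem solved derivative-free by grid search. This illustrates the proximal-point machinery the abstract refers to; it is not the authors' APP method, and the objective `f`, the step parameter `lam`, and the search window are illustrative choices, not values from the paper.

```python
import math

# Hypothetical sketch (not the paper's APP method): classical proximal
# point iteration x_{k+1} = argmin_y f(y) + (1/(2*lam)) * (y - x_k)**2,
# with the subproblem solved derivative-free over a sampling window.

def prox_step(f, x, lam=0.5, radius=2.0, n=2001):
    """Approximate argmin_y f(y) + (y - x)**2 / (2*lam) on [x-radius, x+radius]
    by dense grid search (no derivatives required)."""
    best_y, best_val = x, f(x)
    for i in range(n):
        y = x - radius + 2.0 * radius * i / (n - 1)
        val = f(y) + (y - x) ** 2 / (2.0 * lam)
        if val < best_val:
            best_y, best_val = y, val
    return best_y

def proximal_point(f, x0, iters=50, lam=0.5):
    """Run the proximal point iteration for a fixed budget of steps."""
    x = x0
    for _ in range(iters):
        x = prox_step(f, x, lam)
    return x

# Example: a nonconvex objective with many local minima; global minimum at 0.
f = lambda x: x * x + 2.0 * (1.0 - math.cos(3.0 * x))
x_star = proximal_point(f, x0=4.0)  # iterates escape local minima toward 0
```

Because each subproblem is minimized over a whole window rather than by a local gradient step, the iterates can step over local minima, which is the intuition behind the paper's claim of reaching the global minimizer; the APP method itself achieves this with a bounded number of function evaluations per iteration.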
