Derivative-free global minimization for a class of multiple minima problems (2006.08181v2)

Published 15 Jun 2020 in math.OC, cs.CC, cs.NA, and math.NA

Abstract: We prove that finite-difference based derivative-free descent (FD-DFD) methods are capable of finding the global minima for a class of multiple-minima problems. Our main result shows that, for a class of multiple-minima objectives extended from strongly convex functions with Lipschitz-continuous gradients, the iterates of FD-DFD converge to the global minimizer $x_*$ with linear convergence $\|x_{k+1}-x_*\|_2^2\leqslant\rho^k \|x_1-x_*\|_2^2$ for a fixed $0<\rho<1$ and any initial iterate $x_1\in\mathbb{R}^d$ when the parameters are properly selected. Since the per-iteration cost, i.e., the number of function evaluations, is fixed and almost independent of the dimension $d$, the FD-DFD algorithm has a complexity bound of $\mathcal{O}(\log\frac{1}{\epsilon})$ for finding a point $x$ such that the optimality gap $\|x-x_*\|_2^2$ is less than $\epsilon>0$. Numerical experiments in dimensions from $5$ to $500$ demonstrate the benefits of the FD-DFD method.
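The abstract does not spell out the update rule, but the following minimal sketch shows one common way a finite-difference derivative-free descent iteration can keep the per-iteration function-evaluation count fixed and nearly independent of $d$: estimate a descent direction from central differences along a few random unit directions. All names and parameters here (fd_dfd, step, lr, n_dirs) are illustrative assumptions, not the paper's notation, and the randomized-direction scheme is a stand-in for whatever specific difference scheme the paper analyzes.

```python
import numpy as np

def fd_dfd(f, x1, step=1e-4, lr=0.05, n_dirs=2, n_iters=500, seed=0):
    """Sketch of a finite-difference derivative-free descent loop.

    Approximates a descent direction from central finite differences
    along n_dirs random unit directions, so each iteration costs
    2 * n_dirs function evaluations regardless of the dimension d.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x1, dtype=float)
    for _ in range(n_iters):
        g = np.zeros_like(x)
        for _ in range(n_dirs):
            u = rng.standard_normal(x.shape)
            u /= np.linalg.norm(u)
            # Central difference along u estimates the directional derivative.
            g += (f(x + step * u) - f(x - step * u)) / (2.0 * step) * u
        x = x - lr * g / n_dirs
    return x

# Usage: a strongly convex quadratic perturbed by an oscillatory term,
# giving multiple local minima, in dimension d = 5.
if __name__ == "__main__":
    d = 5
    f = lambda x: np.sum(x**2) + 0.5 * np.sum(np.sin(5 * x)**2)
    x_min = fd_dfd(f, np.full(d, 2.0))
    print(x_min, f(x_min))
```

Under the paper's assumptions, a geometric contraction of the squared error per iteration, combined with a fixed number of evaluations per iteration, is what yields the stated $\mathcal{O}(\log\frac{1}{\epsilon})$ evaluation complexity.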

Citations (1)
