Derivative-free global minimization for a class of multiple minima problems

Published 15 Jun 2020 in math.OC, cs.CC, cs.NA, and math.NA | arXiv:2006.08181v2

Abstract: We prove that finite-difference based derivative-free descent (FD-DFD) methods are capable of finding the global minima for a class of multiple-minima problems. Our main result shows that, for a class of multiple-minima objectives extended from strongly convex functions with Lipschitz-continuous gradients, the iterates of FD-DFD converge to the global minimizer $x_*$ with the linear convergence $\|x_{k+1}-x_*\|_2^2\leqslant\rho^k\|x_1-x_*\|_2^2$ for a fixed $0<\rho<1$ and any initial iterate $x_1\in\mathbb{R}^d$ when the parameters are properly selected. Since the per-iteration cost, i.e., the number of function evaluations, is fixed and almost independent of the dimension $d$, the FD-DFD algorithm has a complexity bound $\mathcal{O}(\log\frac{1}{\epsilon})$ for finding a point $x$ such that the optimality gap $\|x-x_*\|_2^2$ is less than $\epsilon>0$. Numerical experiments in dimensions ranging from $5$ to $500$ demonstrate the benefits of the FD-DFD method.
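The abstract notes that the per-iteration cost of FD-DFD is fixed and nearly independent of the dimension $d$, which is consistent with a random-direction finite-difference gradient estimator rather than a coordinate-wise one. The following is a minimal sketch of such a descent loop; the central-difference estimator, step size `eta`, smoothing parameter `h`, sample count `m`, and iteration budget are illustrative assumptions, not the paper's tuned choices or convergence-guaranteeing parameters.

```python
import numpy as np

def fd_dfd(f, x1, h=1e-4, eta=0.05, n_iters=2000, m=2, seed=0):
    """Sketch of finite-difference based derivative-free descent.

    Each iteration averages m central-difference directional-derivative
    estimates along random unit directions, so the per-iteration cost
    (2*m function evaluations) is independent of the dimension d.
    All parameter values are illustrative, not the paper's choices.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x1, dtype=float)
    d = x.size
    for _ in range(n_iters):
        g = np.zeros(d)
        for _ in range(m):
            u = rng.standard_normal(d)
            u /= np.linalg.norm(u)  # random unit direction
            # central-difference estimate of the directional derivative
            g += (f(x + h * u) - f(x - h * u)) / (2 * h) * u
        # d/m rescales the averaged directional estimates into an
        # (approximately) unbiased gradient estimate
        x = x - eta * (d / m) * g
    return x
```

On a smooth single-minimum test function such as a quadratic, this sketch converges linearly toward the minimizer from any starting point, matching the flavor of the convergence guarantee stated in the abstract.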

