Worst case complexity bounds for linesearch-type derivative-free algorithms (2302.05274v2)
Abstract: This paper is devoted to the analysis of worst case complexity bounds for linesearch-type derivative-free algorithms for the minimization of general non-convex smooth functions. We prove that two linesearch-type algorithms enjoy the same complexity properties that have been proved for pattern-search and direct-search algorithms. In particular, we consider two derivative-free algorithms based on two different linesearch techniques and prove that the number of iterations and function evaluations required to drive the norm of the gradient of the objective function below a given threshold $\epsilon$ is ${\cal O}(\epsilon^{-2})$ in the worst case.
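To make the setting concrete, the sketch below shows a generic derivative-free coordinate linesearch of the kind the abstract refers to: a tentative step along a coordinate direction is accepted only if it passes a sufficient-decrease test of the assumed form $f(x+\alpha d) \le f(x) - \gamma\alpha^2$, accepted steps are extrapolated, and rejected ones are shrunk, with small step sizes serving as a surrogate for a small gradient norm. The function name, parameters, and acceptance test here are illustrative assumptions, not the exact algorithms analyzed in the paper.

```python
import numpy as np

def df_linesearch_minimize(f, x0, alpha0=1.0, gamma=1e-6,
                           theta=0.5, delta=2.0, tol=1e-6, max_iter=10_000):
    """Illustrative derivative-free coordinate linesearch (not the paper's exact method).

    A step alpha along +/- e_i is accepted only if it satisfies the
    sufficient-decrease test f(x + alpha*d) <= f(x) - gamma*alpha**2.
    Accepted steps are extrapolated (alpha *= delta); if neither direction
    yields sufficient decrease, the tentative step is shrunk (alpha *= theta).
    The loop stops once all tentative steps fall below `tol`.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    alphas = np.full(n, alpha0)          # one tentative step size per coordinate
    fx = f(x)

    for _ in range(max_iter):
        if np.max(alphas) < tol:         # all step sizes tiny: approximate stationarity
            break
        for i in range(n):
            d = np.zeros(n)
            d[i] = 1.0
            moved = False
            for sign in (+1.0, -1.0):
                alpha = alphas[i]
                if f(x + sign * alpha * d) <= fx - gamma * alpha**2:
                    # extrapolation: enlarge alpha while sufficient decrease still holds
                    while f(x + sign * delta * alpha * d) <= fx - gamma * (delta * alpha)**2:
                        alpha *= delta
                    x = x + sign * alpha * d
                    fx = f(x)
                    alphas[i] = alpha
                    moved = True
                    break
            if not moved:
                alphas[i] *= theta       # shrink the tentative step on failure
    return x, fx
```

As a quick usage check, `df_linesearch_minimize(lambda z: np.sum((z - 1.0)**2), np.zeros(3))` converges to the minimizer of a simple quadratic using only function values; the worst case analysis in the paper bounds how many such evaluations are needed before the gradient norm drops below $\epsilon$.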