A Principle for Global Optimization with Gradients (2308.09556v1)
Published 18 Aug 2023 in math.OC, cs.NA, cs.NE, math.NA, and stat.ML
Abstract: This work demonstrates the utility of gradients for the global optimization of certain differentiable functions with many suboptimal local minima. To this end, a principle for generating search directions from non-local quadratic approximants based on gradients of the objective function is analyzed. Experiments measure the quality of non-local search directions as well as the performance of a proposed simplistic algorithm, of the covariance matrix adaptation evolution strategy (CMA-ES), and of a randomly reinitialized Broyden-Fletcher-Goldfarb-Shanno (BFGS) method.
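The sketch below is a minimal, illustrative reading of the idea named in the abstract: fit a quadratic approximant between two non-local points using only the gradients of the objective at those points, and use its stationary point as a candidate step. It is not the paper's algorithm; the one-dimensional fit along the segment, the acceptance rule, and the Rastrigin test function are all assumptions made here for illustration.

```python
import numpy as np

def rastrigin(x):
    """Classic multimodal test function (illustrative choice, not from the paper)."""
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

def rastrigin_grad(x):
    """Gradient of the Rastrigin function."""
    return 2.0 * x + 20.0 * np.pi * np.sin(2.0 * np.pi * x)

def quadratic_step(x, y, grad):
    """Candidate point from a non-local quadratic fit along the segment [x, y].

    The directional derivatives of the objective at x and y determine a unique
    quadratic q(t) on the line x + t*(y - x) with q'(0) = a and q'(1) = b;
    its stationary point t* = a / (a - b) gives the candidate point.
    """
    d = y - x
    a = grad(x) @ d          # directional derivative at x
    b = grad(y) @ d          # directional derivative at y
    if np.isclose(a, b):     # degenerate: no curvature along the segment
        return None
    t_star = a / (a - b)
    return x + t_star * d

# Hypothetical usage: probe distant points at random and accept the
# quadratic minimizer whenever it improves the objective.
rng = np.random.default_rng(0)
x = rng.uniform(-5.12, 5.12, size=10)
for _ in range(200):
    y = x + rng.normal(scale=2.0, size=x.size)   # non-local probe point
    cand = quadratic_step(x, y, rastrigin_grad)
    if cand is not None and rastrigin(cand) < rastrigin(x):
        x = cand
print("best value found:", rastrigin(x))
```

Because the approximant is fitted between well-separated points rather than from local curvature, the resulting step can jump over intervening suboptimal local minima, which is the behaviour the paper's experiments compare against CMA-ES and randomly reinitialized BFGS.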