SAGE: A Set-based Adaptive Gradient Estimator (2508.19400v1)
Abstract: A new paradigm for estimating the gradient of a black-box scalar function is introduced: the gradient is treated as a member of a set of admissible gradients computed from existing function samples. Results on gradient estimate accuracy, derived from a multivariate Taylor series analysis, are used to express the set of admissible gradients through linear inequalities. An adaptive sampling strategy is also proposed to refine this set to a desired precision. The resulting framework makes it possible to estimate gradients from data sets affected by bounded noise, to characterize the best theoretically attainable estimate accuracy, and to derive the optimal sampling distance from the point of interest for refining the gradient set estimate. Using these results, a new algorithm is proposed, named Set-based Adaptive Gradient Estimator (SAGE), which features both sample efficiency and robustness to noise. The performance of SAGE is demonstrated by comparing it with commonly used and recent gradient estimators from the literature and practice, in the context of numerical optimization with a first-order method. An extensive statistical test shows that SAGE performs competitively on noiseless data and emerges as the best method under high noise bounds, where other gradient estimators incur large errors.
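To make the set-based idea concrete, the following is a minimal sketch of one way the admissible gradient set described in the abstract could be built and queried; it is not the paper's implementation. Assuming a known measurement-noise bound `eps` and a Hessian-norm bound `M`, a first-order Taylor expansion around the point of interest implies that each sample pair constrains the gradient `g` through a pair of linear inequalities, and a bounding box of the resulting polytope can be computed with linear programming. The function name `admissible_gradient_box` and all parameter names are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def admissible_gradient_box(x0, y0, X, Y, eps, M):
    """Bound each component of the gradient at x0 via the polytope of
    admissible gradients implied by noisy samples (X[i], Y[i]).

    Hypothetical sketch, not the paper's algorithm. Assumes
    |measurement noise| <= eps and ||Hessian|| <= M, so the first-order
    Taylor residual obeys
        |Y[i] - y0 - g . (X[i] - x0)| <= 2*eps + 0.5*M*||X[i] - x0||**2,
    which is a pair of linear inequalities in g per sample.
    Returns per-component lower/upper bounds (a box around the set).
    """
    D = X - x0                                   # displacements, shape (m, n)
    dY = Y - y0                                  # measured differences, (m,)
    slack = 2.0 * eps + 0.5 * M * np.sum(D**2, axis=1)

    # Stack the inequalities  D g <= dY + slack  and  -D g <= -dY + slack.
    A = np.vstack([D, -D])
    b = np.concatenate([dY + slack, -dY + slack])

    n = X.shape[1]
    lo, hi = np.empty(n), np.empty(n)
    free = [(None, None)] * n                    # gradient components unbounded a priori
    for k in range(n):
        c = np.zeros(n)
        c[k] = 1.0
        lo[k] = linprog(c, A_ub=A, b_ub=b, bounds=free).fun    # min g_k
        hi[k] = -linprog(-c, A_ub=A, b_ub=b, bounds=free).fun  # max g_k
    return lo, hi

# Usage on a hypothetical test function with true gradient (2, 3) at x0.
f = lambda x: x[0] ** 2 + 3.0 * x[1]
x0 = np.array([1.0, 2.0])
X = x0 + 0.1 * np.array([[1, 0], [0, 1], [-1, 0], [0, -1]], dtype=float)
rng = np.random.default_rng(0)
eps = 1e-3
Y = np.array([f(x) for x in X]) + rng.uniform(-eps, eps, len(X))
y0 = f(x0) + rng.uniform(-eps, eps)
print(admissible_gradient_box(x0, y0, X, Y, eps, M=2.0))
```

The box shrinks as samples are taken closer to `x0` (smaller Taylor residual) but widens again once the noise term `2*eps` dominates the measured differences, which is consistent with the abstract's claim that an optimal sampling distance exists.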