SAGE: A Set-based Adaptive Gradient Estimator (2508.19400v1)

Published 26 Aug 2025 in math.OC

Abstract: A new paradigm for estimating the gradient of a black-box scalar function is introduced, treating the gradient as a member of a set of admissible gradients computed from existing function samples. Accuracy results derived from a multivariate Taylor series analysis are used to express this set of admissible gradients through linear inequalities. An adaptive sampling procedure is also proposed to refine the gradient estimate set to a desired precision. The resulting framework allows one to estimate gradients from data sets affected by noise with finite bounds, to quantify the theoretically best attainable gradient estimate accuracy, and to determine the optimal sampling distance from the point of interest for the best refinement of the gradient set estimates. Using these results, a new algorithm is proposed, named Set-based Adaptive Gradient Estimator (SAGE), which features both sample efficiency and robustness to noise. The performance of SAGE is demonstrated by comparing it with commonly used and recent gradient estimators from the literature and practice, in the context of numerical optimization with a first-order method. The results of an extensive statistical test show that SAGE performs competitively on noiseless data and emerges as the best method under high noise bounds, where other gradient estimators incur large errors.
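The abstract describes the core mechanism at a high level: bounded Taylor remainders and bounded measurement noise turn each function sample into a pair of linear inequalities on the unknown gradient, and their intersection is the set of admissible gradients. The Python sketch below is a hypothetical illustration of that idea, not the paper's implementation: the coordinate-wise ±h sampling pattern, the `hess_bound` curvature constant, and the choice of the Chebyshev center as a point estimate are assumptions introduced here for clarity, and the paper's adaptive sampling and refinement scheme is not reproduced.

```python
# Hypothetical sketch (not the authors' code): build a polyhedral set of
# admissible gradients from samples of a black-box f around x0, then take
# the Chebyshev center of that set as a point gradient estimate.
import numpy as np
from scipy.optimize import linprog

def set_based_gradient_estimate(f, x0, h=1e-2, noise_bound=1e-3, hess_bound=1.0):
    """Assumed bound per sample offset d_i (second-order Taylor remainder
    plus two noisy evaluations):
        |f(x0 + d_i) - f(x0) - g . d_i| <= 0.5*hess_bound*||d_i||^2 + 2*noise_bound
    Each such bound gives two linear inequalities on the gradient g."""
    n = len(x0)
    f0 = f(x0)
    D = h * np.vstack([np.eye(n), -np.eye(n)])        # forward/backward offsets
    dy = np.array([f(x0 + d) - f0 for d in D])        # observed differences
    slack = 0.5 * hess_bound * np.sum(D**2, axis=1) + 2.0 * noise_bound

    # Admissible set {g : |D g - dy| <= slack}, written as A g <= rhs
    A = np.vstack([D, -D])
    rhs = np.concatenate([dy + slack, slack - dy])

    # Chebyshev center: maximize radius r s.t. A_j g + r*||A_j|| <= rhs_j
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    res = linprog(c=np.r_[np.zeros(n), -1.0],          # minimize -r
                  A_ub=np.hstack([A, norms]), b_ub=rhs,
                  bounds=[(None, None)] * n + [(0, None)])
    return res.x[:n]

# Example: f(x) = x0^2 + 3*x1 has gradient [2, 3] at [1, 2]
g_hat = set_based_gradient_estimate(lambda x: x[0]**2 + 3.0 * x[1],
                                    np.array([1.0, 2.0]))
print(g_hat)  # close to [2.0, 3.0]
```

In the paper, the sampling distance and the refinement of the set are chosen adaptively from the derived accuracy bounds; the fixed step h above merely stands in for that step.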
