Mixed Finite Differences Scheme for Gradient Approximation (2105.09606v1)

Published 20 May 2021 in math.OC

Abstract: In this paper we focus on the linear functionals that define an approximate version of the gradient of a function. These functionals are often used in optimization problems where computing the gradient of the objective function is costly, or where the objective function values are affected by noise. They have been used to estimate the gradient of the objective function as the expected value of the function's variations over the space of directions; the expected value is then approximated by a sample average over a suitable (random) choice of sample directions in the domain of integration. The approximation error is therefore characterized by the statistical properties of the sample-average estimate, typically its variance. This work instead derives a new approximation scheme in which the linear functionals are no longer viewed as expected values over the space of directions, but as the derivative of the objective function filtered by a Gaussian kernel. With this approach, we propose a gradient estimate based on a suitable linear combination of central finite differences at different step sizes, which allows the approximation error to be characterized deterministically. Numerical experiments on a set of test functions are encouraging, showing good performance compared to methods commonly used in the literature, in both the noisy and the noise-free settings.
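The abstract contrasts two constructions, so some standard notation helps fix ideas. The expectation-based estimators it refers to typically take the Gaussian-smoothing form (a standard construction in this literature; the paper's exact functionals may differ):

\[
f_\sigma(x) = \mathbb{E}_{u \sim \mathcal{N}(0, I)}\big[ f(x + \sigma u) \big],
\qquad
\nabla f_\sigma(x) = \mathbb{E}_{u \sim \mathcal{N}(0, I)}\left[ \frac{f(x + \sigma u) - f(x - \sigma u)}{2\sigma}\, u \right],
\]

with the expectation replaced in practice by a sample average over N random directions, whose error is governed by the estimator's variance.

As a rough illustration of the deterministic alternative, the sketch below combines central finite differences at two step sizes. The abstract does not give the scheme's coefficients, so the Richardson-extrapolation weights (4 D(h/2) - D(h))/3 used here are an assumption chosen to cancel the leading O(h^2) error term; `mixed_fd_gradient` is a hypothetical name, not the paper's implementation.

```python
import numpy as np

def central_diff(f, x, h, i):
    """Central finite difference of f along coordinate i with step h."""
    e = np.zeros_like(x, dtype=float)
    e[i] = h
    return (f(x + e) - f(x - e)) / (2.0 * h)

def mixed_fd_gradient(f, x, h=1e-3):
    """Illustrative gradient estimate: per coordinate, combine central
    differences at steps h and h/2 with Richardson weights
    (4*D(h/2) - D(h))/3, cancelling the O(h^2) truncation term.
    These weights are illustrative assumptions, not the coefficients
    derived in the paper."""
    g = np.empty(x.size, dtype=float)
    for i in range(x.size):
        d_h = central_diff(f, x, h, i)
        d_h2 = central_diff(f, x, h / 2.0, i)
        g[i] = (4.0 * d_h2 - d_h) / 3.0
    return g

if __name__ == "__main__":
    f = lambda x: np.sin(x[0]) + x[1] ** 2   # smooth test function
    x0 = np.array([0.7, -1.2])
    print(mixed_fd_gradient(f, x0))          # approx [cos(0.7), -2.4]
```

With these weights the per-coordinate truncation error drops from O(h^2) to O(h^4), which conveys the deterministic flavor of error control the abstract claims, in contrast to the variance-driven error of the sample-average estimators.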
