Gradient and Hessian approximations in Derivative Free Optimization (2001.08355v1)

Published 23 Jan 2020 in math.OC

Abstract: This work investigates finite differences and the use of interpolation models to obtain approximations to the first and second derivatives of a function. Here, it is shown that if a particular set of points is used in the interpolation model, then the solution to the associated linear system (i.e., approximations to the gradient and diagonal of the Hessian) can be obtained in $\mathcal{O}(n)$ computations, which is the same cost as finite differences, and is a saving over the $\mathcal{O}(n^3)$ cost when solving a general unstructured linear system. Moreover, if the interpolation points are formed using a "regular minimal positive basis", then the error bound for the gradient approximation is the same as for a finite differences approximation. Numerical experiments are presented that show how the derivative estimates can be employed within an existing derivative free optimization algorithm, thus demonstrating one of the potential practical uses of these derivative approximations.
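To illustrate the kind of approximation the abstract describes, here is a minimal sketch (not the paper's exact "regular minimal positive basis" construction) in which a coordinate-aligned sample set $\{x,\, x \pm h e_i\}$ makes the interpolation system decouple, so the gradient and the diagonal of the Hessian are recovered in $\mathcal{O}(n)$ arithmetic rather than via an $\mathcal{O}(n^3)$ unstructured solve. The function name and step size are illustrative choices, not from the paper.

```python
import numpy as np

def grad_and_hess_diag(f, x, h=1e-4):
    """Approximate the gradient and Hessian diagonal of f at x.

    Samples f at the 2n+1 points {x, x +/- h*e_i}. With this
    structured (coordinate-aligned) point set, the interpolation
    system decouples coordinate-by-coordinate, so the "solve" is
    O(n) arithmetic: each component reduces to a central-difference
    formula. Illustrative sketch only, assuming f maps an n-vector
    to a scalar.
    """
    n = x.size
    f0 = f(x)
    g = np.empty(n)   # gradient approximation
    d = np.empty(n)   # diagonal Hessian approximation
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        fp, fm = f(x + e), f(x - e)
        g[i] = (fp - fm) / (2.0 * h)           # central difference, O(h^2) error
        d[i] = (fp - 2.0 * f0 + fm) / h**2     # second central difference
    return g, d
```

For a quadratic such as $f(x) = x_1^2 + 3x_2^2$, the formulas are exact up to rounding: at $x = (1, 2)$ the sketch returns a gradient near $(2, 12)$ and a Hessian diagonal near $(2, 6)$.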
