Efficient Calculation of Regular Simplex Gradients (1710.01427v2)
Abstract: Simplex gradients are an essential feature of many derivative-free optimization algorithms, and can be employed, for example, as part of the process of defining a direction of search, or as part of a termination criterion. The calculation of a general simplex gradient in $\mathbb{R}^n$ can be computationally expensive, and often requires an overhead operation count of $\mathcal{O}(n^3)$ and in some algorithms a storage overhead of $\mathcal{O}(n^2)$. In this work we demonstrate that the linear algebra overhead and storage costs can be reduced, both to $\mathcal{O}(n)$, when the simplex employed is regular and appropriately aligned. We also demonstrate that a second-order gradient approximation can be obtained cheaply from a combination of two first-order (appropriately aligned) regular simplex gradients. Moreover, we show that, for an arbitrarily aligned regular simplex, the gradient can be computed in only $\mathcal{O}(n^2)$ operations.
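For context, the baseline cost the paper improves on comes from the standard definition of a simplex gradient: given a simplex with vertices $x_0, x_1, \dots, x_n$, form the edge matrix $S = [x_1 - x_0, \dots, x_n - x_0]$ and solve $S^\top g = \delta f$, where $\delta f$ collects the function differences $f(x_i) - f(x_0)$. A minimal sketch of this general (not the paper's reduced-cost) computation, with all names chosen for illustration:

```python
import numpy as np

def simplex_gradient(x0, pts, f):
    """General simplex gradient at x0 from the n other vertices in pts.

    Solves S^T g = delta_f, where S has columns x_i - x0. For a general
    simplex this linear solve is the O(n^3) overhead noted in the abstract.
    """
    S = np.stack([p - x0 for p in pts], axis=1)          # edge directions
    delta_f = np.array([f(p) - f(x0) for p in pts])      # function differences
    return np.linalg.solve(S.T, delta_f)

# Usage: for an affine f, the simplex gradient recovers the true gradient.
a = np.array([2.0, -1.0, 0.5])
f = lambda x: a @ x + 3.0
x0 = np.zeros(3)
pts = [np.eye(3)[i] for i in range(3)]   # unit-coordinate simplex vertices
g = simplex_gradient(x0, pts, f)         # equals a (up to rounding)
```

The paper's contribution is to avoid this dense solve: for a regular, appropriately aligned simplex the same $g$ can be assembled in $\mathcal{O}(n)$ operations, and in $\mathcal{O}(n^2)$ for an arbitrarily aligned regular simplex.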