
Randomized Learning of the Second-Moment Matrix of a Smooth Function (1612.06339v6)

Published 19 Dec 2016 in cs.IT and math.IT

Abstract: Consider an open set $\mathbb{D}\subseteq\mathbb{R}^n$, equipped with a probability measure $\mu$. An important characteristic of a smooth function $f:\mathbb{D}\rightarrow\mathbb{R}$ is its \emph{second-moment matrix} $\Sigma_{\mu}:=\int \nabla f(x)\, \nabla f(x)^* \,\mu(dx) \in\mathbb{R}^{n\times n}$, where $\nabla f(x)\in\mathbb{R}^n$ is the gradient of $f(\cdot)$ at $x\in\mathbb{D}$ and $*$ stands for transpose. For instance, the span of the leading $r$ eigenvectors of $\Sigma_{\mu}$ forms an \emph{active subspace} of $f(\cdot)$, which contains the directions along which $f(\cdot)$ changes the most and is of particular interest in \emph{ridge approximation}. In this work, we propose a simple algorithm for estimating $\Sigma_{\mu}$ from random point evaluations of $f(\cdot)$ \emph{without} imposing any structural assumptions on $\Sigma_{\mu}$. Theoretical guarantees for this algorithm are established with the aid of the same technical tools that have proved valuable in the context of covariance matrix estimation from partial measurements.
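The paper's own estimator is not reproduced in this abstract. As a rough illustration of the objects involved, here is a naive Monte Carlo sketch: it approximates $\Sigma_{\mu}$ by averaging outer products of finite-difference gradient estimates built from point evaluations of $f$, and takes the leading eigenvectors of the result as an active subspace. The function names, the uniform measure on $[0,1]^n$, and the finite-difference scheme are all illustrative assumptions, not the algorithm analyzed in the paper.

```python
import numpy as np

def estimate_second_moment(f, n, n_samples=2000, h=1e-5, rng=None):
    """Monte Carlo estimate of Sigma = E[grad f(x) grad f(x)^T].

    Illustrative only: gradients are approximated by central finite
    differences from point evaluations of f, and mu is taken to be the
    uniform measure on [0, 1]^n. This is NOT the paper's estimator.
    """
    rng = np.random.default_rng(rng)
    sigma = np.zeros((n, n))
    for _ in range(n_samples):
        x = rng.random(n)  # draw x ~ uniform on [0, 1]^n
        grad = np.empty(n)
        for i in range(n):
            e = np.zeros(n)
            e[i] = h
            # central finite difference along coordinate i
            grad[i] = (f(x + e) - f(x - e)) / (2 * h)
        sigma += np.outer(grad, grad)
    return sigma / n_samples

def active_subspace(sigma, r):
    """Span of the leading r eigenvectors of (an estimate of) Sigma."""
    vals, vecs = np.linalg.eigh(sigma)
    order = np.argsort(vals)[::-1]  # sort eigenvalues descending
    return vecs[:, order[:r]]
```

For a linear ridge function $f(x) = a^* x$, the gradient is constant, so $\Sigma_{\mu} = a a^*$ exactly and the one-dimensional active subspace is spanned by $a/\|a\|$, which makes the sketch easy to sanity-check.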

Citations (3)
