
Data sparse multilevel covariance estimation in optimal complexity (2301.11992v1)

Published 27 Jan 2023 in math.NA and cs.NA

Abstract: We consider the $\mathcal{H}^2$-formatted compression and computational estimation of covariance functions on a compact set in $\mathbb{R}^d$. The classical sample covariance or Monte Carlo estimator is prohibitively expensive for many practically relevant problems, where approximation spaces with many degrees of freedom and many samples for the estimator are often needed. In this article, we propose and analyze a data sparse multilevel sample covariance estimator, i.e., a multilevel Monte Carlo estimator. For this purpose, we generalize the notion of asymptotically smooth kernel functions to a Gevrey-type class of kernels, for which we derive new variable-order $\mathcal{H}^2$-approximation rates. These variable-order $\mathcal{H}^2$-approximations can be considered a variant of $hp$-approximations. Our multilevel sample covariance estimator then uses an approximate multilevel hierarchy of variable-order $\mathcal{H}^2$-approximations to compress the sample covariances on each level. The non-nestedness of the different levels makes the reduction to the final estimator nontrivial, and we present a suitable algorithm which handles this task in linear complexity. This allows for a data sparse multilevel estimator of Gevrey covariance kernel functions in the best possible complexity for Monte Carlo-type multilevel estimators, which is quadratic. Numerical examples which estimate covariance matrices with tens of billions of entries are presented.
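To make the multilevel Monte Carlo idea concrete, the following is a minimal one-dimensional sketch, not the paper's algorithm: it uses nested dyadic grids, dense matrices, and a hypothetical smooth kernel in place of the variable-order $\mathcal{H}^2$-compressed, possibly non-nested hierarchy the paper analyzes. The covariance is estimated as a telescoping sum $C_0 + \sum_\ell (C_\ell - C_{\ell-1})$, where each level difference is computed from its own batch of samples and the sample counts decrease as the grids refine.

```python
import numpy as np

def kernel(x, y):
    # Smooth toy covariance; a hypothetical stand-in for the
    # Gevrey-class kernels treated in the paper.
    return np.exp(-(x[:, None] - y[None, :]) ** 2)

def sample_field(x, m, rng):
    # Draw m Gaussian samples with covariance kernel(x, x) via an
    # eigendecomposition (robust to a nearly singular kernel matrix).
    w, V = np.linalg.eigh(kernel(x, x))
    noise = rng.standard_normal((len(x), m))
    return V @ (np.sqrt(np.clip(w, 0.0, None))[:, None] * noise)

def sample_cov(U):
    # Centered sample covariance; columns of U are i.i.d. samples.
    Z = U - U.mean(axis=1, keepdims=True)
    return (Z @ Z.T) / (U.shape[1] - 1)

def prolongation(x_fine, x_coarse):
    # Piecewise-linear interpolation matrix from x_coarse onto x_fine.
    P = np.zeros((len(x_fine), len(x_coarse)))
    for j in range(len(x_coarse)):
        e = np.zeros(len(x_coarse))
        e[j] = 1.0
        P[:, j] = np.interp(x_fine, x_coarse, e)
    return P

def mlmc_covariance(n_levels, m0, rng):
    """Multilevel sample covariance on nested dyadic grids of [0, 1]:
    est = C_0 + sum_l (C_l - C_{l-1}), each level difference estimated
    from its own samples, with sample counts halving per refinement."""
    n_fine = 2 ** n_levels + 1
    x = np.linspace(0.0, 1.0, n_fine)
    est = np.zeros((n_fine, n_fine))
    for ell in range(n_levels + 1):
        m_ell = max(2, m0 // 2 ** ell)         # samples on level ell
        stride = 2 ** (n_levels - ell)
        idx = np.arange(0, n_fine, stride)     # level-ell grid points
        U = sample_field(x[idx], m_ell, rng)
        P = prolongation(x, x[idx])
        contrib = P @ sample_cov(U) @ P.T
        if ell > 0:
            # subtract the coarse-grid covariance of the SAME samples;
            # level-(ell-1) points are every other level-ell point
            idx_c = np.arange(0, len(idx), 2)
            Pc = prolongation(x, x[idx][idx_c])
            contrib -= Pc @ sample_cov(U[idx_c]) @ Pc.T
        est += contrib
    return x, est

rng = np.random.default_rng(0)
x, est = mlmc_covariance(n_levels=5, m0=4096, rng=rng)
print("max entrywise error:", np.max(np.abs(est - kernel(x, x))))
```

Because the levels telescope, the bias cancels in expectation and only the finest grid's discretization remains, while most samples are spent on the cheap coarse levels; the paper replaces the dense per-level covariances above with variable-order $\mathcal{H}^2$-approximations and resolves the non-nestedness of the levels in linear complexity.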

