Data sparse multilevel covariance estimation in optimal complexity

Published 27 Jan 2023 in math.NA and cs.NA | (arXiv:2301.11992v1)

Abstract: We consider the $\mathcal{H}^2$-formatted compression and computational estimation of covariance functions on a compact set in $\mathbb{R}^d$. The classical sample covariance or Monte Carlo estimator is prohibitively expensive for many practically relevant problems, which often require approximation spaces with many degrees of freedom and many samples for the estimator. In this article, we propose and analyze a data sparse multilevel sample covariance estimator, i.e., a multilevel Monte Carlo estimator. For this purpose, we generalize the notion of asymptotically smooth kernel functions to a Gevrey type class of kernels for which we derive new variable-order $\mathcal{H}^2$-approximation rates. These variable-order $\mathcal{H}^2$-approximations can be considered as a variant of $hp$-approximations. Our multilevel sample covariance estimator then uses an approximate multilevel hierarchy of variable-order $\mathcal{H}^2$-approximations to compress the sample covariances on each level. The non-nestedness of the different levels makes the reduction to the final estimator nontrivial, and we present a suitable algorithm which handles this task in linear complexity. This allows for a data sparse multilevel estimator of Gevrey covariance kernel functions in the best possible complexity for Monte Carlo type multilevel estimators, which is quadratic. Numerical examples estimating covariance matrices with tens of billions of entries are presented.
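
The estimator described in the abstract is, at its core, a multilevel Monte Carlo telescoping sum of sample covariances over a hierarchy of discretization levels. The following Python sketch illustrates only that telescoping structure with dense matrices and should not be read as the paper's method: `sample_pair`, `dofs`, `n_samples`, and the `embed` transfer are hypothetical placeholders, whereas the paper compresses each level in the variable-order $\mathcal{H}^2$ format and reduces the non-nested levels to the final estimator with a dedicated linear-complexity algorithm.

```python
import numpy as np


def sample_cov(X):
    """Plain (dense) sample covariance of M centered samples stacked row-wise."""
    return X.T @ X / X.shape[0]


def mlmc_covariance(sample_pair, dofs, n_samples):
    """Dense multilevel Monte Carlo covariance estimator (telescoping sum).

    sample_pair(l, M): hypothetical sampler returning M realizations of the
        random field discretized on level l and, for l > 0, the SAME
        realizations discretized on level l-1, as arrays of shape
        (M, dofs[l]) and (M, dofs[l-1]); the coarse array is None for l = 0.
    dofs: degrees of freedom per level, dofs[-1] belongs to the finest level.
    n_samples: Monte Carlo samples per level (typically decreasing in l).
    """
    N = dofs[-1]

    def embed(C, n):
        # Illustrative transfer of a coarse covariance block to a finer level:
        # injection into the leading block. The paper instead handles the
        # non-nested levels with a dedicated linear-complexity algorithm.
        E = np.zeros((n, n))
        E[: C.shape[0], : C.shape[1]] = C
        return E

    Xf, _ = sample_pair(0, n_samples[0])
    estimate = embed(sample_cov(Xf), N)        # coarse-level estimate
    for level in range(1, len(dofs)):          # level-wise corrections
        Xf, Xc = sample_pair(level, n_samples[level])
        correction = sample_cov(Xf) - embed(sample_cov(Xc), dofs[level])
        estimate += embed(correction, N)
    return estimate
```

In a typical multilevel setup, the number of samples n_samples[l] decreases as the number of degrees of freedom dofs[l] grows, which is what makes the overall cost lower than that of a single-level sample covariance estimator on the finest level.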

Citations (2)

Authors (1)
