Low-rank approximation in the Frobenius norm by column and row subset selection (1908.06059v1)

Published 16 Aug 2019 in math.NA, cs.DS, and cs.NA

Abstract: A CUR approximation of a matrix $A$ is a particular type of low-rank approximation $A \approx C U R$, where $C$ and $R$ consist of columns and rows of $A$, respectively. One way to obtain such an approximation is to apply column subset selection to $A$ and $A^T$. In this work, we describe a numerically robust and much faster variant of the column subset selection algorithm proposed by Deshpande and Rademacher, which guarantees an error close to the best approximation error in the Frobenius norm. For cross approximation, in which $U$ is required to be the inverse of a submatrix of $A$ described by the intersection of $C$ and $R$, we obtain a new algorithm with an error bound that stays within a factor $k + 1$ of the best rank-$k$ approximation error in the Frobenius norm. To the best of our knowledge, this is the first deterministic polynomial-time algorithm for which this factor is bounded by a polynomial in $k$. Our derivation and analysis of the algorithm are based on derandomizing a recent existence result by Zamarashkin and Osinsky. To illustrate the versatility of our new column subset selection algorithm, an extension to low multilinear rank approximations of tensors is provided as well.
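
To make the structure of a CUR/cross approximation concrete, here is a minimal sketch. It is not the paper's deterministic algorithm: column and row indices are chosen with a simple pivoted-QR heuristic as a stand-in for the column subset selection procedure analyzed in the paper, and the function name `cur_cross_approximation` is illustrative only. The paper's $(k+1)$ Frobenius-norm guarantee applies to its own index selection, not to this heuristic.

```python
# Sketch of a CUR / cross approximation A ~ C U R with C = A[:, J], R = A[I, :]
# and U the (pseudo)inverse of the intersection block A[I, J].
# Index selection here uses column-pivoted QR as a heuristic stand-in for the
# paper's deterministic column subset selection; it is NOT the paper's method.
import numpy as np
from scipy.linalg import qr


def cur_cross_approximation(A, k):
    """Return (C, U, R) built from k selected columns and rows of A."""
    # Choose k column indices of A via column-pivoted QR (heuristic).
    _, _, col_piv = qr(A, pivoting=True)
    J = col_piv[:k]
    # Choose k row indices by applying the same procedure to A^T.
    _, _, row_piv = qr(A.T, pivoting=True)
    I = row_piv[:k]
    C = A[:, J]                          # selected columns
    R = A[I, :]                          # selected rows
    U = np.linalg.pinv(A[np.ix_(I, J)])  # inverse of the intersection submatrix
    return C, U, R


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Low-rank-plus-noise test matrix.
    A = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 150))
    A += 1e-3 * rng.standard_normal(A.shape)
    C, U, R = cur_cross_approximation(A, k=10)
    err = np.linalg.norm(A - C @ U @ R, "fro") / np.linalg.norm(A, "fro")
    print(f"relative Frobenius error: {err:.2e}")
```

For a matrix that is close to rank $k$, the printed relative error should be small; how far it can exceed the best rank-$k$ error depends on the index selection, which is exactly what the paper's algorithm and its $k+1$ bound address.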

Citations (25)
