
Randomized Cholesky QR factorizations (2210.09953v2)

Published 18 Oct 2022 in math.NA and cs.NA

Abstract: This article proposes and analyzes several variants of the randomized Cholesky QR factorization of a matrix $X$. Instead of computing the R factor from $X^T X$, as is done by standard methods, we obtain it from a small, efficiently computable random sketch of $X$, thus saving computational cost and improving numerical stability. The proposed direct variant of the randomized Cholesky QR requires only half the flops and the same communication cost as the classical Cholesky QR. At the same time, it is more robust, since it is guaranteed to be stable whenever the input matrix is numerically full-rank. The rank-revealing randomized Cholesky QR variant can sort out the linearly dependent columns of $X$, which allows for unconditional numerical stability and a reduced computational cost when $X$ is rank-deficient. We also describe a column-oriented randomized Cholesky QR that establishes the connection with the randomized Gram-Schmidt process, and a reduced variant that outputs a low-dimensional projection of the Q factor rather than the full factor and therefore yields drastic computational savings. It is shown that performing minor operations in higher precision in the proposed algorithms can allow stability with a working unit roundoff independent of the dominant matrix dimension. This feature may be of particular interest for QR factorization of tall-and-skinny matrices on low-precision architectures.
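For intuition, here is a minimal sketch of the basic idea, not the paper's exact algorithms: the R factor is obtained from a QR factorization of a small random sketch of $X$ rather than from the Gram matrix $X^T X$. It assumes a Gaussian sketching matrix for simplicity (structured embeddings such as SRHT would be cheaper to apply), and the function and parameter names (`randomized_cholesky_qr`, `sketch_factor`) are hypothetical.

```python
# Minimal sketch of a randomized Cholesky-QR-style factorization.
# Assumptions (not from the paper): Gaussian sketching, sketch size k = 2n,
# and X numerically full-rank so that R is invertible.
import numpy as np

def randomized_cholesky_qr(X, sketch_factor=2, seed=None):
    """Return Q, R with X = Q @ R, where R is computed from a random
    sketch Theta @ X instead of the Gram matrix X.T @ X."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    k = sketch_factor * n                             # sketch dimension k = O(n) << m
    Theta = rng.standard_normal((k, m)) / np.sqrt(k)  # oblivious subspace embedding
    Y = Theta @ X                                     # small k-by-n sketch of X
    R = np.linalg.qr(Y, mode="r")                     # n-by-n triangular factor of the sketch
    Q = np.linalg.solve(R.T, X.T).T                   # Q = X @ inv(R), via a triangular system
    return Q, R

# Tall-and-skinny example: Q is well-conditioned ("sketch-orthonormal"),
# though not exactly orthonormal in the Euclidean sense.
X = np.random.default_rng(0).standard_normal((10_000, 50))
Q, R = randomized_cholesky_qr(X, seed=1)
print(np.linalg.norm(X - Q @ R) / np.linalg.norm(X))  # ~1e-16: exact by construction
print(np.linalg.cond(Q))                              # modest, e.g. well below 10
```

Here Q is orthonormal with respect to the sketched inner product rather than the Euclidean one; since it is well-conditioned, an exactly orthonormal factor can then be obtained with one additional, now stable, Cholesky QR pass.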

Citations (7)
