Low-Rank Approximation by Randomly Pivoted LU
Abstract: The low-rank approximation properties of Randomly Pivoted LU (RPLU), a variant of Gaussian elimination in which pivots are sampled with probability proportional to the squared entries of the Schur complement, are analyzed. It is shown that the RPLU iterates converge geometrically in expectation for matrices with rapidly decaying singular values. RPLU outperforms existing low-rank approximation algorithms in two settings. First, when memory is limited, RPLU can compute a rank-$k$ approximation with $\mathcal{O}(k^2 + m + n)$ storage and $\mathcal{O}(k(m + n) + k\mathcal{M}(\mat{A}) + k^3)$ operations, where $\mathcal{M}(\mat{A})$ is the cost of a matvec with $\mat{A}\in\mathbb{C}^{n\times m}$ or its adjoint. Second, RPLU is advantageous when the matrix and its Schur complements share exploitable structure, as in the case of Cauchy-like matrices. The efficacy of RPLU is illustrated with several examples, including applications in rational approximation and solving large linear systems on GPUs.
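To make the sampling rule concrete, the following is a minimal dense sketch of one possible RPLU loop. The function name `rplu` and its interface are illustrative, not the authors' implementation: it stores the full Schur complement, so it costs $\mathcal{O}(kmn)$ operations and $\mathcal{O}(mn)$ storage rather than the memory-efficient bounds quoted above.

```python
import numpy as np

def rplu(A, k, rng=None):
    """Dense reference sketch of randomly pivoted LU.

    At each step, a pivot (i, j) is sampled with probability
    proportional to |S[i, j]|^2, where S is the current Schur
    complement, followed by one rank-1 Gaussian elimination update.
    Returns index lists I, J and factors L, U with A ~= L @ U.
    """
    rng = np.random.default_rng(rng)
    # Promote to at least float so the elimination updates are exact-dtype safe;
    # complex inputs keep their dtype.
    S = np.array(A, dtype=np.result_type(A, 1.0), copy=True)
    m, n = S.shape
    L = np.zeros((m, k), dtype=S.dtype)
    U = np.zeros((k, n), dtype=S.dtype)
    I, J = [], []
    for t in range(k):
        w = np.abs(S) ** 2               # squared entries of the Schur complement
        total = w.sum()
        if total == 0.0:                 # A is reproduced exactly; stop early
            return I, J, L[:, :t], U[:t, :]
        flat = rng.choice(m * n, p=(w / total).ravel())
        i, j = divmod(flat, n)           # row-major flat index -> (row, col)
        I.append(i); J.append(j)
        L[:, t] = S[:, j] / S[i, j]      # pivot column, scaled by the pivot entry
        U[t, :] = S[i, :]                # pivot row
        S -= np.outer(L[:, t], U[t, :])  # one Gaussian elimination / Schur update
    return I, J, L, U
```

After `k` steps, `L @ U` is the rank-$k$ approximation and `S` holds the residual Schur complement; the memory-limited variant described in the abstract avoids forming `S` explicitly by accessing $\mat{A}$ only through matvecs.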