Preconditioning without a preconditioner: faster ridge-regression and Gaussian sampling with randomized block Krylov subspace methods
Abstract: We describe a randomized variant of the block conjugate gradient method for solving a single positive-definite linear system of equations. Our method provably outperforms preconditioned conjugate gradient with a broad class of Nyström-based preconditioners, without ever explicitly constructing a preconditioner. In analyzing our algorithm, we derive theoretical guarantees for new variants of Nyström preconditioned conjugate gradient which may be of independent interest. We also describe how our approach yields state-of-the-art algorithms for key data-science tasks such as computing the entire ridge-regression regularization path and generating multiple independent samples from a high-dimensional Gaussian distribution.
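As a rough illustration of the idea in the abstract (the paper's own algorithm and analysis are more refined), the sketch below pairs a textbook block conjugate gradient recursion (in the style of O'Leary, 1980) with the simplest way of injecting randomness: padding the right-hand side with Gaussian columns, so that the block Krylov subspace also captures the dominant eigenvector directions that a Nyström preconditioner would deflate. All names and parameters here (block_cg, randomized_block_cg, block_size) are illustrative assumptions, not the paper's notation.

    import numpy as np

    def block_cg(A, B, tol=1e-8, max_iter=500):
        # Block conjugate gradient for A X = B, with A symmetric
        # positive-definite and B an n-by-s block of right-hand sides.
        n, s = B.shape
        X = np.zeros((n, s))
        R = B - A @ X                          # block residual
        P = R.copy()                           # block of search directions
        b_norms = np.linalg.norm(B, axis=0)
        for _ in range(max_iter):
            Q = A @ P
            PtQ = P.T @ Q
            # Least-squares solves guard against (near-)rank-deficient blocks.
            alpha = np.linalg.lstsq(PtQ, P.T @ R, rcond=None)[0]
            X += P @ alpha
            R -= Q @ alpha
            if np.all(np.linalg.norm(R, axis=0) <= tol * b_norms):
                break
            beta = np.linalg.lstsq(PtQ, -Q.T @ R, rcond=None)[0]
            P = R + P @ beta                   # keep direction blocks A-conjugate
        return X

    def randomized_block_cg(A, b, block_size=10, seed=0, **kwargs):
        # Solve the single system A x = b by running block CG on [b | Omega],
        # where Omega is a random Gaussian block. The extra random columns
        # enlarge the block Krylov subspace so it quickly captures A's top
        # eigenvectors -- the directions a Nystrom preconditioner would target.
        rng = np.random.default_rng(seed)
        Omega = rng.standard_normal((b.shape[0], block_size - 1))
        B = np.column_stack([b, Omega])
        return block_cg(A, B, **kwargs)[:, 0]  # first column solves A x = b

For the ridge-regression application mentioned in the abstract, one would apply such a solver to systems of the form (X^T X + lambda * I) w = X^T y; since the block Krylov information is independent of the shift lambda, it can be reused across regularization parameters, which is what makes computing the full regularization path attractive in this framework.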