A Preconditioned Riemannian Gradient Descent Algorithm for Low-Rank Matrix Recovery (2305.02543v1)
Abstract: The low-rank matrix recovery problem often arises in various fields, including signal processing, machine learning, and imaging science. The Riemannian gradient descent (RGD) algorithm has proven to be an efficient algorithm for solving this problem. In this paper, we present a preconditioned Riemannian gradient descent (PRGD) algorithm for low-rank matrix recovery. The preconditioner, which is simple and computationally cheap, is constructed by weighting the (i,j)-th entry of the gradient matrix according to the norms of the i-th row and the j-th column. We establish a theoretical recovery guarantee for PRGD under the restricted isometry property assumption. Experimental results indicate that PRGD can accelerate RGD by up to tenfold on low-rank matrix recovery problems such as matrix completion.
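The abstract describes the preconditioner only at a high level: each entry (i,j) of the gradient is reweighted using the norm of the i-th row and the j-th column. A minimal sketch of one plausible such entrywise weighting is shown below; the exact weighting function (here, dividing by the product of the row and column norms, with a small `eps` for numerical safety) is an assumption for illustration, not the paper's precise construction.

```python
import numpy as np

def precondition_gradient(G, eps=1e-8):
    """Entrywise row/column-norm reweighting of a gradient matrix G.

    Hypothetical sketch: entry (i, j) is divided by the product of the
    i-th row norm and the j-th column norm of G. The actual weighting
    used by PRGD may differ; this only illustrates the general idea of
    an entrywise, norm-based preconditioner.
    """
    row_norms = np.linalg.norm(G, axis=1, keepdims=True)  # shape (m, 1)
    col_norms = np.linalg.norm(G, axis=0, keepdims=True)  # shape (1, n)
    # Broadcasting produces the (m, n) matrix of per-entry weights.
    return G / ((row_norms + eps) * (col_norms + eps))
```

Because the weights depend only on row and column norms, the preconditioner costs O(mn) to apply, which matches the abstract's emphasis on simplicity and computational efficiency.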