On the analysis of optimization with fixed-rank matrices: a quotient geometric view (2203.06765v2)

Published 13 Mar 2022 in math.OC, cs.NA, and math.NA

Abstract: We study a type of Riemannian gradient descent (RGD) algorithm, designed through Riemannian preconditioning, for optimization on $\mathcal{M}_k^{m\times n}$ -- the set of $m\times n$ real matrices with a fixed rank $k$. Our analysis is based on a quotient geometric view of $\mathcal{M}_k^{m\times n}$: by identifying this set with the quotient manifold of a two-term product space $\mathbb{R}_*^{m\times k}\times \mathbb{R}_*^{n\times k}$ of matrices with full column rank via matrix factorization, we find an explicit form for the update rule of the RGD algorithm, which leads to a novel approach to analysing its convergence behavior in rank-constrained optimization. We then deduce some interesting properties that reflect how RGD differs from other matrix factorization algorithms, such as those based on the Euclidean geometry. In particular, we show that the RGD algorithm is not only faster than Euclidean gradient descent but also does not rely on balancing techniques to ensure its efficiency, while the latter does. We further show that this RGD algorithm is guaranteed to solve matrix sensing and matrix completion problems with a linear convergence rate under the restricted positive definiteness property. Numerical experiments on matrix sensing and completion are provided to demonstrate these properties.
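For intuition, the explicit factored update induced by this kind of preconditioned quotient metric has a simple form: each Euclidean partial gradient is scaled on the right by the inverse $(k\times k)$ Gram matrix of the other factor (an update sometimes called scaled gradient descent). The sketch below is an illustration of that update on a toy fully observed least-squares objective, not the authors' code; the function names, step size, and test problem are all assumptions.

```python
import numpy as np

def rgd_fixed_rank(grad_f, G, H, step=0.5, n_iters=200):
    """Sketch of the factored RGD update under Riemannian preconditioning.

    X = G @ H.T with G (m x k) and H (n x k) of full column rank. Each
    partial gradient is scaled by the inverse Gram matrix of the other
    factor, which makes the step insensitive to how the two factors are
    balanced across the factorization ambiguity.
    """
    for _ in range(n_iters):
        gG, gH = grad_f(G, H)  # Euclidean partial gradients of f
        # Simultaneous update; each step is preconditioned by the
        # (k x k) Gram matrix of the opposite factor.
        G_new = G - step * gG @ np.linalg.inv(H.T @ H)
        H_new = H - step * gH @ np.linalg.inv(G.T @ G)
        G, H = G_new, H_new
    return G, H

# Toy instance (assumed for illustration): fully observed least squares,
# f(G, H) = 0.5 * ||G H^T - M||_F^2 with a rank-k target M.
rng = np.random.default_rng(0)
m, n, k = 60, 50, 4
M = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))

def grad_f(G, H):
    R = G @ H.T - M            # residual
    return R @ H, R.T @ G      # (df/dG, df/dH)

# Deliberately unbalanced initialization; the preconditioned steps cope
# without any explicit rebalancing of G and H.
G0 = 100.0 * rng.standard_normal((m, k))
H0 = 0.01 * rng.standard_normal((n, k))
G, H = rgd_fixed_rank(grad_f, G0, H0)
print(np.linalg.norm(G @ H.T - M) / np.linalg.norm(M))
```

Note the contrast the abstract draws: plain Euclidean gradient descent on $(G, H)$ is sensitive to the relative scaling of the factors and typically needs a balancing regularizer, whereas the Gram-matrix scaling above absorbs that ambiguity by construction.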
