Riemannian Inexact Gradient Descent for Quadratic Discrimination

Published 7 Jul 2025 in math.OC (arXiv:2507.04670v1)

Abstract: We propose an inexact optimization algorithm on Riemannian manifolds, motivated by quadratic discrimination tasks in high-dimensional, low-sample-size (HDLSS) imaging settings. In such applications, gradient evaluations are often biased due to limited sample sizes. To address this, we introduce a novel Riemannian optimization algorithm that is robust to inexact gradient information and prove an $\mathcal O(1/K)$ convergence rate under standard assumptions. We also present a line search variant that requires access to function values but not exact gradients, maintaining the same convergence rate and ensuring sufficient descent. The algorithm is tailored to the Grassmann manifold by leveraging its geometric structure, and its convergence rate is validated numerically. A simulation of heteroscedastic images shows that when bias is introduced into the problem, both intentionally and through estimation of the covariance matrix, the detection performance of the algorithm's solution is comparable to that obtained when true gradients are used in the optimization. The optimal subspace learned via the algorithm encodes interpretable patterns and shows qualitative similarity to known optimal solutions. By ensuring robust convergence and interpretability, our algorithm offers a compelling tool for manifold-based dimensionality reduction and discrimination in high-dimensional image data settings.
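The core loop described in the abstract, gradient descent on the Grassmann manifold with noisy (inexact) gradient evaluations, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the objective (a Rayleigh-quotient stand-in for the quadratic discrimination criterion), the QR retraction, the step size, and the additive noise model are all assumptions made for the example.

```python
import numpy as np

def grassmann_inexact_gd(A, p, step=0.05, iters=1000, noise=0.0, seed=0):
    """Sketch of inexact Riemannian gradient descent on the Grassmann
    manifold Gr(n, p). Hypothetical objective: maximize tr(X^T A X),
    i.e. recover the dominant p-dimensional eigenspace of symmetric A.
    `noise` perturbs each gradient to mimic biased/inexact evaluations."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Random orthonormal starting point: a representative of a point on Gr(n, p).
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))
    for _ in range(iters):
        G = -2.0 * A @ X                                  # Euclidean gradient of -tr(X^T A X)
        G = G + noise * rng.standard_normal(G.shape)      # inexact gradient evaluation
        rgrad = G - X @ (X.T @ G)                         # project onto the tangent space at X
        X, _ = np.linalg.qr(X - step * rgrad)             # QR retraction back onto the manifold
    return X
```

With `noise=0.0` this reduces to exact Riemannian gradient descent; a small positive `noise` imitates the biased-gradient regime the paper targets, and the recovered subspace (which is what Grassmann points represent, independent of the particular orthonormal basis returned) typically remains close to the exact-gradient solution.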
