
Restricted Low-Rank Approximation via ADMM (1512.01748v1)

Published 6 Dec 2015 in cs.NA and cs.DS

Abstract: The matrix low-rank approximation problem with additional convex constraints arises in many applications and has been studied extensively. However, the problem is nonconvex and NP-hard, and most existing solutions are heuristic and application-dependent. In this paper, we show that, beyond its many existing applications, this problem can be used to recover a feasible solution from an SDP relaxation. Through a suitable reformulation, it can be posed in a form amenable to the Alternating Direction Method of Multipliers (ADMM). The two ADMM updates are a basic matrix low-rank approximation and a projection onto a convex set. Unlike general nonconvex problems, the sub-problems at each ADMM step can be solved exactly and efficiently despite their nonconvexity. Moreover, the algorithm converges exponentially under suitable conditions. Simulation results confirm its superiority over existing solutions. We believe these results provide a useful tool for this important problem and will help extend ADMM to the nonconvex regime.
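The splitting described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the constraint set C is taken here to be the nonnegative orthant, and the penalty parameter `rho`, iteration count, and helper names are my own choices. The X-update is a rank-constrained least-squares problem solved exactly by a truncated SVD (Eckart-Young), and the Z-update is a Euclidean projection onto C, matching the two sub-problems the abstract describes.

```python
# Sketch of ADMM for restricted low-rank approximation:
#   minimize ||A - X||_F^2  s.t.  rank(X) <= r  and  X in C.
# Assumed for illustration: C = nonnegative orthant; rho and n_iter are
# arbitrary choices, not values from the paper.
import numpy as np

def truncated_svd(M, r):
    """Best rank-r approximation of M in Frobenius norm (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def restricted_lra_admm(A, r, rho=1.0, n_iter=200):
    X = np.zeros_like(A)
    Z = np.zeros_like(A)
    U = np.zeros_like(A)            # scaled dual variable
    for _ in range(n_iter):
        # X-update: minimize ||A - X||^2 + (rho/2)||X - (Z - U)||^2 over
        # rank(X) <= r; the unconstrained minimizer is a weighted average
        # of A and (Z - U), and the rank constraint is enforced exactly
        # by a truncated SVD.
        M = (2 * A + rho * (Z - U)) / (2 + rho)
        X = truncated_svd(M, r)
        # Z-update: Euclidean projection of X + U onto the convex set C
        # (entrywise nonnegativity in this sketch).
        Z = np.maximum(X + U, 0.0)
        # Dual update on the consensus constraint X = Z.
        U = U + X - Z
    return X, Z

rng = np.random.default_rng(0)
A = np.abs(rng.standard_normal((8, 6)))
X, Z = restricted_lra_admm(A, r=2)
```

On exit, `X` satisfies the rank constraint exactly and `Z` lies in C; when ADMM converges, the two coincide and the common limit is a feasible restricted low-rank approximation of `A`.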

Citations (9)
