Robust Low-Rank Matrix Completion via a New Sparsity-Inducing Regularizer (2310.04762v1)

Published 7 Oct 2023 in eess.IV, cs.LG, and eess.SP

Abstract: This paper presents a novel loss function, referred to as hybrid ordinary-Welsch (HOW), and a new sparsity-inducing regularizer associated with HOW. We theoretically show that the regularizer is quasiconvex and that the corresponding Moreau envelope is convex. Moreover, the closed-form solution to its Moreau envelope, namely the proximity operator, is derived. Unlike nonconvex regularizers such as the ℓp-norm with 0 < p < 1, whose proximity operators must be computed iteratively, the developed regularizer admits a closed-form proximity operator. We apply our regularizer to the robust matrix completion problem and develop an efficient algorithm based on the alternating direction method of multipliers (ADMM). The convergence of the proposed method is analyzed, and we prove that any accumulation point of the generated sequence is a stationary point. Finally, experimental results on synthetic and real-world datasets demonstrate that our algorithm outperforms state-of-the-art methods in terms of restoration performance.
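
The abstract describes an ADMM-based scheme for robust matrix completion in which a low-rank term models the underlying matrix and a sparsity-inducing regularizer with a closed-form proximity operator absorbs outliers. The sketch below is only a minimal illustration of that overall structure, not the paper's algorithm: since the closed-form HOW proximity operator is not given in the abstract, it substitutes soft-thresholding (the ℓ1 proximity operator) for the outlier term and singular value thresholding (the nuclear-norm proximity operator) for the low-rank term. All function names and parameters (robust_matrix_completion, lam, rho, n_iter) are illustrative assumptions.

```python
import numpy as np


def soft_threshold(X, tau):
    """Entrywise proximity operator of the l1-norm (soft-thresholding).
    Stand-in for the paper's closed-form HOW proximity operator (assumption)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)


def svt(X, tau):
    """Singular value thresholding: proximity operator of the nuclear norm,
    used here as a generic low-rank surrogate (assumption)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt


def robust_matrix_completion(M, mask, lam=0.05, rho=1.0, n_iter=300):
    """Generic ADMM-style solver for
        min_{X,S} ||X||_* + lam * ||S||_1   s.t.  P_Omega(X + S) = P_Omega(M),
    where P_Omega keeps the observed entries indicated by `mask`.
    X captures the low-rank component, S the sparse outliers."""
    X = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)  # Lagrange multiplier on the observed entries
    for _ in range(n_iter):
        # Low-rank update: fill unobserved entries with the current X,
        # then apply singular value thresholding to the residual.
        D = mask * (M - S + Y / rho) + (1 - mask) * X
        X = svt(D, 1.0 / rho)
        # Outlier update on observed entries via the closed-form prox.
        S = mask * soft_threshold(M - X + Y / rho, lam / rho)
        # Dual ascent on the observed-entry constraint.
        Y = Y + rho * mask * (M - X - S)
    return X, S


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic rank-3 matrix, 50% observed entries, 5% sparse outliers.
    A = rng.standard_normal((80, 3)) @ rng.standard_normal((3, 60))
    mask = (rng.random(A.shape) < 0.5).astype(float)
    outliers = (rng.random(A.shape) < 0.05) * 10 * rng.standard_normal(A.shape)
    M = mask * (A + outliers)
    X, S = robust_matrix_completion(M, mask)
    print(f"relative recovery error: {np.linalg.norm(X - A) / np.linalg.norm(A):.3f}")
```

In the paper's method, the soft-thresholding step would be replaced by the derived HOW proximity operator, which is the point of the closed-form result: the outlier update stays a single entrywise evaluation rather than an inner iterative solve as with the ℓp-norm.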
