
$\ell_{1}^{2}-\eta\ell_{2}^{2}$ regularization for sparse recovery

Published 13 Jun 2025 in math.OC (arXiv:2506.11372v1)

Abstract: This paper presents a regularization technique incorporating a non-convex and non-smooth term, $\ell_{1}^{2}-\eta\ell_{2}^{2}$ with parameter $0<\eta\leq 1$, designed to address ill-posed linear problems that admit sparse solutions. We explore the existence, stability, and convergence of the regularized solution, demonstrating that the $\ell_{1}^{2}-\eta\ell_{2}^{2}$ regularization is well-posed and yields sparse solutions. Under suitable source conditions, we establish a convergence rate of $\mathcal{O}\left(\delta\right)$ in the $\ell_{2}$-norm for both a priori and a posteriori parameter choice rules. Additionally, we propose and analyze a numerical algorithm based on a half-variation iterative strategy combined with the proximal gradient method, and we prove its convergence despite the regularization term being non-smooth and non-convex. The algorithm has a straightforward structure, facilitating implementation. Furthermore, we propose a projected gradient iterative strategy based on a surrogate-function approach to accelerate the solution process. Experimentally, we demonstrate visible improvements of $\ell_{1}^{2}-\eta\ell_{2}^{2}$ over $\ell_{1}$, $\ell_{1}-\eta\ell_{2}$, and other non-convex regularizations for compressive sensing and image deblurring problems. All the numerical results show the efficiency of our proposed approach.
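The abstract describes a proximal gradient iteration for a regularized least-squares problem $\min_x \tfrac{1}{2}\|Ax-b\|_2^2 + \alpha R(x)$. Below is a minimal sketch of such an iteration in Python. Note the assumptions: the paper's actual proximal mapping for the $\ell_{1}^{2}-\eta\ell_{2}^{2}$ term is not given here, so the sketch substitutes the standard $\ell_1$ prox (soft-thresholding) as a placeholder; the function names and step-size choice are illustrative, not the authors' implementation.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: the proximal operator of t * ||.||_1.
    # Placeholder for the prox of the paper's l1^2 - eta*l2^2 term.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, alpha, n_iter=500):
    """Generic proximal gradient (ISTA-style) sketch for
    min_x 0.5*||A x - b||^2 + alpha * R(x)."""
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant
    # of the gradient of the smooth data-fidelity term.
    L = np.linalg.norm(A, 2) ** 2
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)          # gradient of 0.5*||Ax - b||^2
        x = soft_threshold(x - step * grad, step * alpha)  # prox step
    return x
```

In the paper's setting, replacing the soft-thresholding step with the proximal mapping of $\ell_{1}^{2}-\eta\ell_{2}^{2}$ (computed within the half-variation strategy) would recover the scheme the authors analyze.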

Authors (2)
