An efficient proximal algorithm for squared L1 over L2 regularized sparse recovery
Abstract: In this paper, we consider a squared $L_1/L_2$ regularized model for sparse signal recovery from noisy measurements. We first establish the existence of optimal solutions to the model under mild conditions. Next, we propose a proximal method for solving a general fractional optimization problem that includes the squared $L_1/L_2$ regularized model as a special case. We prove that any accumulation point of the solution sequence generated by the proposed method is a critical point of the fractional optimization problem. Under additional Kurdyka–Łojasiewicz (KL) assumptions on a suitable potential function, we establish the sequential convergence of the proposed method. When this method is specialized to the squared $L_1/L_2$ regularized model, the proximal operator involved in each iteration admits a simple closed-form solution that can be computed at very low computational cost. Furthermore, for each of three concrete models, the solution sequence generated by this specialized algorithm converges to a critical point. Numerical experiments demonstrate the superiority of the proposed algorithm for sparse recovery based on squared $L_1/L_2$ regularization.
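To make the regularized model concrete, the following minimal sketch evaluates an objective of the form $F(x) = \tfrac{1}{2}\|Ax-b\|_2^2 + \lambda\,(\|x\|_1/\|x\|_2)^2$, i.e., a least-squares data-fidelity term plus the squared $L_1/L_2$ penalty. This is an illustrative assumption about the data-fidelity term, not a reproduction of the paper's algorithm or its closed-form proximal operator; all names are hypothetical.

```python
import numpy as np


def l1_over_l2_squared_objective(A, b, x, lam):
    """Illustrative objective: least-squares fit plus squared L1/L2 penalty.

    F(x) = 0.5 * ||A x - b||_2^2 + lam * (||x||_1 / ||x||_2)^2

    The L1/L2 ratio is invariant to scaling of x, so squaring it penalizes
    the "spread" of the entries (promoting sparsity) without shrinking the
    overall magnitude of the signal, unlike a plain L1 penalty.
    """
    x = np.asarray(x, dtype=float)
    if not np.any(x):
        raise ValueError("x must be nonzero: the L1/L2 ratio is undefined at 0")
    data_fit = 0.5 * np.sum((A @ x - b) ** 2)
    ratio = np.linalg.norm(x, 1) / np.linalg.norm(x, 2)
    return data_fit + lam * ratio**2
```

For example, with $A = I$, $b = Ax$ (zero residual) and $x = (3, 4)$, the penalty equals $(7/5)^2 = 1.96$, which is the whole objective when $\lambda = 1$.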