Half-Quadratic Regularization
- Half-quadratic regularization is an optimization technique that reformulates non-convex regularization problems using implicit concave functions and auxiliary variables.
- It transforms complex signal and image reconstruction tasks into convex or quadratic subproblems, enabling efficient block coordinate descent algorithms.
- This approach preserves important features such as edges and sparsity, making it highly effective for applications like denoising and deblurring.
Half-quadratic regularization is an optimization technique that alleviates the difficulties posed by non-convex or non-smooth regularizers, especially in signal and image reconstruction. The method achieves this by reformulating the original objective into an augmented problem through auxiliary variable introduction and functional transformations, often exploiting concave structures. This results in subproblems that are convex or quadratic with respect to one block of variables, making the overall problem amenable to efficient block coordinate descent or alternating minimization algorithms.
1. Mathematical Foundations of Half-Quadratic Regularization
A half-quadratic regularization problem often arises when reconstructing a signal $x \in \mathbb{R}^n$ from measurements $y \approx Ax$, modeled as
$$\min_{x}\; J(x) = \tfrac{1}{2}\|Ax - y\|_2^2 + \lambda \sum_{i} \phi\big([Gx]_i\big),$$
where the regularizer $\phi$ is typically non-smooth and/or non-convex, designed to preserve edges or sparsity.
A central construct is the use of implicit concave functions, defined as compositions $\phi = h \circ g$, where $h$ is strictly concave and differentiable, and $g$ is a continuously differentiable mapping, commonly in edge-preserving regularization scenarios (Latorre, 7 Oct 2025). Many popular regularizers in image processing can be written in this form.
Applying the (concave) Fenchel conjugate of $h$, one has
$$h(s) = \min_{b}\,\big\{\, b\,s - h_*(b) \,\big\}$$
for $s = g(\cdot)$ and auxiliary variable $b$, suggesting the equivalent augmented function
$$\mathcal{J}(x, b) = \tfrac{1}{2}\|Ax - y\|_2^2 + \lambda \sum_{i} \big(\, b_i\, g([Gx]_i) - h_*(b_i) \,\big).$$
The minimization of $J$ over $x$ is thus transformed into a minimization over both $x$ and $b$ of $\mathcal{J}$.
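As a concrete numerical sketch (not from the paper): for $h(s) = \sqrt{s}$ and $g(t) = t^2$, the concave conjugate is $h_*(b) = -1/(4b)$ for $b > 0$, which yields the classical half-quadratic identity $|t| = \min_{b>0}\{\, b\,t^2 + 1/(4b) \,\}$. This can be checked directly:

```python
import numpy as np

def h(s):
    # strictly concave potential h(s) = sqrt(s); with g(t) = t^2, phi(t) = |t|
    return np.sqrt(s)

def augmented(b, s):
    # b*s - h_*(b), with concave conjugate h_*(b) = -1/(4b) for b > 0
    return b * s + 1.0 / (4.0 * b)

# verify h(s) = min_b { b*s - h_*(b) } numerically over a dense grid of b
b_grid = np.linspace(1e-3, 10.0, 200_000)
for s in (0.25, 1.0, 4.0):
    numeric_min = augmented(b_grid, s).min()
    assert abs(numeric_min - h(s)) < 1e-4, (s, numeric_min, h(s))
print("min_b { b*s + 1/(4b) } matches sqrt(s) on the test grid")
```

The minimizer is attained at $b = 1/(2\sqrt{s})$, which is exactly the closed-form auxiliary update used later in the alternating scheme.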
2. Augmented Problem Structure and Variable Splitting
The introduction of auxiliary variables leads to an augmented optimization problem with at least biconvex structure:
$$\min_{x,\,b}\; \mathcal{J}(x, b),$$
which is convex in $x$ when $b$ is fixed (assuming $g$ is quadratic), and convex in $b$ when $x$ is fixed (due to the concavity of $h$).
In edge-preserving image regularization, such as denoising or deblurring, the augmented half-quadratic regularization can be explicitly written as
$$\mathcal{J}(x, b) = \tfrac{1}{2}\|Ax - y\|_2^2 + \lambda \sum_{i} \sum_{k \in \{h,v\}} \Big(\, b_{k,i}\, [G_k x]_i^2 - h_*(b_{k,i}) \,\Big),$$
where $A$ is the system matrix, $y$ the observed data, $G_h, G_v$ the spatial gradient operators, and $b$ the auxiliary variables per pixel or edge (Latorre, 7 Oct 2025).
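A minimal 1-D analogue of this augmented objective can make the structure tangible. The sketch below (all names illustrative, not from the paper) uses $A = I$, forward differences for $G$, and the potential $|t| = \min_{b>0}\{\, b\,t^2 + 1/(4b) \,\}$; note that for fixed $b$ the objective is an unconstrained quadratic in $x$, solvable exactly:

```python
import numpy as np

# Hypothetical 1-D denoising instance: A = I, G = forward differences,
# potential phi(t) = |t| via the identity |t| = min_{b>0} { b t^2 + 1/(4b) }.
n = 64
rng = np.random.default_rng(0)
x_true = np.repeat([0.0, 1.0, -0.5, 2.0], n // 4)  # piecewise-constant signal
y = x_true + 0.1 * rng.standard_normal(n)          # noisy observations
G = np.diff(np.eye(n), axis=0)                     # (n-1) x n gradient operator
lam = 0.15

def J_aug(x, b):
    """Augmented objective: quadratic in x for fixed b, convex in b > 0 for fixed x."""
    data = 0.5 * np.sum((x - y) ** 2)
    reg = lam * np.sum(b * (G @ x) ** 2 + 1.0 / (4.0 * b))
    return data + reg

# x-subproblem for fixed b: solve the normal equations (I + 2*lam*G^T diag(b) G) x = y
b = np.full(n - 1, 1.0)
H = np.eye(n) + 2.0 * lam * G.T @ (b[:, None] * G)
x_star = np.linalg.solve(H, y)
assert J_aug(x_star, b) <= J_aug(y, b)             # exact minimizer over x
```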
3. Optimization Algorithms and Block Coordinate Descent
Owing to its biconvexity, the augmented half-quadratic regularization problem is well suited to block coordinate descent algorithms:
- $x$-subproblem: Minimize $\mathcal{J}(x, b)$ with respect to $x$ for fixed $b$. This is quadratic, resulting in efficient solutions via linear solvers or conjugate gradient methods.
- $b$-subproblem: Minimize $\mathcal{J}(x, b)$ over $b$ for fixed $x$. This is convex due to properties of the Fenchel conjugate.
Each subproblem can typically be solved efficiently, enabling iterative schemes that alternate between the two blocks. Under suitable conditions, global convergence can be established, and stationary points of the augmented problem correspond one-to-one to those of the original (Latorre, 7 Oct 2025).
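The alternating scheme can be sketched as follows (an illustrative Python implementation for 1-D denoising with the potential $|t| = \min_{b>0}\{\, b\,t^2 + 1/(4b) \,\}$; the small `eps` guarding the $b$-update is an implementation choice, not from the paper):

```python
import numpy as np

# Illustrative block coordinate descent for 1-D denoising with phi(t) = |t|.
n = 64
rng = np.random.default_rng(0)
x_true = np.repeat([0.0, 1.0, -0.5, 2.0], n // 4)  # piecewise-constant signal
y = x_true + 0.1 * rng.standard_normal(n)          # noisy observations
G = np.diff(np.eye(n), axis=0)                     # forward-difference operator
lam, eps = 0.15, 1e-4

x = y.copy()
for _ in range(50):
    # b-subproblem (closed form): argmin_b { b t^2 + 1/(4b) } = 1 / (2|t|)
    b = 1.0 / (2.0 * np.maximum(np.abs(G @ x), eps))
    # x-subproblem (quadratic): (I + 2*lam*G^T diag(b) G) x = y
    x = np.linalg.solve(np.eye(n) + 2.0 * lam * G.T @ (b[:, None] * G), y)

# the iterate should be closer to the clean signal than the raw data
assert np.linalg.norm(x - x_true) < np.linalg.norm(y - x_true)
```

Each iteration costs one closed-form update plus one linear solve; for large images the solve would typically be replaced by a few conjugate gradient steps.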
4. Applications in Signal and Image Reconstruction
Half-quadratic regularization is extensively used in signal and image processing tasks where edge-preserving regularizers (such as Geman–Reynolds, Huber, or $\ell_p$-norms) are preferred. Notably, many of these can be written as functions $\phi = h \circ g$, where $h$ is strictly concave.
The augmented form enables:
- Efficient edge preservation and noise reduction in images
- Solving otherwise non-convex, non-smooth regularization problems via quadratic subproblems
- Adaptation to a variety of regularizers, since Table 1 in (Latorre, 7 Oct 2025) explicitly lists most edge-preserving potentials as implicitly concave.
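As a quick numerical check of the implicit-concavity claim for one such potential (a sketch, not from the paper): the Huber function can be written as $\phi(t) = h(t^2)$ with $h(s) = \mathrm{Huber}(\sqrt{s})$, and this $h$ is concave (only weakly so on its quadratic part):

```python
import numpy as np

def huber(t, delta=1.0):
    # standard Huber potential: quadratic near 0, linear in the tails
    a = np.abs(t)
    return np.where(a <= delta, 0.5 * t ** 2, delta * a - 0.5 * delta ** 2)

# phi(t) = h(g(t)) with g(t) = t^2 and h(s) = huber(sqrt(s));
# concavity of h in s is what enables the half-quadratic rewriting
s = np.linspace(1e-6, 10.0, 10_000)
h_vals = huber(np.sqrt(s))
assert np.all(np.diff(h_vals, 2) <= 1e-10)  # second differences <= 0: concave
```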
5. Theoretical Equivalence and Practical Implications
A proven result is the equivalence of stationary points and (under second-order conditions) local minima between the original objective and the augmented half-quadratic formulation. In particular, for $\mathcal{J}(x, b)$, if $\bar{x}$ is a stationary point of $J$, then the pair $(\bar{x}, \bar{b})$, where $\bar{b}_i = h'\big(g([G\bar{x}]_i)\big)$, is a stationary point of $\mathcal{J}$, and vice versa (Latorre, 7 Oct 2025).
This structural result ensures:
- No spurious solutions are introduced by auxiliary variable splitting
- The block coordinate (or nonlinear Gauss–Seidel) scheme converges to meaningful solutions of the original problem.
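This correspondence can be observed numerically (an illustrative sketch, not the paper's experiment): for the smoothed potential $\phi(t) = \sqrt{t^2 + \epsilon}$, i.e. $h(s) = \sqrt{s + \epsilon}$ and $g(t) = t^2$, the auxiliary update is exactly $b_i = h'\big(g([Gx]_i)\big)$, and at the fixed point of the alternating scheme the gradient of the original objective vanishes:

```python
import numpy as np

# Stationary-point check for phi(t) = sqrt(t^2 + eps), 1-D denoising, A = I.
n = 64
rng = np.random.default_rng(0)
y = rng.standard_normal(n)
G = np.diff(np.eye(n), axis=0)
lam, eps = 0.15, 1e-3

x = y.copy()
for _ in range(1000):
    b = 1.0 / (2.0 * np.sqrt((G @ x) ** 2 + eps))   # b_i = h'(g([Gx]_i))
    x = np.linalg.solve(np.eye(n) + 2.0 * lam * G.T @ (b[:, None] * G), y)

# gradient of the ORIGINAL objective J(x) = 0.5*||x - y||^2 + lam * sum_i phi([Gx]_i)
grad_J = (x - y) + lam * G.T @ ((G @ x) / np.sqrt((G @ x) ** 2 + eps))
print("||grad J|| at the HQ fixed point:", np.linalg.norm(grad_J))
```

At convergence the printed gradient norm should be near zero: no spurious stationary point has been introduced by the splitting.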
6. Impact on Non-Convex Optimization and Extensions
The half-quadratic approach via implicit concave functions provides both a theoretical and practical framework for a wide class of non-convex signal recovery and image processing problems. Notable features include:
- Biconvexity of the augmented problem, often yielding objectives that are globally bounded from below
- Efficient numerical algorithms, as each subproblem is convex or quadratic
- Generalization to other structured non-convex problems through a suitable choice of the mapping $g$ and potential $h$
A plausible implication is that such reformulations may facilitate scalable solvers in modern large-scale imaging and machine learning tasks where structured, non-convex penalties become unavoidable.
7. Representative Edge-Preserving Regularizers and Augmented Formulations
Many standard edge-preserving regularizers can be formulated as implicit concave functions:

| Regularizer | Form | Associated Fenchel Conjugate |
|---|---|---|
| Huber, Geman–Reynolds, $\ell_p$, etc. | $\phi = h \circ g$ with $h$ strictly concave | See Table 1 in (Latorre, 7 Oct 2025) |
These formulations validate that their augmented problems are biconvex and bounded from below.
In summary, half-quadratic regularization—grounded in implicit concave functions and Fenchel conjugate theory—transforms challenging non-convex signal reconstruction problems into augmented forms that admit efficient, theoretically sound block coordinate descent algorithms. This framework has demonstrable utility for a broad class of edge-preserving regularization tasks and is foundational for ongoing developments in structured non-convex optimization (Latorre, 7 Oct 2025).