
Half-Thresholding Rule in Sparse Recovery

Updated 23 February 2026
  • Half-thresholding is a nonlinear, closed-form operator that balances bias and stability, serving as a key tool in ℓ1/2 regularization for sparse recovery.
  • It underpins iterative schemes like adaptively iterative thresholding and proximal gradient methods to efficiently detect support and converge in underdetermined systems.
  • Compared to hard and soft thresholding, it offers an intermediate trade-off by aggressively promoting sparsity while maintaining robustness in high-dimensional applications.

The half-thresholding rule is a nonlinear scalar mapping and associated iterative thresholding scheme, fundamental to sparse regularization via nonconvex $\ell_{1/2}$-type penalties. It admits a closed-form, discontinuous thresholding operator and is central both to adaptively iterative thresholding in underdetermined systems and to fixed-parameter $\ell_{1/2}$-regularized optimization. The half-thresholding operator is distinguished by its discontinuity at the threshold and its strong sparsity-promoting properties, offering a balance between the bias of soft-thresholding and the instability of hard-thresholding. Rigorous convergence guarantees, complexity estimates, and comparative analyses support its advantages in high-dimensional sparse recovery applications (Zeng et al., 2013, Zeng et al., 2013, Zhang et al., 2014).

1. The Half-Thresholding Operator: Definition and Construction

The half-thresholding operator, denoted here as $h_{\tau,1/2} : \mathbb{R} \to \mathbb{R}$, is defined for a scalar $u$ and threshold $\tau > 0$ by

$$h_{\tau,1/2}(u) = \begin{cases} \dfrac{2}{3}\,u\left(1 + \cos\left(\dfrac{2\pi}{3} - \dfrac{2}{3}\arccos\left(\dfrac{\sqrt{2}}{2}\left(\dfrac{\tau}{|u|}\right)^{3/2}\right)\right)\right), & |u| > \tau, \\ 0, & |u| \le \tau, \end{cases}$$

as developed for adaptively iterative thresholding (AIT) (Zeng et al., 2013) and, equivalently, for proximal solutions to an $\ell_{1/2}$-regularized quadratic subproblem (Zeng et al., 2013, Zhang et al., 2014). Applied to vectors $v \in \mathbb{R}^N$, the operator acts componentwise.

Key properties:

  • $h_{\tau,1/2}$ is odd, strictly nondecreasing on $[0, \infty)$, and admits explicit lower and upper bounds: $u - \frac{\tau}{3} \le h_{\tau,1/2}(u) \le u$ for all $u \ge \tau$.
  • The map is discontinuous at the threshold $|u| = \tau$, where the operator jumps from $0$ to a strictly positive value, similar to hard-thresholding but unlike the continuous soft-thresholding transition (Zhang et al., 2014).
  • The closed form arises from solving a depressed cubic in the scalar subproblem $\min_z \frac{1}{2}(z-u)^2 + \tau |z|^{1/2}$ using Cardano's formula.
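To sketch that reduction (assuming $u > \tau$ and a positive stationary point $z > 0$), the first-order condition of the scalar subproblem, multiplied by $z^{1/2}$ and rewritten with the substitution $w = z^{1/2}$, gives

$$z - u + \frac{\tau}{2}\, z^{-1/2} = 0 \quad\Longrightarrow\quad w^3 - u\, w + \frac{\tau}{2} = 0,$$

a depressed cubic in $w$; its relevant real root, expressed in the trigonometric (Viète) form of Cardano's method, yields the cosine expression in the operator's definition.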

An alternative parameterization appears in the fixed-step iterative thresholding setting for $\ell_{1/2}$ regularization, where $\tau = \lambda\mu$, with $\lambda$ the regularization parameter and $\mu$ the step size (Zeng et al., 2013).
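A minimal NumPy sketch of the componentwise operator (the function name `half_threshold` is illustrative, not from the cited papers):

```python
import numpy as np

def half_threshold(v, tau):
    """Componentwise half-thresholding h_{tau,1/2}: zero at or below the
    threshold, the closed-form trigonometric expression above it."""
    v = np.asarray(v, dtype=float)
    out = np.zeros_like(v)
    mask = np.abs(v) > tau
    u = v[mask]
    # Angle from the arccos term of the closed-form definition.
    phi = np.arccos((np.sqrt(2.0) / 2.0) * (tau / np.abs(u)) ** 1.5)
    out[mask] = (2.0 / 3.0) * u * (1.0 + np.cos(2.0 * np.pi / 3.0 - (2.0 / 3.0) * phi))
    return out
```

Just above the threshold the output jumps to roughly $\frac{2}{3}u$, while for $|u| \gg \tau$ it approaches $u$, consistent with the bounds $u - \tau/3 \le h_{\tau,1/2}(u) \le u$.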

2. Application in Iterative Schemes: AIT and Proximal Algorithms

The half-thresholding rule underpins two principal classes of algorithms:

(A) Adaptively Iterative Thresholding (AIT): For an underdetermined linear system $y = Ax$, where $A \in \mathbb{R}^{M \times N}$, AIT with half-thresholding seeks a $k$-sparse solution by:

  1. Initializing $x^{(0)} = 0$.
  2. Iterating: (a) compute $z^{(t+1)} = x^{(t)} + A^T(y - A x^{(t)})$ (Landweber step); (b) set $\tau^{(t+1)}$ to the $(k+1)$-st largest entry of $|z^{(t+1)}|$; (c) update $x^{(t+1)} = H^{(1/2)}_{\tau^{(t+1)}}(z^{(t+1)})$; (d) stop when a chosen criterion is met (Zeng et al., 2013).
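The steps above can be sketched in NumPy as follows (function names and the simple stagnation-based stopping rule are illustrative; `half_threshold` implements the closed-form operator of Section 1):

```python
import numpy as np

def half_threshold(v, tau):
    """Closed-form componentwise half-thresholding operator (Section 1)."""
    out = np.zeros_like(v)
    mask = np.abs(v) > tau
    u = v[mask]
    phi = np.arccos((np.sqrt(2.0) / 2.0) * (tau / np.abs(u)) ** 1.5)
    out[mask] = (2.0 / 3.0) * u * (1.0 + np.cos(2.0 * np.pi / 3.0 - (2.0 / 3.0) * phi))
    return out

def ait_half(A, y, k, iters=200, tol=1e-10):
    """AIT: Landweber step, adaptive threshold, half-thresholding."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x + A.T @ (y - A @ x)            # (a) Landweber step
        tau = np.sort(np.abs(z))[-(k + 1)]   # (b) (k+1)-st largest magnitude
        x_new = half_threshold(z, tau)       # (c) keep at most k entries
        if np.linalg.norm(x_new - x) < tol:  # (d) stop on stagnation
            return x_new
        x = x_new
    return x
```

Because the threshold is taken from the sorted magnitudes of the current iterate, each update retains at most $k$ nonzero components by construction.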

(B) Proximal Gradient Method for $\ell_{1/2}$ Regularization: For

$$\min_{x \in \mathbb{R}^N} \frac{1}{2}\|Ax - y\|_2^2 + \lambda \sum_{i=1}^N |x_i|^{1/2},$$

the iterative scheme applies $x^{k+1} = T_{\lambda,\mu}\left(x^k - \mu A^T(Ax^k - y)\right)$, with $T_{\lambda,\mu}$ the closed-form half-thresholding operator, threshold $\tau = \frac{\sqrt[3]{54}}{4}(\lambda\mu)^{2/3}$ (the value for which the operator is the exact proximal map of the scaled penalty), and $0 < \mu < \|A\|_2^{-2}$ (Zeng et al., 2013, Zhang et al., 2014).
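Under these assumptions the fixed-parameter iteration can be sketched as follows (function names illustrative; the threshold constant $\sqrt[3]{54}/4 \approx 0.9449$ is chosen so that the closed-form operator acts as the proximal map of the scaled $\ell_{1/2}$ penalty):

```python
import numpy as np

def half_threshold(v, tau):
    """Closed-form componentwise half-thresholding operator (Section 1)."""
    out = np.zeros_like(v)
    mask = np.abs(v) > tau
    u = v[mask]
    phi = np.arccos((np.sqrt(2.0) / 2.0) * (tau / np.abs(u)) ** 1.5)
    out[mask] = (2.0 / 3.0) * u * (1.0 + np.cos(2.0 * np.pi / 3.0 - (2.0 / 3.0) * phi))
    return out

def half_prox_gradient(A, y, lam, iters=500):
    """Iterative half-thresholding for min 0.5*||Ax-y||^2 + lam*sum|x_i|^{1/2}."""
    mu = 0.99 / np.linalg.norm(A, 2) ** 2  # step size: 0 < mu < ||A||_2^{-2}
    tau = (54.0 ** (1.0 / 3.0) / 4.0) * (lam * mu) ** (2.0 / 3.0)  # jump threshold
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = half_threshold(x - mu * A.T @ (A @ x - y), tau)
    return x
```

With $1/\mu > \|A\|_2^2$, each iteration minimizes a quadratic majorizer of the objective, so the objective value is nonincreasing along the iterates.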

Both methods achieve per-iteration complexity $O(MN)$, dominated by two matrix–vector multiplications followed by $O(N)$ scalar thresholdings (Zeng et al., 2013, Zeng et al., 2013).

| Algorithmic Setting | Threshold Update | Support Size Control | Reference |
|---|---|---|---|
| AIT (sparse recovery) | Adaptive by sparsity $k$ | Exactly $k$ | (Zeng et al., 2013) |
| $\ell_{1/2}$ proximal | Fixed or cross-validated | Data-driven | (Zeng et al., 2013) |

3. Theoretical Guarantees and Convergence Analysis

Comprehensive convergence results are established under explicit measurement matrix coherence or restricted isometry assumptions:

  • AIT with half-thresholding:

The algorithm recovers the true support of $x^*$ in finitely many steps provided the coherence $\mu = \max_{i \ne j} |\langle A_i, A_j \rangle|$ obeys $\mu < \frac{3}{10 k^*}$, with $k^*$ the true sparsity (Zeng et al., 2013). Support identification is guaranteed within at most $T^*_{k^*}$ steps (a constant depending on $k^*$, $\mu$, and the dynamic range). Once the support is identified, the iterates $x^{(t)}$ converge exponentially fast to $x^*$:

$$\|x^{(t)} - x^*\|_\infty \le \frac{3+c}{2} \min_{i \in \mathrm{supp}(x^*)} |x_i^*| \, \rho^{\,t - t^* + 1},$$

where $c = 1/3$, $\rho = \frac{4}{3} k^* \mu < 1/2$, and $t^*$ is the step at which the support is identified (Zeng et al., 2013).

  • Iterative half-thresholding for $\ell_{1/2}$ regularization:

Under $0 < \mu < \|A\|_2^{-2}$, the sequence $\{x^k\}$ converges to a stationary point $x^*$. Local minimality is ensured for sufficiently small $\lambda$ or a well-conditioned support submatrix $A_I$, and an eventually linear convergence rate ($\|x^{k+1} - x^*\|_2 \le p\,\|x^k - x^*\|_2$ with $p < 1$, for large $k$) is achieved (Zeng et al., 2013).

  • Behavior at the threshold: Half-thresholding is discontinuous at the threshold, inducing more aggressive sparsity than the continuous soft-thresholding map of the $\ell_1$ case (Zhang et al., 2014).

4. Comparison with Hard and Soft Thresholding Schemes

Half-thresholding occupies an intermediate position between hard ($\ell_0$) and soft ($\ell_1$) thresholding:

  • Coherence constraints:
    • Hard: $\mu < 1/(3k^*)$
    • Half: $\mu < 3/(10k^*) \approx 1/(3.33k^*)$
    • Soft: $\mu < 1/(4k^*)$
    • The half-thresholding rule requires a slightly more restrictive coherence bound than hard-thresholding, but a less restrictive one than soft-thresholding (Zeng et al., 2013).
  • Practical implications:
    • Hard-thresholding is unbiased but becomes unstable near the coherence limit.
    • Soft-thresholding introduces bias for large coefficients but is robust.
    • Half-thresholding achieves a tradeoff, with reduced bias compared to soft and enhanced stability compared to hard.
  • Empirical iteration counts (example: $k^* = 9$, $\mu = 1/40$, dynamic range 10):
    • Hard: $\sim$20 iterations
    • Half: $\sim$25 iterations
    • Soft: $\sim$42 iterations
    • This demonstrates that half-thresholding achieves intermediate support-detection speed and iteration count (Zeng et al., 2013).

| Method | Coherence Bound | Empirical Iterations | Feature at Threshold |
|---|---|---|---|
| Hard | $1/(3k^*)$ | $\sim$20 | Discontinuous, unbiased |
| Half | $3/(10k^*)$ | $\sim$25 | Discontinuous, nonconvex |
| Soft | $1/(4k^*)$ | $\sim$42 | Continuous, biased |
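The scalar behavior behind this comparison can be checked directly; a small NumPy sketch (function names illustrative):

```python
import numpy as np

def hard_threshold(u, tau):
    """Hard thresholding: keep entries above the threshold unchanged."""
    return np.where(np.abs(u) > tau, u, 0.0)

def soft_threshold(u, tau):
    """Soft thresholding: shrink every surviving entry by tau."""
    return np.sign(u) * np.maximum(np.abs(u) - tau, 0.0)

def half_threshold(u, tau):
    """Half thresholding: closed-form trigonometric shrinkage (Section 1)."""
    out = np.zeros_like(u)
    mask = np.abs(u) > tau
    w = u[mask]
    phi = np.arccos((np.sqrt(2.0) / 2.0) * (tau / np.abs(w)) ** 1.5)
    out[mask] = (2.0 / 3.0) * w * (1.0 + np.cos(2.0 * np.pi / 3.0 - (2.0 / 3.0) * phi))
    return out

u = np.array([2.0])
# hard keeps 2.0 unchanged (unbiased); soft shrinks it to 1.0 (bias equal
# to tau); half lands strictly between the two.
vals = (hard_threshold(u, 1.0)[0], half_threshold(u, 1.0)[0], soft_threshold(u, 1.0)[0])
```

The intermediate bias of the half rule at moderate magnitudes is exactly the trade-off summarized in the table.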

5. Parameter Selection, Implementation, and Computational Complexity

Parameter selection is scenario-dependent:

  • AIT: The threshold is adaptively set to the $(k+1)$-st largest magnitude, enforcing exact $k$-sparsity of each iterate (Zeng et al., 2013).
  • Proximal half-thresholding: The threshold parameter $\tau$ equals $\lambda\mu$ (with corresponding critical magnitude $t_{1/2} \sim \tau^{2/3}$), and $\lambda$ can be set by target sparsity, by cross-validation, or by inspecting the sorted entries of the pre-thresholded iterate (Zhang et al., 2014).

Implementation is efficient:

  • Each iteration requires two matrix–vector multiplications and $O(N)$ scalar thresholdings.
  • Per-iteration complexity: $O(MN)$, advantageous over IRLS and IRL1 ($O(MN^2)$ due to matrix inversions or LP solves) for large $N$ (Zeng et al., 2013).

The proximity operator is discontinuous at the threshold, and it selects sparse components more aggressively than the $\ell_1$ prox, which is particularly relevant for high dynamic-range signals or measurement matrices with moderate coherence (Zhang et al., 2014).

6. Numerical Performance and Empirical Comparisons

Extensive simulation studies confirm:

  • For small $N$, IRLS can be marginally faster owing to efficient small-scale least squares (Zeng et al., 2013).
  • For larger $N$, half-thresholding substantially outperforms IRLS and IRL1.
  • Recovery accuracy (MSE) of half-thresholding matches or improves upon traditional alternatives, providing high-precision sparse signal reconstruction (Zeng et al., 2013).

The discontinuous threshold mechanism imbues half-thresholding with robustness to noise and measurement coherence, consistently achieving support recovery with fewer iterations than soft-thresholding and greater stability than hard-thresholding (Zeng et al., 2013, Zhang et al., 2014).

7. Extensions and Connections

The closed-form half-thresholding operator exists due to the solvability of the scalar cubic arising from the $\ell_{1/2}$ regularizer; analogous formulas in the $0 < p < 1$ setting appear only for $p = 1/2$ and $p = 2/3$ (Zhang et al., 2014). Generalizations to transformed $\ell_1$ (TL1) penalties have been developed, retaining robust sparsity promotion with explicit thresholding maps and exhibiting performance advantages in compressed sensing beyond half-thresholding (Zhang et al., 2014). The nonconvex nature of half-thresholding enhances sparsity beyond convex $\ell_1$, motivating its adoption in fields including signal processing, statistical estimation, and high-dimensional machine learning.
