Deep Pseudo Contractive Denoisers
- The paper shows that pseudo contractiveness relaxes strict non-expansiveness, enabling provable convergence while preserving strong empirical denoising performance.
- It details spectral regularization and Jacobian penalty techniques to enforce contraction-like behavior during the training of deep denoisers.
- The method integrates effectively into iterative plug-and-play schemes, achieving competitive PSNR/MSSIM scores with minimal trade-offs.
Deep pseudo contractive denoisers are a class of deep neural-network-based denoising operators designed to satisfy relaxed contraction-like properties, thereby supporting provable and stable integration in iterative signal recovery methods while retaining strong empirical denoising power. The concept emerges at the intersection of plug-and-play (PnP) optimization, monotone operator theory, and modern deep learning, aiming to bridge the gap between robust theoretical guarantees and the empirical effectiveness characteristic of deep image restoration networks.
1. Theoretical Formulation and Operator Properties
A deep pseudo contractive denoiser is formally defined by its operator-level regularity. Given a denoising operator $D$, $k$-strict pseudo contractiveness requires, for a constant $k < 1$,

$$\|D(x) - D(y)\|^2 \;\le\; \|x - y\|^2 + k\,\|(I - D)(x) - (I - D)(y)\|^2$$

for all $x, y$ in an appropriate Hilbert space. This condition is a relaxation of non-expansiveness, which would require simply $\|D(x) - D(y)\| \le \|x - y\|$, and of firm non-expansiveness and averagedness, both of which are classical regularity assumptions linking denoisers to proximal mappings.
Spectral analysis of the Jacobian $J_D$ of $D$ reveals that pseudo contractiveness is substantially weaker than (firm) non-expansiveness. The requirement, written in spectral terms, stipulates that all eigenvalues $\lambda$ of the symmetric part $\tfrac{1}{2}(J_D + J_D^\top)$ satisfy $\lambda \le 1$ (and, for $k$-strict pseudo contractiveness with symmetric Jacobian, $\lambda \ge -\tfrac{1+k}{1-k}$), rather than being confined to the unit disk. For $k$-strict pseudo contractiveness ($0 \le k < 1$), an equivalent operator decomposition exists: $D$ can be written as a convex combination of the identity and a non-expansive operator, i.e.

$$D \;=\; k\,I + (1 - k)\,N$$

for some non-expansive $N$.
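For concreteness, here is a minimal numerical sketch of this decomposition (our illustration, not code from the cited papers; all names are hypothetical): a non-expansive linear map $N$ is obtained by spectral normalization, $D = kI + (1-k)N$ is formed, and the defining inequality is verified on random input pairs.

```python
import torch

torch.manual_seed(0)
k = 0.5

# Non-expansive linear map: normalize so the largest singular value is 1.
W = torch.randn(16, 16)
W = W / torch.linalg.svdvals(W).max()

def N(x):
    """Non-expansive operator: ||N(x) - N(y)|| <= ||x - y||."""
    return x @ W.T

def D(x):
    """k-strictly pseudo contractive operator: convex combination with I."""
    return k * x + (1 - k) * N(x)

# Check the defining inequality on random pairs.
x, y = torch.randn(8, 16), torch.randn(8, 16)
lhs = (D(x) - D(y)).pow(2).sum(dim=1)
res = (x - D(x)) - (y - D(y))                 # residual difference (I - D)
rhs = (x - y).pow(2).sum(dim=1) + k * res.pow(2).sum(dim=1)
assert torch.all(lhs <= rhs + 1e-5)
```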
2. Training Methodologies and Regularization
Enforcing pseudo contractiveness in deep denoiser training involves regularizing the Jacobian's spectrum or functionals of its symmetric part. A representative training objective is

$$\min_\theta \;\; \mathbb{E}_{(x,\,y)}\big[\,\|D_\theta(y) - x\|^2\,\big] \;+\; \mu\, R\big(J_{D_\theta}\big),$$

where $(x, y)$ are clean/noisy training pairs and $R$ promotes the spectral constraint $\lambda_{\max}\!\big(\tfrac{1}{2}(J_{D_\theta} + J_{D_\theta}^\top)\big) \le 1$, or alternatively enforces a bound on a (holomorphic) function of the Jacobian such as $\big\|(J_{D_\theta} - kI)/(1-k)\big\|_2 \le 1$, ensuring non-expansiveness of the operator $N$ in the decomposition above and hence $k$-strict pseudo contractiveness. In practice, these constraints can be approximated with power iteration and automatic differentiation.
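In a PyTorch-style sketch (an illustration assuming a differentiable `denoiser` and a target constant `k`, not the cited papers' released code), the bound on $N = (D - kI)/(1-k)$ can be estimated by power iteration on $J_N^\top J_N$, realized purely through autograd Jacobian-vector and vector-Jacobian products:

```python
import torch

def jn_jvp(denoiser, x, v, k):
    """J_N v with J_N = (J_D(x) - k I) / (1 - k), via forward-mode JVP."""
    _, Jv = torch.autograd.functional.jvp(denoiser, x, v, create_graph=True)
    return (Jv - k * v) / (1 - k)

def jn_vjp(denoiser, x, u, k):
    """J_N^T u with the same J_N, via reverse-mode VJP."""
    _, JTu = torch.autograd.functional.vjp(denoiser, x, u, create_graph=True)
    return (JTu - k * u) / (1 - k)

def pseudo_contractive_penalty(denoiser, x, k=0.5, n_iters=10):
    """Hinge penalty on a power-iteration estimate of ||J_N||_2 at x."""
    v = torch.randn_like(x)
    v = v / v.norm()
    for _ in range(n_iters):
        w = jn_vjp(denoiser, x, jn_jvp(denoiser, x, v, k), k)  # J_N^T J_N v
        v = (w / (w.norm() + 1e-12)).detach()  # detach: only last step needs grad
    sigma = jn_jvp(denoiser, x, v, k).norm()   # ~ top singular value of J_N
    return torch.relu(sigma - 1.0).pow(2)      # zero once N is non-expansive
```

The hinge form leaves the denoiser unconstrained whenever the estimated norm is already below one, which is what preserves denoising capacity relative to a hard Lipschitz constraint.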
This regularization avoids the performance degradation that typically results from enforcing non-expansiveness or contractivity too strictly. Experiments show that pseudo contractive denoisers (e.g., PC-DRUNet, SPC-DRUNet) achieve competitive PSNR, often with only a minor trade-off relative to unconstrained denoisers, while gaining provable convergence properties in PnP inverse-problem settings (Wei et al., 8 Feb 2024).
3. Integration into Iterative and Plug-and-Play Schemes
Deep pseudo contractive denoisers enable stable integration into complex iterative signal estimation and inverse-problem solvers. The underlying theoretical guarantee is that for a denoiser $D_\beta$ satisfying $k$-strict pseudo contractiveness ($k < 1$), fixed-point iterations based on the Ishikawa process (a generalization of Krasnosel'skii–Mann) globally converge:

$$v^n = (1 - \beta_n)\,u^n + \beta_n\big(D_\beta(u^n) - \nabla G(u^n)\big),$$
$$u^{n+1} = (1 - \alpha_n)\,u^n + \alpha_n\big(D_\beta(v^n) - \nabla G(v^n)\big),$$

where $G$ is the data-fidelity term and $\{\alpha_n\}, \{\beta_n\}$ are relaxation sequences chosen to satisfy the conditions of the convergence theorem.
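Transcribed directly, the iteration reads as follows in a PyTorch-style sketch; `denoiser` stands for the pseudo contractive $D_\beta$, `grad_G` for $\nabla G$, and the constants `alpha`, `beta` are simplifying stand-ins for the sequences $\alpha_n, \beta_n$:

```python
import torch

def pnp_ishikawa(u0, denoiser, grad_G, n_iters=200, alpha=0.5, beta=0.5):
    """Ishikawa-type PnP iteration with the composite map T = D - grad G."""
    u = u0.clone()
    for _ in range(n_iters):
        v = (1 - beta) * u + beta * (denoiser(u) - grad_G(u))    # inner step
        u = (1 - alpha) * u + alpha * (denoiser(v) - grad_G(v))  # outer step
    return u

# Toy usage with a least-squares fidelity G(u) = 0.5 * ||A u - b||^2; the
# matrix A, vector b, and identity "denoiser" are purely illustrative.
if __name__ == "__main__":
    A, b = torch.randn(32, 64), torch.randn(32)
    grad_G = lambda u: 1e-3 * (A.T @ (A @ u - b))   # scaled gradient of G
    u_hat = pnp_ishikawa(torch.zeros(64), lambda u: u, grad_G)
```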
Conservativeness (i.e., the denoiser being the gradient of a scalar potential) further allows identification of the denoiser as the proximal operator of a possibly nonconvex, weakly convex function, reinforcing the fixed-point interpretation of the resulting algorithms (Wei et al., 13 May 2025).
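A sketch of why this identification works, under simplifying assumptions of ours (for illustration): if $D$ is smooth on $\mathbb{R}^n$ with everywhere-symmetric Jacobian, the Poincaré lemma supplies a scalar potential via a line integral, and the eigenvalue bounds on $J_D = \nabla^2 g$ from Section 1 translate into weak convexity of that potential:

$$g(x) = \int_0^1 \big\langle D(t x),\, x \big\rangle\, dt \;\;\Longrightarrow\;\; D = \nabla g,$$
$$-\tfrac{1+k}{1-k} \;\le\; \lambda\big(\nabla^2 g(x)\big) \;\le\; 1 \;\;\Longrightarrow\;\; g \text{ is } \tfrac{1+k}{1-k}\text{-weakly convex},$$

so that $D$ can be read as a proximal-type operator of a possibly nonconvex, weakly convex function.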
4. Practical Architectures, Empirical Calibration, and Extensions
Pseudo contractive properties can be embedded both in standard CNN denoisers (e.g., DRUNet, DnCNN), through spectral-norm, Jacobian-penalty, or functional regularization, and in specialized architectures derived by deep unfolding of contractive/averaged operators (for instance, via unrolled wavelet-thresholding or conjugate-gradient iterations; Nair et al., 2022, Hosseini et al., 10 Sep 2024). The deployment strategy depends on the precise reconstruction task and regularization requirements:
- General nonlinear denoisers: Spectral or functional regularization imposed directly at training time, no significant architecture constraint (Wei et al., 8 Feb 2024).
- Unfolded/structured denoisers: Deep-unrolling of classical contractive schemes (e.g., wavelet or Laplacian regularized filters) ensures explicit averagedness/contractivity by architectural design (Nair et al., 2022, Hosseini et al., 10 Sep 2024).
- Conservative denoisers: Addition of Hamiltonian regularization to drive the Jacobian towards symmetry, making the operator a gradient field (Wei et al., 13 May 2025); a minimal training-penalty sketch follows this list.
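The following sketch (our construction; the cited papers' exact "Hamiltonian regularization" may differ) penalizes Jacobian asymmetry stochastically: for random probes $u$, forward-mode and reverse-mode products give $J_D u$ and $J_D^\top u$, and their mismatch vanishes exactly when the Jacobian is symmetric, i.e. when the denoiser is locally a gradient field.

```python
import torch

def symmetry_penalty(denoiser, x, n_probes=4):
    """Stochastic penalty on Jacobian asymmetry ||J_D - J_D^T|| at x."""
    penalty = 0.0
    for _ in range(n_probes):
        u = torch.randn_like(x)
        u = u / u.norm()
        # J_D u via forward-mode JVP; J_D^T u via reverse-mode VJP.
        _, Ju = torch.autograd.functional.jvp(denoiser, x, u, create_graph=True)
        _, JTu = torch.autograd.functional.vjp(denoiser, x, u, create_graph=True)
        penalty = penalty + (Ju - JTu).pow(2).sum()
    return penalty / n_probes
```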
Pseudo contractive denoisers generalize to tensor completion, multi-modal inpainting, deblurring, super-resolution, Poisson inverse problems, video and hyperspectral image recovery, and are robust to both finite-alphabet discrete data and real-valued noisy measurements (Chen et al., 14 Oct 2025, Wei et al., 8 Feb 2024, Wei et al., 13 May 2025). Experimental results in these domains consistently demonstrate that DPC denoisers deliver superior restoration quality at low sampling rates and in challenging noise regimes.
5. Comparison and Relationship to Other Regularity Criteria
Pseudo contractiveness sits in a natural hierarchy:
- Firmly non-expansive ⊂ Averaged ⊂ Non-expansive ⊂ Pseudo contractive
- Cocoercive conservative denoisers generalize further, allowing Jacobians with spectrum outside the unit disk if the operator remains the gradient of a weakly convex potential (Wei et al., 13 May 2025).
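Each inclusion can be traced directly to the definitions (a sketch of the reasoning, in the notation of Sections 1 and 7): firm non-expansiveness is exactly the $\alpha = \tfrac12$ case of averagedness; an $\alpha$-averaged operator is a convex combination of non-expansive maps and hence non-expansive; and non-expansiveness is the $k = 0$ case of the pseudo contractive inequality, which therefore holds a fortiori for every $k \in [0, 1)$:

$$\underbrace{D = \tfrac12 (I + N)}_{\text{firmly non-expansive}} \;\Rightarrow\; \underbrace{D = (1-\alpha) I + \alpha N}_{\alpha\text{-averaged}} \;\Rightarrow\; \underbrace{\|D(x) - D(y)\| \le \|x - y\|}_{\text{non-expansive}} \;\Rightarrow\; \underbrace{\|D(x) - D(y)\|^2 \le \|x - y\|^2 + k\,\|(I-D)(x) - (I-D)(y)\|^2}_{k\text{-strict pseudo contractive}}.$$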
Whereas standard non-expansive (or 1-Lipschitz) constraints can be too strong and empirically degrade denoising, pseudo contractiveness retains empirical denoising efficacy while allowing for convergence analysis in monotone-operator-based splitting, PnP, and half-quadratic frameworks. Compared to kernel denoisers, which are linear and analytically contractive in specialized norms (Sinha et al., 21 May 2025), deep pseudo contractive denoisers permit much richer, data-adaptive regularization.
6. Performance, Applications, and Future Directions
The empirical evidence supports the utility of deep pseudo contractive denoisers in high-dimensional inverse problems:
- Quantitative results: PnP methods with DPC denoisers consistently achieve strong PSNR/MSSIM on benchmarks, with only a marginal loss (roughly 0.2–0.5 dB) relative to unconstrained denoisers but with global algorithmic convergence (Wei et al., 8 Feb 2024, Chen et al., 14 Oct 2025).
- Imaging applications: Dense and structured tensor completion, non-Gaussian denoising, low-sampling video recovery, spatio-temporal traffic imputation.
- Robustness: DPC denoisers exhibit resilience to noise model mismatch, sampling artifacts, and covariate shift (Hosseini et al., 10 Sep 2024).
Future work includes tighter integration of denoiser contractivity constraints with self-supervised or unsupervised denoising (e.g., SURE/eSURE, pseudo-label, or partial-linearity strategies), extensibility to non-commutative operator frameworks and learning in non-Hilbertian geometries, as well as refined holomorphic or functional calculus constraints for even greater theoretical flexibility.
7. Summary Table: Operator Regularity Classes for Denoisers
| Regularity Class | Mathematical Criterion | Empirical Impact |
|---|---|---|
| Firmly Non-expansive | $\lVert D(x)-D(y)\rVert^2 \le \langle D(x)-D(y),\, x-y\rangle$ | Strongest guarantee, limits denoising power |
| Averaged ($\alpha$-averaged) | $D = (1-\alpha)I + \alpha N$, $N$ non-expansive, $\alpha \in (0,1)$ | Good guarantee, moderately strict |
| Non-expansive | $\lVert D(x)-D(y)\rVert \le \lVert x-y\rVert$ | Standard criterion, restricts capacity |
| Pseudo Contractive ($k$) | $\lVert D(x)-D(y)\rVert^2 \le \lVert x-y\rVert^2 + k\,\lVert (I-D)(x)-(I-D)(y)\rVert^2$, $k<1$ | Weakest, supports convergence and effective denoising |
| Cocoercive Conservative | $D = \nabla g$; $\langle D(x)-D(y),\, x-y\rangle \ge \mu\,\lVert D(x)-D(y)\rVert^2$ | Allows residual expansion, preserves strong denoising (Wei et al., 13 May 2025) |
In conclusion, deep pseudo contractive denoisers constitute a flexible, theoretically justified approach for integrating deep learning–based denoisers with modern monotone splitting and inverse-problem solvers. They achieve a balance between empirical performance and global algorithmic convergence across diverse image, video, and tensor recovery tasks, enabling next-generation plug-and-play and regularization-by-denoising methods to blend the strengths of deep learning with operator-theoretic rigor.