Statistical Component Separation for Targeted Signal Recovery in Noisy Mixtures (2306.15012v3)

Published 26 Jun 2023 in stat.ML, astro-ph.IM, cs.LG, and eess.SP

Abstract: Separating signals from an additive mixture may be an unnecessarily hard problem when one is only interested in specific properties of a given signal. In this work, we tackle simpler "statistical component separation" problems that focus on recovering a predefined set of statistical descriptors of a target signal from a noisy mixture. Assuming access to samples of the noise process, we investigate a method devised to match the statistics of the solution candidate corrupted by noise samples with those of the observed mixture. We first analyze the behavior of this method using simple examples with analytically tractable calculations. Then, we apply it in an image denoising context employing 1) wavelet-based descriptors and 2) ConvNet-based descriptors on astrophysics and ImageNet data. In the case of 1), we show that our method better recovers the descriptors of the target data than a standard denoising method in most situations. Additionally, despite not being constructed for this purpose, it performs surprisingly well in terms of peak signal-to-noise ratio on full signal reconstruction. In comparison, representation 2) appears less suitable for image denoising. Finally, we extend this method by introducing a diffusive stepwise algorithm which gives a new perspective on the initial method and leads to promising results for image denoising under specific circumstances.
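To make the statistic-matching idea from the abstract concrete, below is a minimal sketch, not the paper's implementation. It assumes an observed mixture y = s + n, access to independent samples of the noise process, and some descriptor function phi; here a toy binned power spectrum stands in for the wavelet- or ConvNet-based descriptors used in the paper, and a generic L-BFGS-B optimizer performs the minimization. All names (phi, loss, u_hat) are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def phi(x):
    """Toy statistical descriptor: coarsely binned power spectrum of a 1D signal.
    (Hypothetical stand-in for the wavelet- or ConvNet-based descriptors.)"""
    power = np.abs(np.fft.rfft(x)) ** 2
    return np.array([b.mean() for b in np.array_split(power, 8)])

def loss(u, y, noise_samples):
    """Discrepancy between the descriptors of u + n_i, averaged over noise
    samples n_i, and the descriptors of the observed mixture y."""
    target = phi(y)
    return np.mean([np.sum((phi(u + n) - target) ** 2) for n in noise_samples])

rng = np.random.default_rng(0)
n_pix = 256
s = np.sin(np.linspace(0.0, 8.0 * np.pi, n_pix))         # target signal (unknown in practice)
noise_samples = rng.normal(0.0, 0.5, size=(16, n_pix))   # samples of the noise process
y = s + rng.normal(0.0, 0.5, size=n_pix)                 # observed noisy mixture

# Start from the observation itself and minimize the statistic-matching loss.
res = minimize(loss, y, args=(y, noise_samples), method="L-BFGS-B")
u_hat = res.x

print("descriptor error of y    :", np.sum((phi(y) - phi(s)) ** 2))
print("descriptor error of u_hat:", np.sum((phi(u_hat) - phi(s)) ** 2))
```

As in the abstract, the objective only targets the chosen statistical descriptors of the signal rather than the full signal itself; that relaxation is what makes the problem simpler than full component separation.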
