
Variational Bayes image restoration with compressive autoencoders (2311.17744v3)

Published 29 Nov 2023 in cs.CV and stat.ML

Abstract: Regularization of inverse problems is of paramount importance in computational imaging. The ability of neural networks to learn efficient image representations has recently been exploited to design powerful data-driven regularizers. While state-of-the-art plug-and-play methods rely on an implicit regularization provided by neural denoisers, alternative Bayesian approaches consider Maximum A Posteriori (MAP) estimation in the latent space of a generative model, thus with an explicit regularization. However, state-of-the-art deep generative models require a huge amount of training data compared to denoisers. Moreover, their complexity hampers the optimization involved in deriving the latent MAP. In this work, we first propose to use compressive autoencoders instead. These networks, which can be seen as variational autoencoders with a flexible latent prior, are smaller and easier to train than state-of-the-art generative models. As a second contribution, we introduce the Variational Bayes Latent Estimation (VBLE) algorithm, which performs latent estimation within the framework of variational inference. Thanks to a simple yet efficient parameterization of the variational posterior, VBLE allows for fast and easy (approximate) posterior sampling. Experimental results on the BSD and FFHQ image datasets demonstrate that VBLE reaches performance similar to state-of-the-art plug-and-play methods, while quantifying uncertainties significantly faster than other existing posterior sampling techniques.
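The latent variational estimation idea in the abstract can be illustrated in a toy linear-Gaussian setting: fit a factorized Gaussian variational posterior q(z) = N(mu, diag(sigma^2)) over the latent of a fixed "decoder" by minimizing the expected data-fit term plus KL(q || N(0, I)). This is only a minimal sketch under illustrative assumptions (a random linear map stands in for a trained compressive autoencoder's decoder; all names are hypothetical), not the paper's VBLE implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "decoder": a fixed linear map from a 4-D latent to an 8-D image
# (stand-in for a trained compressive autoencoder's decoder).
W = rng.normal(size=(8, 4))

# Degradation operator A (here: observe only 5 of the 8 pixels) and noisy data y.
A = np.eye(8)[:5]
z_true = rng.normal(size=4)
x_true = W @ z_true
sigma_noise = 0.05
y = A @ x_true + sigma_noise * rng.normal(size=5)

M = A @ W                               # composed forward map, latent -> data
col_sq = (M ** 2).sum(axis=0)           # ||M[:, i]||^2, drives the variance term

def objective(mu, log_sig):
    """Expected negative log-likelihood under q(z) = N(mu, diag(sig^2))
    plus KL(q || N(0, I)); closed form because everything is linear-Gaussian."""
    sig2 = np.exp(2 * log_sig)
    fit = (((y - M @ mu) ** 2).sum() + (sig2 * col_sq).sum()) / (2 * sigma_noise ** 2)
    kl = 0.5 * (sig2 + mu ** 2 - 1 - 2 * log_sig).sum()
    return fit + kl

# Gradient descent on the variational parameters (mu, log_sig).
mu, log_sig = np.zeros(4), np.zeros(4)
loss0 = objective(mu, log_sig)
lr = 1e-4
for _ in range(5000):
    sig2 = np.exp(2 * log_sig)
    grad_mu = M.T @ (M @ mu - y) / sigma_noise ** 2 + mu
    grad_ls = sig2 * col_sq / sigma_noise ** 2 + sig2 - 1   # d objective / d log_sig
    mu -= lr * grad_mu
    log_sig -= lr * grad_ls
loss1 = objective(mu, log_sig)

x_hat = W @ mu                          # posterior-mean reconstruction
```

The fitted (mu, sigma) then give cheap approximate posterior samples of the restored image by decoding z = mu + sigma * eps, which is the mechanism the abstract credits for fast uncertainty quantification.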

Authors (5)
  1. Maud Biquard
  2. Marie Chabert
  3. Thomas Oberlin
  4. Florence Genin
  5. Christophe Latry
