
A Comparative Study of Variational Autoencoders, Normalizing Flows, and Score-based Diffusion Models for Electrical Impedance Tomography (2310.15831v3)

Published 24 Oct 2023 in eess.IV

Abstract: Electrical Impedance Tomography (EIT) is a widely employed imaging technique in industrial inspection, geophysical prospecting, and medical imaging. However, the inherent nonlinearity and ill-posedness of EIT image reconstruction present challenges for classical regularization techniques, such as the critical selection of regularization terms and the lack of prior knowledge. Deep generative models (DGMs) have been shown to play a crucial role in learning implicit regularizers and prior knowledge. This study investigates the potential of three DGMs (variational autoencoders, normalizing flows, and score-based diffusion models) to learn implicit regularizers in learning-based EIT imaging. We first introduce background information on EIT imaging and its inverse problem formulation. Next, we propose three algorithms for solving EIT inverse problems, one based on each DGM. Finally, we present numerical and visual experiments, which reveal that (1) no single method consistently outperforms the others across all settings, and (2) when reconstructing an object with two anomalies using a model trained on a dataset containing four anomalies, the conditional normalizing flow model (CNF) exhibits the best generalization under low-level noise, while the conditional score-based diffusion model (CSD*) generalizes best under high-level noise. We hope our preliminary efforts will encourage other researchers to assess their DGMs on EIT and other nonlinear inverse problems.
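The core idea the abstract describes, using a generative model as an implicit regularizer for an ill-posed inverse problem, can be sketched in a toy form. The snippet below is not from the paper: it substitutes a fixed random linear "decoder" `G` for a trained VAE decoder and a random linear map `A` for the nonlinear EIT forward operator, and reconstructs by gradient descent over the latent code, so that only images in the range of `G` are reachable.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 32, 20, 4               # image dim, measurement dim, latent dim
G = rng.standard_normal((n, k))   # stand-in "decoder": x = G @ z
A = rng.standard_normal((m, n))   # stand-in linear forward operator

# Synthetic ground truth and noisy measurements y = A x + noise.
z_true = rng.standard_normal(k)
x_true = G @ z_true
y = A @ x_true + 0.01 * rng.standard_normal(m)

# Gradient descent on the data-fidelity loss 0.5 * ||A G z - y||^2.
# The generative model acts as the implicit regularizer: the search is
# confined to the decoder's range rather than all of R^n.
M = A @ G
lr = 1.0 / np.linalg.norm(M, 2) ** 2   # step size below 1/L for stability
z = np.zeros(k)
for _ in range(500):
    residual = M @ z - y
    z -= lr * (M.T @ residual)

x_rec = G @ z
rel_err = np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.3f}")
```

In the paper's actual setting the forward operator is nonlinear, and the three proposed algorithms differ in how the prior enters (latent optimization for the VAE, exact likelihoods for the normalizing flow, posterior sampling for the diffusion model); this sketch only conveys the shared "constrain reconstruction to a learned manifold" principle.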

