QN-Mixer: A Quasi-Newton MLP-Mixer Model for Sparse-View CT Reconstruction (2402.17951v3)

Published 28 Feb 2024 in eess.IV and cs.CV

Abstract: Inverse problems span across diverse fields. In medical contexts, computed tomography (CT) plays a crucial role in reconstructing a patient's internal structure, presenting challenges due to artifacts caused by inherently ill-posed inverse problems. Previous research advanced image quality via post-processing and deep unrolling algorithms but faces challenges, such as extended convergence times with ultra-sparse data. Despite enhancements, resulting images often show significant artifacts, limiting their effectiveness for real-world diagnostic applications. We aim to explore deep second-order unrolling algorithms for solving imaging inverse problems, emphasizing their faster convergence and lower time complexity compared to common first-order methods like gradient descent. In this paper, we introduce QN-Mixer, an algorithm based on the quasi-Newton approach. We use learned parameters through the BFGS algorithm and introduce Incept-Mixer, an efficient neural architecture that serves as a non-local regularization term, capturing long-range dependencies within images. To address the computational demands typically associated with quasi-Newton algorithms that require full Hessian matrix computations, we present a memory-efficient alternative. Our approach intelligently downsamples gradient information, significantly reducing computational requirements while maintaining performance. The approach is validated through experiments on the sparse-view CT problem, involving various datasets and scanning protocols, and is compared with post-processing and deep unrolling state-of-the-art approaches. Our method outperforms existing approaches and achieves state-of-the-art performance in terms of SSIM and PSNR, all while reducing the number of unrolling iterations required.
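The abstract describes an unrolled quasi-Newton (BFGS-style) scheme whose gradient information is downsampled so the inverse-Hessian approximation stays small. Below is a minimal NumPy sketch of that idea on a toy least-squares problem. It is not the paper's method: the pooling factor, fixed step size, nearest-neighbour upsampling, and plain data-fidelity objective are illustrative assumptions, and the learned components (the Incept-Mixer regularizer and learned BFGS parameters) are omitted.

```python
import numpy as np

def bfgs_inverse_update(H, s, y, eps=1e-10):
    # Standard BFGS update of the inverse-Hessian approximation H
    # from a curvature pair s = x_k - x_{k-1}, y = g_k - g_{k-1}.
    rho = 1.0 / max(float(y @ s), eps)
    I = np.eye(H.shape[0])
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def reconstruct(A, b, n, iters=8, pool=4, lr=0.1):
    # Toy unrolled quasi-Newton solver for min_x ||Ax - b||^2.
    # The gradient is average-pooled so the inverse-Hessian approximation
    # lives in a reduced (n/pool)-dim space; n must be divisible by pool.
    x = np.zeros(n)
    m = n // pool
    H = np.eye(m)                                  # reduced inverse-Hessian estimate
    g_small_prev = x_small_prev = None
    for _ in range(iters):
        g = A.T @ (A @ x - b)                      # data-fidelity gradient
        g_small = g.reshape(m, pool).mean(axis=1)  # downsample the gradient
        x_small = x.reshape(m, pool).mean(axis=1)
        if g_small_prev is not None:               # refresh H with the last curvature pair
            H = bfgs_inverse_update(H, x_small - x_small_prev,
                                    g_small - g_small_prev)
        step_small = -H @ g_small                  # quasi-Newton direction in reduced space
        x = x + lr * np.repeat(step_small, pool)   # upsample the step and apply it
        g_small_prev, x_small_prev = g_small, x_small
    return x

# Usage on a random overdetermined system (purely illustrative):
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 32))
x_true = rng.standard_normal(32)
x_hat = reconstruct(A, A @ x_true, n=32)
```

In the paper's full method the hand-written regularizer-free gradient and fixed step size above are replaced by learned parameters and the Incept-Mixer non-local regularization term; this sketch only illustrates how pooling the gradient shrinks the quasi-Newton state.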
