Robust MRI Reconstruction by Smoothed Unrolling (SMUG) (2312.07784v2)

Published 12 Dec 2023 in eess.IV, cs.AI, cs.CV, cs.LG, and eess.SP

Abstract: As the popularity of deep learning (DL) in the field of magnetic resonance imaging (MRI) continues to rise, recent research has indicated that DL-based MRI reconstruction models might be excessively sensitive to minor input disturbances, including worst-case additive perturbations. This sensitivity often leads to unstable, aliased images. This raises the question of how to devise DL techniques for MRI reconstruction that can be robust to train-test variations. To address this problem, we propose a novel image reconstruction framework, termed Smoothed Unrolling (SMUG), which advances a deep unrolling-based MRI reconstruction model using a randomized smoothing (RS)-based robust learning approach. RS, which improves the tolerance of a model against input noises, has been widely used in the design of adversarial defense approaches for image classification tasks. Yet, we find that the conventional design that applies RS to the entire DL-based MRI model is ineffective. In this paper, we show that SMUG and its variants address the above issue by customizing the RS process based on the unrolling architecture of a DL-based MRI reconstruction model. Compared to the vanilla RS approach, we show that SMUG improves the robustness of MRI reconstruction with respect to a diverse set of instability sources, including worst-case and random noise perturbations to input measurements, varying measurement sampling rates, and different numbers of unrolling steps. Furthermore, we theoretically analyze the robustness of our method in the presence of perturbations.
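To make the abstract's contrast concrete, the sketch below illustrates the difference between "vanilla" randomized smoothing, which averages the entire unrolled reconstructor over Gaussian input perturbations, and a SMUG-style scheme that smooths the denoiser inside every unrolling step. This is a minimal illustration, not the authors' implementation: the tiny CNN denoiser, the simplified data-consistency update, and the choices of noise level, unrolling steps, and Monte Carlo samples are all hypothetical placeholders.

```python
# Minimal sketch (not the authors' code): vanilla randomized smoothing (RS)
# applied to the whole unrolled model vs. SMUG-style smoothing of the
# denoiser at each unrolling step. All components below are illustrative
# stand-ins, not the paper's actual architecture or hyperparameters.
import torch
import torch.nn as nn


class TinyDenoiser(nn.Module):
    """Stand-in for the CNN denoiser inside an unrolled reconstructor."""
    def __init__(self, channels: int = 1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, channels, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def data_consistency(x: torch.Tensor, x_zero_filled: torch.Tensor,
                     lam: float = 1.0) -> torch.Tensor:
    # Placeholder for the usual k-space data-consistency update; here it is
    # just a convex combination with the zero-filled input for illustration.
    return (x + lam * x_zero_filled) / (1.0 + lam)


def unrolled_reconstruct(denoiser, x_in, steps=4):
    """Plain MoDL-style unrolling: denoise, then enforce data consistency."""
    x = x_in
    for _ in range(steps):
        x = data_consistency(denoiser(x), x_in)
    return x


def vanilla_rs(model, x_in, sigma=0.05, n_samples=8):
    """Average the *entire* unrolled model over Gaussian input perturbations."""
    outs = [model(x_in + sigma * torch.randn_like(x_in)) for _ in range(n_samples)]
    return torch.stack(outs).mean(dim=0)


def smug_reconstruct(denoiser, x_in, steps=4, sigma=0.05, n_samples=8):
    """SMUG-style unrolling: smooth the denoiser at *every* unrolling step."""
    x = x_in
    for _ in range(steps):
        denoised = torch.stack(
            [denoiser(x + sigma * torch.randn_like(x)) for _ in range(n_samples)]
        ).mean(dim=0)
        x = data_consistency(denoised, x_in)
    return x


if __name__ == "__main__":
    torch.manual_seed(0)
    denoiser = TinyDenoiser()
    x_zero_filled = torch.randn(1, 1, 32, 32)  # stand-in for a zero-filled recon
    plain = unrolled_reconstruct(denoiser, x_zero_filled)
    rs_out = vanilla_rs(lambda x: unrolled_reconstruct(denoiser, x), x_zero_filled)
    smug_out = smug_reconstruct(denoiser, x_zero_filled)
    print(plain.shape, rs_out.shape, smug_out.shape)
```

The design point the sketch is meant to convey is where the noise averaging happens: vanilla RS wraps the full input-to-output map, while the SMUG-style variant moves the averaging inside the unrolled loop, so each data-consistency step operates on an already smoothed denoiser output.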

