Magnetic Resonance Image Processing Transformer for General Accelerated Image Reconstruction (2405.15098v2)

Published 23 May 2024 in eess.IV, cs.CV, cs.LG, and physics.med-ph

Abstract: Recent advancements in deep learning have enabled the development of generalizable models that achieve state-of-the-art performance across various imaging tasks. Vision Transformer (ViT)-based architectures, in particular, have demonstrated strong feature extraction capabilities when pre-trained on large-scale datasets. In this work, we introduce the Magnetic Resonance Image Processing Transformer (MR-IPT), a ViT-based framework designed to enhance the generalizability and robustness of accelerated MRI reconstruction. Unlike conventional deep learning models that require separate training for different acceleration factors, MR-IPT is pre-trained on a large-scale dataset encompassing multiple undersampling patterns and acceleration settings, enabling a unified reconstruction framework. By leveraging a shared transformer backbone, MR-IPT effectively learns universal feature representations, allowing it to generalize across diverse reconstruction tasks. Extensive experiments demonstrate that MR-IPT outperforms both CNN-based and existing transformer-based methods, achieving superior reconstruction quality across varying acceleration factors and sampling masks. Moreover, MR-IPT exhibits strong robustness, maintaining high performance even under unseen acquisition setups, highlighting its potential as a scalable and efficient solution for accelerated MRI. Our findings suggest that transformer-based general models can significantly advance MRI reconstruction, offering improved adaptability and stability compared to traditional deep learning approaches.
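The abstract describes an IPT-style layout: lightweight task-specific heads and tails wrapped around a single shared transformer backbone, pre-trained across multiple undersampling patterns and acceleration factors. The paper's code is not reproduced here; the following PyTorch snippet is only a minimal sketch of that head/body/tail split under stated assumptions. The class name, task labels, patching scheme, and all layer sizes are illustrative placeholders, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SharedBackboneReconstructor(nn.Module):
    """Illustrative IPT-style model: per-task heads/tails around a shared
    Transformer body. All sizes are arbitrary placeholders (assumption)."""

    def __init__(self, tasks, patch=4, dim=16, depth=4, nhead=4):
        super().__init__()
        self.patch, self.dim = patch, dim
        # one lightweight head per task, e.g. per acceleration/mask setting
        self.heads = nn.ModuleDict(
            {t: nn.Conv2d(1, dim, kernel_size=3, padding=1) for t in tasks})
        # shared ViT-style body operating on flattened image patches
        layer = nn.TransformerEncoderLayer(
            d_model=dim * patch * patch, nhead=nhead, batch_first=True)
        self.body = nn.TransformerEncoder(layer, num_layers=depth)
        # one lightweight tail per task mapping features back to an image
        self.tails = nn.ModuleDict(
            {t: nn.Conv2d(dim, 1, kernel_size=3, padding=1) for t in tasks})

    def forward(self, x, task):
        b, _, h, w = x.shape
        p = self.patch
        f = self.heads[task](x)                     # task-specific head
        # split the feature map into non-overlapping p x p patch tokens
        tokens = f.unfold(2, p, p).unfold(3, p, p)  # B, C, H/p, W/p, p, p
        tokens = tokens.permute(0, 2, 3, 1, 4, 5).reshape(b, -1, self.dim * p * p)
        tokens = self.body(tokens)                  # shared backbone
        # fold the token sequence back into a feature map
        f = tokens.reshape(b, h // p, w // p, self.dim, p, p)
        f = f.permute(0, 3, 1, 4, 2, 5).reshape(b, self.dim, h, w)
        return self.tails[task](f)                  # task-specific tail

# Usage sketch: the same backbone serves several acquisition settings.
model = SharedBackboneReconstructor(tasks=["x4_random", "x8_equispaced"])
zero_filled = torch.randn(2, 1, 64, 64)  # stand-in zero-filled inputs
out = model(zero_filled, task="x4_random")
print(out.shape)  # torch.Size([2, 1, 64, 64])
```

Routing inputs through per-task heads while sharing the body is what lets a single set of backbone weights learn features common to all sampling masks and acceleration factors, which is the generalization mechanism the abstract attributes to MR-IPT.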

Citations (1)
