Convolutional Analysis Operator Learning by End-To-End Training of Iterative Neural Networks (2203.02166v1)

Published 4 Mar 2022 in eess.IV, cs.CV, and eess.SP

Abstract: The concept of sparsity has been extensively applied for regularization in image reconstruction. Typically, sparsifying transforms are either pre-trained on ground-truth images or adaptively trained during the reconstruction. To this end, learning algorithms are designed to minimize some target function which encodes the desired properties of the transform. However, this procedure ignores the subsequently employed reconstruction algorithm as well as the physical model which is responsible for the image formation process. Iterative neural networks - which contain the physical model - can overcome these issues. In this work, we demonstrate how convolutional sparsifying filters can be efficiently learned by end-to-end training of iterative neural networks. We evaluate our approach on a non-Cartesian 2D cardiac cine MRI example and show that the obtained filters are better suited to the corresponding reconstruction algorithm than those obtained by decoupled pre-training.
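
The abstract describes an unrolled (iterative) reconstruction network in which each iteration enforces data consistency with the physical forward model and applies a sparsity step driven by learnable convolutional filters, with the filters trained end-to-end through the whole unrolled chain. Below is a minimal sketch of that idea in PyTorch; the class name UnrolledSparsifyingNet, the shared-filter design, the soft-thresholding update, and the A/At operator interface are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class UnrolledSparsifyingNet(nn.Module):
    """Hypothetical unrolled reconstruction network (illustration only):
    each iteration takes a data-consistency gradient step under the forward
    operator A and a shrinkage step on learnable convolutional filters."""

    def __init__(self, n_iters: int = 8, n_filters: int = 16, kernel_size: int = 5):
        super().__init__()
        # Learnable convolutional sparsifying filters, shared across iterations.
        self.filters = nn.Parameter(0.01 * torch.randn(n_filters, 1, kernel_size, kernel_size))
        self.step = nn.Parameter(torch.tensor(0.1))     # gradient step size
        self.thresh = nn.Parameter(torch.tensor(0.01))  # soft-threshold level
        self.n_iters = n_iters

    def forward(self, x0, y, A, At):
        """x0: initial image (B, 1, H, W); y: measured data;
        A / At: forward operator and its adjoint (e.g. a non-uniform Fourier
        transform for non-Cartesian MRI), passed in as callables."""
        x = x0
        pad = self.filters.shape[-1] // 2
        for _ in range(self.n_iters):
            # Data-consistency gradient: At(A x - y).
            grad_dc = At(A(x) - y)
            # Sparsity step on the filter responses C x: soft-threshold the
            # coefficients and map the residual back with the transposed filters.
            coeffs = F.conv2d(x, self.filters, padding=pad)
            shrunk = torch.sign(coeffs) * torch.clamp(coeffs.abs() - self.thresh, min=0.0)
            grad_reg = F.conv_transpose2d(coeffs - shrunk, self.filters, padding=pad)
            x = x - self.step * (grad_dc + grad_reg)
        return x
```

Training such a network would minimize a supervised loss (e.g. the error against reference images) with respect to the filters as well as the step and threshold parameters, so that the learned filters are matched to the reconstruction algorithm rather than pre-trained in isolation.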

Authors (5)
  1. Andreas Kofler (16 papers)
  2. Christian Wald (11 papers)
  3. Tobias Schaeffter (9 papers)
  4. Markus Haltmeier (104 papers)
  5. Christoph Kolbitsch (13 papers)
Citations (1)
