Solving Inverse Problems with Conditional-GAN Prior via Fast Network-Projected Gradient Descent (2109.01105v1)

Published 2 Sep 2021 in cs.LG, cs.IT, eess.SP, math.IT, and stat.ML

Abstract: The projected gradient descent (PGD) method has been shown to be effective in recovering compressed signals described in a data-driven way by a generative model, i.e., a generator that has learned the data distribution. Further reconstruction improvements for such inverse problems can be achieved by conditioning the generator on the measurement. The boundary equilibrium generative adversarial network (BEGAN) implements an equilibrium-based loss function and an auto-encoding discriminator to better balance the performance of the generator and the discriminator. In this work we investigate a network-based projected gradient descent (NPGD) algorithm for measurement-conditional generative models that solves the inverse problem much faster than regular PGD. We combine NPGD with conditional GAN/BEGAN models to evaluate their effectiveness in solving compressed sensing-type problems. Our experiments on the MNIST and CelebA datasets show that combining a measurement-conditional model with NPGD recovers the compressed signal well, achieving similar or in some cases even better reconstruction quality along with much faster reconstruction. The reconstruction speed-up achieved in our experiments is up to 140-175×.
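
As a rough illustration of the loop the abstract describes, here is a minimal NPGD sketch in PyTorch. It is not the authors' code: the measurement-conditional generator `G(z, y)`, the encoder `E(x, y)` that approximates projection onto the generator's range, and all names and hyperparameters are assumptions introduced for this example.

```python
# Minimal NPGD sketch for compressed sensing with a conditional generative
# prior. Assumes hypothetical pretrained modules G (conditional generator)
# and E (approximate range projector); shapes and step size are illustrative.
import torch

@torch.no_grad()
def npgd(y, A, G, E, z_dim, n_steps=20, eta=1.0, device="cpu"):
    """Recover x from compressed measurements y = A @ x.

    Each iteration takes a gradient step on the measurement loss
    0.5 * ||A x - y||^2 and then projects back onto the generator's
    range with a single forward pass x <- G(E(x, y), y), instead of
    the inner optimization over z used by regular PGD.
    """
    # Initialize from the conditional generator with a random latent code.
    z = torch.randn(1, z_dim, device=device)
    x = G(z, y)
    for _ in range(n_steps):
        # Gradient of 0.5 * ||A x - y||^2 w.r.t. x is A^T (A x - y).
        grad = A.T @ (A @ x.flatten() - y)
        x = x - eta * grad.reshape(x.shape)
        # Network-based projection: one encoder + generator forward pass.
        x = G(E(x, y), y)
    return x
```

In regular PGD the projection step would instead solve min_z ||G(z, y) - x|| by an inner optimization loop at every outer iteration; replacing that loop with the single E-then-G forward pass is what produces the large reconstruction speed-up the abstract reports.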

Citations (1)
