
Information Compensation for Deep Conditional Generative Networks (2001.08559v3)

Published 23 Jan 2020 in cs.LG, cs.CV, and stat.ML

Abstract: In recent years, unsupervised/weakly-supervised conditional generative adversarial networks (GANs) have achieved many successes on the task of modeling and generating data. However, one of their weaknesses lies in their poor ability to separate, or disentangle, the different factors that characterize the representation encoded in their latent space. To address this issue, we propose a novel structure for unsupervised conditional GANs powered by a novel Information Compensation Connection (IC-Connection). The proposed IC-Connection enables GANs to compensate for information loss incurred during deconvolution operations. In addition, to quantify the degree of disentanglement on both discrete and continuous latent variables, we design a novel evaluation procedure. Our empirical results suggest that our method achieves better disentanglement compared to the state-of-the-art GANs in a conditional generation setting.
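
To make the core idea concrete, below is a minimal, hypothetical sketch of an "information compensation" style connection in PyTorch: after each transposed-convolution (deconvolution) stage, features carried along an auxiliary path are added back to the upsampled output to compensate for information lost during deconvolution. The block and module names (ICBlock, Generator), channel sizes, and exact wiring are illustrative assumptions, not the paper's actual IC-Connection design.

```python
# Hypothetical sketch of an information-compensation style connection.
# Assumption: compensation is modeled as an auxiliary upsample + 1x1 conv path
# whose features are added to the deconvolution output at each stage.
import torch
import torch.nn as nn


class ICBlock(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        # main upsampling path (deconvolution)
        self.deconv = nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1)
        # compensation path: nearest-neighbor upsample + 1x1 conv carrying features forward
        self.compensate = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="nearest"),
            nn.Conv2d(in_ch, out_ch, kernel_size=1),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # re-inject compensation features into the deconvolution output
        return self.act(self.deconv(x) + self.compensate(x))


class Generator(nn.Module):
    def __init__(self, latent_dim=64, base_ch=128):
        super().__init__()
        self.project = nn.Linear(latent_dim, base_ch * 4 * 4)
        self.blocks = nn.Sequential(
            ICBlock(base_ch, base_ch // 2),       # 4x4 -> 8x8
            ICBlock(base_ch // 2, base_ch // 4),  # 8x8 -> 16x16
        )
        self.to_img = nn.Conv2d(base_ch // 4, 3, kernel_size=3, padding=1)

    def forward(self, z):
        h = self.project(z).view(z.size(0), -1, 4, 4)
        return torch.tanh(self.to_img(self.blocks(h)))


if __name__ == "__main__":
    z = torch.randn(8, 64)
    imgs = Generator()(z)
    print(imgs.shape)  # torch.Size([8, 3, 16, 16])
```

In a conditional-GAN setting, the latent vector z would typically concatenate discrete and continuous codes; the paper's proposed evaluation procedure measures how well those factors stay disentangled, which this sketch does not attempt to reproduce.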

Authors (4)
  1. Zehao Wang (38 papers)
  2. Kaili Wang (17 papers)
  3. Tinne Tuytelaars (150 papers)
  4. Jose Oramas (30 papers)
