Improving GANs with a Feature Cycling Generator (2210.09638v2)

Published 18 Oct 2022 in cs.CV and cs.AI

Abstract: Generative adversarial networks (GANs), built from a generator and a discriminator, have significantly advanced image generation. Typically, existing papers build their generators by stacking multiple residual blocks, since this eases generator training. However, some papers have noted the limitations of the residual block and proposed new architectural units that improve GAN performance. Following this trend, this paper presents a novel unit, called the feature cycling block (FCB), which achieves impressive results on the image generation task. Specifically, the FCB has two branches: a memory branch and an image branch. The memory branch keeps meaningful information at each stage of the generator, whereas the image branch takes useful features from the memory branch to produce a high-quality image. To show the capability of the proposed method, we conducted extensive experiments on various datasets, including CIFAR-10, CIFAR-100, FFHQ, AFHQ, and subsets of LSUN. Experimental results demonstrate the substantial superiority of our approach over the baseline without requiring any additional objective functions or training tricks. For instance, the proposed method improves the Fréchet inception distance (FID) of StyleGAN2 from 4.89 to 3.72 on the FFHQ dataset and from 6.64 to 5.57 on the LSUN Bed dataset. We believe that the pioneering attempt presented in this paper could inspire the community toward better-designed generator architectures and toward training objectives or techniques compatible with the proposed method.
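
The abstract only describes the FCB at a high level: a memory branch that carries stage-wise information through the generator, and an image branch that pulls selected features from it. The following is a minimal PyTorch sketch of that two-branch idea; the specific layer choices (3x3 convolutions, a sigmoid gate, channel concatenation) are illustrative assumptions, not the paper's actual design.

```python
# Sketch of a feature-cycling-style block, inferred from the abstract alone.
# All architectural details below are assumptions for illustration.
import torch
import torch.nn as nn

class FeatureCyclingBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Memory branch: refines and carries forward stage-wise information.
        self.memory_conv = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.LeakyReLU(0.2),
        )
        # Gate deciding which memory features the image branch takes (assumed mechanism).
        self.gate = nn.Sequential(
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid(),
        )
        # Image branch: consumes gated memory features to refine the image features.
        self.image_conv = nn.Sequential(
            nn.Conv2d(channels * 2, channels, 3, padding=1),
            nn.LeakyReLU(0.2),
        )

    def forward(self, image_feat: torch.Tensor, memory_feat: torch.Tensor):
        memory_feat = self.memory_conv(memory_feat)
        # Features "cycled" from the memory branch into the image branch.
        selected = self.gate(memory_feat) * memory_feat
        image_feat = self.image_conv(torch.cat([image_feat, selected], dim=1))
        return image_feat, memory_feat

if __name__ == "__main__":
    block = FeatureCyclingBlock(64)
    img = torch.randn(1, 64, 32, 32)
    mem = torch.randn(1, 64, 32, 32)
    img_out, mem_out = block(img, mem)
    print(img_out.shape, mem_out.shape)  # both torch.Size([1, 64, 32, 32])
```

In the paper's setting, a stack of such blocks would replace the residual blocks of a generator like StyleGAN2's, with both branches updated at every stage; the exact wiring is not specified in the abstract.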

Authors (2)
  1. Seung Park (11 papers)
  2. Yong-Goo Shin (13 papers)
