Finding Mixed Nash Equilibria of Generative Adversarial Networks (1811.02002v1)

Published 23 Oct 2018 in cs.LG, cs.GT, and stat.ML

Abstract: We reconsider the training objective of Generative Adversarial Networks (GANs) from the mixed Nash Equilibria (NE) perspective. Inspired by the classical prox methods, we develop a novel algorithmic framework for GANs via an infinite-dimensional two-player game and prove rigorous convergence rates to the mixed NE, resolving the longstanding problem that no provably convergent algorithm exists for general GANs. We then propose a principled procedure to reduce our novel prox methods to simple sampling routines, leading to practically efficient algorithms. Finally, we provide experimental evidence that our approach outperforms methods that seek pure strategy equilibria, such as SGD, Adam, and RMSProp, both in speed and quality.
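
To make the mixed-NE viewpoint concrete, below is a minimal finite-dimensional sketch, not the paper's algorithm: the paper works over an infinite-dimensional game on distributions of network parameters and implements its prox steps via sampling, whereas this toy uses entropic mirror descent (multiplicative weights) on a 2x2 zero-sum matrix game whose only equilibrium is mixed. The payoff matrix `A`, step size `eta`, iteration count, and the function name `entropic_mirror_descent` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy zero-sum game (matching pennies): its only Nash equilibrium is mixed
# (uniform play by both players). Pure-strategy best-response dynamics cycle
# here, a finite-dimensional analogue of the instability of seeking pure
# equilibria in GAN training.
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])  # payoff to the row player

def entropic_mirror_descent(A, steps=2000, eta=0.05):
    """Simultaneous entropic mirror descent on both players' mixed strategies.

    Returns the averaged iterates, which converge to a mixed Nash
    equilibrium of the bilinear game max_x min_y x^T A y.
    """
    m, n = A.shape
    x = np.ones(m) / m          # row player's mixed strategy
    y = np.ones(n) / n          # column player's mixed strategy
    x_avg, y_avg = np.zeros(m), np.zeros(n)
    for _ in range(steps):
        # Gradients of the bilinear payoff x^T A y w.r.t. each strategy.
        gx = A @ y              # row player ascends
        gy = A.T @ x            # column player descends
        x = x * np.exp(eta * gx);  x /= x.sum()   # entropic (exponentiated) update
        y = y * np.exp(-eta * gy); y /= y.sum()
        x_avg += x
        y_avg += y
    return x_avg / steps, y_avg / steps

x_star, y_star = entropic_mirror_descent(A)
print("approximate mixed NE:", x_star, y_star)  # both close to [0.5, 0.5]
```

Averaging the iterates is what yields convergence to the equilibrium in this sketch; the paper's contribution is proving analogous guarantees for the infinite-dimensional GAN game and reducing the prox updates to practical sampling routines.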

Authors (3)
  1. Ya-Ping Hsieh (23 papers)
  2. Chen Liu (206 papers)
  3. Volkan Cevher (216 papers)
Citations (83)
