
Probabilistic Matching of Real and Generated Data Statistics in Generative Adversarial Networks (2306.10943v3)

Published 19 Jun 2023 in stat.ML and cs.LG

Abstract: Generative adversarial networks constitute a powerful approach to generative modeling. While generated samples are often indistinguishable from real data, there is no guarantee that they will follow the true data distribution. For scientific applications in particular, it is essential that the true distribution is well captured by the generated distribution. In this work, we propose a method to ensure that the distributions of certain generated data statistics coincide with the respective distributions of the real data. To achieve this, we add a new loss term to the generator loss function, which quantifies the difference between these distributions via suitable f-divergences. Kernel density estimation is employed to obtain representations of the true distributions and to estimate the corresponding generated distributions from minibatch values at each iteration. Compared to other methods, our approach has the advantage that the complete shapes of the distributions are taken into account. We evaluate the method on a synthetic dataset and a real-world dataset and demonstrate improved performance of our approach.
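The core idea in the abstract — estimate the density of a chosen data statistic for both real and generated samples via kernel density estimation, then penalize the generator with an f-divergence between the two densities — can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the Gaussian kernel, the fixed evaluation grid, the bandwidth value, and the choice of the KL divergence as the f-divergence are all illustrative assumptions.

```python
import numpy as np

def gaussian_kde(samples, grid, bandwidth):
    # Evaluate a Gaussian kernel density estimate of `samples` on a fixed grid.
    diffs = (grid[:, None] - samples[None, :]) / bandwidth
    dens = np.exp(-0.5 * diffs ** 2).sum(axis=1)
    return dens / (samples.size * bandwidth * np.sqrt(2.0 * np.pi))

def statistic_matching_loss(real_stats, fake_stats, grid, bandwidth, eps=1e-12):
    # KL divergence between the KDE of the real-data statistic and the KDE of
    # the generated statistic (estimated from a minibatch), approximated by a
    # Riemann sum over the uniform grid. This term would be added to the
    # usual generator loss.
    p = gaussian_kde(real_stats, grid, bandwidth) + eps
    q = gaussian_kde(fake_stats, grid, bandwidth) + eps
    dx = grid[1] - grid[0]
    return float(np.sum(p * np.log(p / q)) * dx)
```

In practice the real-data KDE would be precomputed once, while the generated-statistic KDE is re-estimated from each minibatch during training; a differentiable (e.g. PyTorch) version of the same computation would be needed to backpropagate through the generator.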

