
normflows: A PyTorch Package for Normalizing Flows (2302.12014v2)

Published 26 Jan 2023 in cs.LG

Abstract: Normalizing flows model probability distributions through an expressive yet tractable density. They transform a simple base distribution, such as a Gaussian, through a sequence of invertible functions referred to as layers. These layers typically use neural networks, which makes them highly expressive. Flows are ubiquitous in machine learning and have been applied to image generation, text modeling, variational inference, approximating Boltzmann distributions, and many other problems. Here, we present normflows, a Python package for normalizing flows. It allows users to build normalizing flow models from a suite of base distributions, flow layers, and neural networks. The package is implemented in the popular deep learning framework PyTorch, which simplifies the integration of flows into larger machine learning models or pipelines. It supports most common normalizing flow architectures, such as Real NVP, Glow, Masked Autoregressive Flows, Neural Spline Flows, and Residual Flows, among others. The package can be installed via pip, and the code is publicly available on GitHub.
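
For context, the tractable density mentioned in the abstract follows from the change-of-variables formula: if z ~ p_Z is the base distribution and x = f(z) for an invertible map f, then

    p_X(x) = p_Z(f^{-1}(x)) |det J_{f^{-1}}(x)|,

so composing invertible layers preserves both exact sampling and exact density evaluation.

The sketch below shows how such a model is assembled and trained with the package, following the usage illustrated in the paper and the project README. The layer count, network widths, learning rate, and the random placeholder data are illustrative choices, and exact class signatures should be checked against the installed version of normflows.

    # pip install normflows
    import torch
    import normflows as nf

    # Base distribution: 2D diagonal Gaussian
    base = nf.distributions.base.DiagGaussian(2)

    # Real NVP-style stack of affine coupling layers
    flows = []
    for _ in range(16):
        # MLP mapping the 1D conditioning half to a scale and a shift;
        # zero-initializing the last layer stabilizes early training
        param_map = nf.nets.MLP([1, 64, 64, 2], init_zeros=True)
        flows.append(nf.flows.AffineCouplingBlock(param_map))
        flows.append(nf.flows.Permute(2, mode='swap'))  # alternate transformed half

    model = nf.NormalizingFlow(base, flows)

    # Maximum-likelihood training: the forward KL divergence equals the
    # negative log-likelihood of the data up to a constant
    optimizer = torch.optim.Adam(model.parameters(), lr=5e-4)
    for _ in range(1000):
        x = torch.randn(128, 2)  # placeholder; use samples from the target here
        optimizer.zero_grad()
        loss = model.forward_kld(x)
        loss.backward()
        optimizer.step()

    # Sampling and exact log-density evaluation with the trained flow
    z, log_q = model.sample(256)
    log_p = model.log_prob(z)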

