PaddingFlow: Improving Normalizing Flows with Padding-Dimensional Noise (2403.08216v2)

Published 13 Mar 2024 in cs.LG and cs.CV

Abstract: Normalizing flows are a generative modeling approach with efficient sampling. However, flow-based models suffer from two issues: 1) if the target distribution lies on a manifold, the mismatch between the dimension of the latent target distribution and that of the data distribution can make flow-based models perform poorly; 2) discrete data can cause flow-based models to collapse into a degenerate mixture of point masses. To sidestep these two issues, we propose PaddingFlow, a novel dequantization method that improves normalizing flows with padding-dimensional noise. Implementing PaddingFlow only requires changing the dimension of the normalizing flow, so the method is easy to implement and computationally cheap. Moreover, the noise is added only to the padding dimensions, which means PaddingFlow can dequantize without changing the data distribution. Existing dequantization methods, by contrast, must change the data distribution, which can degrade performance. We validate our method on the main benchmarks of unconditional density estimation, including five tabular datasets and four image datasets for Variational Autoencoder (VAE) models, as well as on Inverse Kinematics (IK) experiments, which involve conditional density estimation. The results show that PaddingFlow performs better in all experiments in this paper, indicating that it is suitable for a wide range of tasks. The code is available at: https://github.com/AdamQLMeng/PaddingFlow.
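The mechanism described in the abstract is simple enough to sketch. Below is a minimal illustration under stated assumptions, not the authors' reference implementation: the function name `pad_with_noise` and the hyperparameters `pad_dim` and `noise_std` are hypothetical, and a generic flow object exposing `log_prob` and `sample` is assumed.

```python
import torch

def pad_with_noise(x: torch.Tensor, pad_dim: int = 1, noise_std: float = 0.1) -> torch.Tensor:
    """Append `pad_dim` columns of Gaussian noise to each sample.

    The original coordinates of `x` are left untouched: the noise lives
    only in the appended padding dimensions, so the data distribution
    itself is not perturbed.
    """
    noise = noise_std * torch.randn(x.shape[0], pad_dim, device=x.device)
    return torch.cat([x, noise], dim=1)

# Training (sketch): fit a (d + pad_dim)-dimensional flow on the padded
# samples by maximum likelihood.
#   loss = -flow.log_prob(pad_with_noise(batch)).mean()
#
# Sampling (sketch): draw from the flow, then drop the padding
# dimensions to recover d-dimensional samples.
#   samples = flow.sample(n)[:, :d]
```

Dropping the padding coordinates at sampling time yields samples whose data dimensions were never perturbed during training, which is the property the abstract contrasts with conventional dequantization methods that add noise directly to the data.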
