PaddingFlow: Improving Normalizing Flows with Padding-Dimensional Noise (2403.08216v2)
Abstract: Normalizing flow is a generative modeling approach with efficient sampling. However, flow-based models suffer from two issues: 1) If the target distribution lies on a manifold, the mismatch between the dimension of the latent target distribution and that of the data distribution can cause flow-based models to perform poorly. 2) Discrete data can make flow-based models collapse into a degenerate mixture of point masses. To sidestep both issues, we propose PaddingFlow, a novel dequantization method that improves normalizing flows with padding-dimensional noise. Implementing PaddingFlow only requires modifying the dimension of the normalizing flow, so our method is easy to implement and computationally cheap. Moreover, the noise is added only to the padding dimensions, which means PaddingFlow can dequantize without changing the data distribution. Existing dequantization methods, by contrast, must change the data distribution, which can degrade performance. We validate our method on the main benchmarks of unconditional density estimation, including five tabular datasets and four image datasets for Variational Autoencoder (VAE) models, as well as on conditional density estimation via Inverse Kinematics (IK) experiments. The results show that PaddingFlow performs better in all experiments in this paper, suggesting that it is suitable for a wide range of tasks. The code is available at: https://github.com/AdamQLMeng/PaddingFlow.
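As a rough illustration of the core idea (a minimal sketch, not the authors' implementation; the helper name `pad_with_noise` and the choice of Gaussian padding noise are assumptions), padding-dimensional dequantization can be expressed as appending noise-only dimensions while leaving the original data coordinates untouched:

```python
import numpy as np

def pad_with_noise(x, pad_dims, scale=1.0, rng=None):
    """Append `pad_dims` noise dimensions to each sample in `x`.

    The noise lives only in the padding dimensions, so the original
    data coordinates are unchanged -- the key property claimed for
    PaddingFlow-style dequantization.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = scale * rng.standard_normal((x.shape[0], pad_dims))
    return np.concatenate([x, noise], axis=1)

# A flow would then be trained on the (d + pad_dims)-dimensional data;
# at sampling time the padding dimensions are simply discarded.
x = np.ones((4, 3))                      # 4 samples from a 3-D distribution
x_padded = pad_with_noise(x, pad_dims=2)
print(x_padded.shape)                    # (4, 5)
print(np.allclose(x_padded[:, :3], x))   # True: data dims untouched
```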