Reflected Schrödinger Bridge for Constrained Generative Modeling (2401.03228v1)
Abstract: Diffusion models have become the go-to method for large-scale generative modeling in real-world applications. These applications often involve data distributions confined within bounded domains, typically requiring ad-hoc thresholding techniques to enforce the boundaries. Reflected diffusion models (Lou23) aim to improve generalizability by generating the data distribution through a backward process governed by reflected Brownian motion. However, reflected diffusion models may not easily adapt to diverse domains without the derivation of proper diffeomorphic mappings, and they do not guarantee optimal transport properties. To overcome these limitations, we introduce the Reflected Schrödinger Bridge algorithm: an entropy-regularized optimal transport approach tailored for generating data within diverse bounded domains. We derive elegant reflected forward-backward stochastic differential equations with Neumann and Robin boundary conditions, extend divergence-based likelihood training to bounded domains, and exploit natural connections to entropic optimal transport to study approximate linear convergence, a valuable insight for practical training. Our algorithm yields robust generative modeling in diverse domains, and its scalability is demonstrated on real-world constrained generative modeling through standard image benchmarks.
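For intuition on the reflected dynamics referenced above, the sketch below simulates a diffusion on the unit hypercube with an Euler-Maruyama step followed by mirror reflection (folding) at the boundary. This is a generic illustration only, not the paper's reflected forward-backward SDE solver or its divergence-based likelihood training; the function names, the [0, 1]^d domain, and the folding scheme are assumptions made purely for the example.

```python
import numpy as np

def reflect_unit_cube(x):
    # Fold a point back into [0, 1]^d by mirror reflection at the faces.
    x = np.mod(x, 2.0)
    return np.where(x > 1.0, 2.0 - x, x)

def reflected_euler_maruyama(x0, drift, sigma, n_steps, dt, seed=0):
    # Euler-Maruyama discretization of dX_t = b(X_t, t) dt + sigma dW_t on [0, 1]^d,
    # with the boundary behavior imposed approximately by folding each iterate
    # back into the hypercube after every step.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for k in range(n_steps):
        t = k * dt
        noise = sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
        x = reflect_unit_cube(x + drift(x, t) * dt + noise)
    return x

# Example: reflected Brownian motion (zero drift) on [0, 1]^2,
# started from the center of the square.
x_T = reflected_euler_maruyama(x0=[0.5, 0.5], drift=lambda x, t: 0.0,
                               sigma=1.0, n_steps=1000, dt=1e-3)
print(x_T)  # remains inside [0, 1]^2
```

In the paper's setting, the fixed drift here would be replaced by the learned forward and backward drifts of the Schrödinger bridge system, and the boundary behavior corresponds to the Neumann and Robin conditions analyzed there; the folding step is only a simple way to keep discretized samples inside the domain.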
- Partition-Guided GANs. In Proc. of the Conference on Computer Vision and Pattern Recognition (CVPR), 2021.
- Sampling from a Log-Concave Distribution with Compact Support with Proximal Langevin Monte Carlo. In Proc. of Conference on Learning Theory (COLT), 2017.
- Sampling from a Log-Concave Distribution with Projected Langevin Monte Carlo. Discrete & Computational Geometry, pages 757–783, 2018.
- The Schrödinger Bridge between Gaussian Measures has a Closed Form. In AISTATS, 2023.
- Reflected Schrödinger Bridge: Density Control with Path Constraints. In American Control Conference (ACC), 2021.
- Wasserstein Proximal Algorithms for the Schrödinger Bridge Problem: Density Control with Nonlinear Drift. IEEE Transactions on Automatic Control, 67(3):1163–1178, 2022.
- Guillaume Carlier. On the Linear Convergence of the Multi-Marginal Sinkhorn Algorithm. SIAM Journal on Optimization, 32(2), 2022.
- Patrick Cattiaux. Time Reversal of Diffusion Processes with a Boundary Condition. Stochastic Processes and their Applications, 28:275–292, 1988.
- Sampling is as Easy as Learning the Score: Theory for Diffusion Models with Minimal Data Assumptions. arXiv preprint arXiv:2209.11215v2, 2022a.
- Likelihood Training of Schrödinger Bridge using Forward-Backward SDEs Theory. In Proc. of the International Conference on Learning Representations (ICLR), 2022b.
- Provably Convergent Schrödinger Bridge with Applications to Probabilistic Time Series Imputation. In Proc. of the International Conference on Machine Learning (ICML), 2023.
- Stochastic Bridges of Linear Systems. IEEE Transactions on Automatic Control, 61(2), 2016.
- Entropic and Displacement Interpolation: a Computational Approach using the Hilbert Metric. SIAM Journal on Applied Mathematics, 2016.
- Stochastic Control Liaisons: Richard Sinkhorn Meets Gaspard Monge on a Schrödinger Bridge. SIAM Review, 63(2):249–313, 2021.
- Valentin De Bortoli. Convergence of Denoising Diffusion Models under the Manifold Hypothesis. Transactions on Machine Learning Research (TMLR), 2022.
- Diffusion Schrödinger Bridge with Applications to Score-Based Generative Modeling. In Advances in Neural Information Processing Systems (NeurIPS), 2021.
- Riemannian Score-Based Generative Modelling. In Advances in Neural Information Processing Systems (NeurIPS), 2022.
- Quantitative Uniform Stability of the Iterative Proportional Fitting Procedure. arXiv preprint arXiv:2108.08129v2, 2021.
- Interacting Contour Stochastic Gradient Langevin Dynamics. In Proc. of the International Conference on Learning Representations (ICLR), 2022a.
- An Adaptively Weighted Stochastic Gradient MCMC Algorithm for Monte Carlo Simulation and Global Optimization. Statistics and Computing, pages 32–58, 2022b.
- Diffusion Models Beat GANs on Image Synthesis. In Advances in Neural Information Processing Systems (NeurIPS), 2021.
- Convergence Rates for Regularized Optimal Transport via Quantization. arXiv preprint arXiv:2208.14391, 2022.
- Diffusion Models for Constrained Domains. Transactions on Machine Learning Research, 2023.
- Solving High-dimensional Partial Differential Equations using Deep Learning. PNAS, 2019.
- Simulating Diffusion Bridges with Score Matching. arXiv preprint arXiv:2111.07243, 2022.
- GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium. In Advances in Neural Information Processing Systems (NeurIPS), 2017.
- Denoising Diffusion Probabilistic Models. In Advances in Neural Information Processing Systems (NeurIPS), 2020.
- Imagen Video: High Definition Video Generation with Diffusion Models. arXiv preprint arXiv:2210.02303, 2022.
- Equivariant Diffusion for Molecule Generation in 3D. In Proc. of the International Conference on Machine Learning (ICML), 2022.
- Mirrored Langevin Dynamics. In Advances in Neural Information Processing Systems (NeurIPS), 2018.
- Riemannian Diffusion Models. In Advances in Neural Information Processing Systems (NeurIPS), 2022.
- Aapo Hyvärinen. Estimation of Non-normalized Statistical Models by Score Matching. Journal of Machine Learning Research, 6(24):695–709, 2005.
- Brownian Motion and Stochastic Calculus. Springer, 1998.
- Statistical Efficiency of Score Matching: The View from Isoperimetry. In Proc. of the International Conference on Learning Representations (ICLR), 2023.
- Sampling with Riemannian Hamiltonian Monte Carlo in a Constrained Space. In Advances in Neural Information Processing Systems (NeurIPS), 2022.
- S. Kullback. Probability Densities with Given Marginals. Annals of Mathematical Statistics, 1968.
- Andrew Lamperski. Projected Stochastic Gradient Langevin Algorithms for Constrained Sampling and Non-Convex Learning. In Proc. of Conference on Learning Theory (COLT), 2021.
- The Flow Map of the Fokker–Planck Equation Does Not Provide Optimal Transport. Applied Mathematics Letters, 133, 2022.
- Christian Léonard. Minimization of Energy Functionals Applied to Some Inverse Problems. Applied Mathematics and Optimization, 44:273–297, 2001.
- Christian Léonard. A Survey of the Schrödinger Problem and Some of its Connections with Optimal Transport. Discrete & Continuous Dynamical Systems-A, 34(4):1533–1574, 2014.
- Composing Ensembles of Pre-trained Models via Iterative Consensus. In Proc. of the International Conference on Learning Representations (ICLR), 2023.
- Stochastic Differential Equations with Reflecting Boundary Conditions. Communications on Pure and Applied Mathematics, pages 511–537, 1984.
- Mirror Diffusion Models for Constrained and Watermarked Generation. In Advances in Neural Information Processing Systems (NeurIPS), 2023a.
- Learning Diffusion Bridges on Constrained Domains. In Proc. of the International Conference on Learning Representations (ICLR), 2023b.
- Reflected Diffusion Models. In Proc. of the International Conference on Machine Learning (ICML), 2023.
- DPM-Solver: A Fast ODE Solver for Diffusion Probabilistic Model Sampling in Around 10 Steps. In Advances in Neural Information Processing Systems (NeurIPS), 2022.
- Forward-Backward Stochastic Differential Equations and their Applications. Springer, 2007.
- S. Di Marino and A. Gerolin. An Optimal Transport Approach for the Schrödinger Bridge Problem and Convergence of Sinkhorn Algorithm. Journal of Scientific Computing, 85(2):27, 2020.
- Marcel Nutz. Introduction to Entropic Optimal Transport. Lecture Notes, 2022.
- Stability of Schrödinger Potentials and Convergence of Sinkhorn’s Algorithm. Annals of Probability, 2022.
- B. Øksendal. Stochastic Differential Equations: An Introduction with Applications. Springer, 2003.
- Generative Residual Block for Image Generation. Applied Intelligence, pages 1–10, 2022.
- Grigorios A. Pavliotis. Stochastic Processes and Applications: Diffusion Processes, the Fokker-Planck and Langevin Equations. Springer, 2014.
- The Data-driven Schrödinger Bridge. Communications on Pure and Applied Mathematics, 74:1545–1573, 2021.
- Stefano Peluchetti. Diffusion Bridge Mixture Transports, Schrödinger Bridge Problems and Generative Modeling. arXiv preprint arXiv:2304.00917v1, 2023.
- Computational Optimal Transport: With Applications to Data Science. Foundations and Trends in Machine Learning, 2019.
- L. Rüschendorf. Convergence of the Iterative Proportional Fitting Procedure. Annals of Statistics, 1995.
- Progressive Distillation for Fast Sampling of Diffusion Models. In Proc. of the International Conference on Learning Representations (ICLR), 2022.
- Diffusion Schrödinger Bridge Matching. arXiv preprint arXiv:2303.16852v1, 2023.
- A. V. Skorokhod. Stochastic Equations for Diffusion Processes in a Bounded Region. Theory of Probability & Its Applications, pages 264–274, 1961.
- Maximum Likelihood Training of Score-Based Diffusion Models. In Advances in Neural Information Processing Systems (NeurIPS), 2021a.
- Score-Based Generative Modeling through Stochastic Differential Equations. In Proc. of the International Conference on Learning Representations (ICLR), 2021b.
- Riemannian Diffusion Schrödinger Bridge. arXiv preprint arXiv:2207.03024v1, 2022.
- Score-based Generative Modeling in Latent Space. In Advances in Neural Information Processing Systems (NeurIPS), 34:11287–11302, 2021.
- Solving Schrödinger Bridges via Maximum Likelihood. Entropy, 23(9):1134, 2021.
- Cédric Villani. Topics in Optimal Transportation, volume 58. American Mathematical Soc., 2003.
- Efficient MCMC Sampling with Dimension-Free Convergence Rate using ADMM-type Splitting. Journal of Machine Learning Research, 2022.
- R. J. Williams. On Time-Reversal of Reflected Brownian Motions. In Progress in Probability and Statistics, 1987.
Authors: Wei Deng, Yu Chen, Nicole Tianjiao Yang, Hengrong Du, Qi Feng, Ricky T. Q. Chen