Dynamic Conditional Optimal Transport through Simulation-Free Flows (2404.04240v2)

Published 5 Apr 2024 in cs.LG

Abstract: We study the geometry of conditional optimal transport (COT) and prove a dynamical formulation which generalizes the Benamou-Brenier Theorem. Equipped with these tools, we propose a simulation-free flow-based method for conditional generative modeling. Our method couples an arbitrary source distribution to a specified target distribution through a triangular COT plan, and a conditional generative model is obtained by approximating the geodesic path of measures induced by this COT plan. Our theory and methods are applicable in infinite-dimensional settings, making them well suited for a wide class of Bayesian inverse problems. Empirically, we demonstrate that our method is competitive on several challenging conditional generation tasks, including an infinite-dimensional inverse problem.
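The simulation-free training idea described in the abstract — couple source and target samples through a triangular conditional OT plan, then regress a velocity field along straight interpolation paths without solving any ODE during training — can be sketched on a one-dimensional toy problem. This is an illustrative stand-in, not the authors' implementation: the triangular COT coupling is approximated here by sorting within narrow condition bins (the 1-D optimal transport coupling at a fixed condition), and the velocity field is a linear least-squares model in place of a neural network. All names and modeling choices below are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conditional task: given condition y, the target is x1 ~ N(y, 0.1^2);
# the source x0 ~ N(0, 1) is an arbitrary distribution independent of y.
n = 4000
y = rng.uniform(-1.0, 1.0, size=n)
x0 = rng.normal(0.0, 1.0, size=n)
x1 = y + 0.1 * rng.normal(size=n)

# Triangular coupling sketch: the condition y is held fixed and only x is
# transported.  Within a narrow y-bin, pairing sorted x0 with sorted x1 is
# the 1-D optimal transport coupling for that (nearly constant) condition.
order = np.argsort(y)
y, x0, x1 = y[order], x0[order], x1[order]
for start in range(0, n, 100):
    sl = slice(start, start + 100)
    x0[sl] = np.sort(x0[sl])
    x1[sl] = np.sort(x1[sl])

# Simulation-free regression target: along the straight path
# x_t = (1 - t) x0 + t x1 the velocity is x1 - x0, so training needs no
# ODE or SDE simulation.
t = rng.uniform(0.0, 1.0, size=n)
xt = (1.0 - t) * x0 + t * x1
v = x1 - x0

def features(x, y, t):
    # Hand-picked features for a linear velocity model v_theta(x, y, t);
    # a neural network would be used in practice.
    return np.stack([x, y, t, x * t, y * t, np.ones_like(x)], axis=1)

theta, *_ = np.linalg.lstsq(features(xt, y, t), v, rcond=None)

# Sampling: condition on a fixed y and integrate dx/dt = v_theta(x, y, t)
# from source samples with Euler steps.
steps = 50
y_test = np.full(2000, 0.5)
x = rng.normal(0.0, 1.0, size=2000)
for k in range(steps):
    tk = np.full_like(x, k / steps)
    x = x + (1.0 / steps) * (features(x, y_test, tk) @ theta)
# The samples should now concentrate near the conditional mean y = 0.5.
```

Even with this crude linear model, the integrated flow contracts the broad source distribution toward the narrow conditional target, which is the qualitative behavior the paper's method achieves with learned velocity fields on much harder problems.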

