
Tangential Fixpoint Iterations for Gromov-Wasserstein Barycenters (2403.08612v1)

Published 13 Mar 2024 in math.NA, cs.NA, and math.OC

Abstract: The Gromov-Wasserstein (GW) transport problem is a relaxation of classic optimal transport that seeks a transport between two measures while preserving their internal geometry. Owing to this theoretical underpinning, it is a valuable tool for the analysis of objects that do not possess a natural embedding or that should be studied independently of it. Prime applications can thus be found in, e.g., shape matching, classification, and interpolation tasks. To tackle the latter, one theoretically justified approach is the employment of multi-marginal GW transport and GW barycenters, which are Fréchet means with respect to the GW distance. However, because the computation of GW itself already poses a quadratic and non-convex optimization problem, the determination of GW barycenters is a hard task and algorithms for their computation are scarce. In this paper, we revisit a known procedure for the determination of Fréchet means in Riemannian manifolds via tangential approximations in the context of GW. We provide a characterization of barycenters in the GW tangent space, which ultimately gives rise to a fixpoint iteration for approximating GW barycenters using multi-marginal plans. We propose a relaxation of this fixpoint iteration and show that it monotonically decreases the barycenter loss. In certain cases, our proposed method naturally provides us with barycentric embeddings. The resulting algorithm is capable of producing qualitative shape interpolations between multiple 3D shapes with support sizes of thousands of points in reasonable time. In addition, we verify our method on shape classification and multi-graph matching tasks.
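
For context, the barycenter problem described above can be stated with the standard (squared) Gromov-Wasserstein distance between metric measure spaces. The two displays below follow the usual conventions of the GW literature (e.g. Mémoli's formulation) and are not specific to this paper:

$$
\mathrm{GW}^2\big((X,d_X,\mu),(Y,d_Y,\nu)\big)
= \min_{\pi \in \Pi(\mu,\nu)} \int_{X\times Y}\int_{X\times Y}
\big| d_X(x,x') - d_Y(y,y') \big|^2 \,\mathrm{d}\pi(x,y)\,\mathrm{d}\pi(x',y'),
$$

and, given metric measure spaces $(X_k, d_k, \mu_k)$ and weights $\lambda_k \ge 0$ with $\sum_k \lambda_k = 1$, a GW barycenter is a Fréchet mean

$$
\operatorname*{arg\,min}_{(X, d_X, \mu)} \; \sum_{k=1}^{K} \lambda_k \, \mathrm{GW}^2\big((X,d_X,\mu),(X_k,d_k,\mu_k)\big).
$$

As a rough, runnable point of reference (not the tangential fixpoint iteration proposed in the paper), the sketch below computes a small free-structure GW barycenter with the POT library's gromov_barycenters routine, which implements the block-coordinate GW averaging of Peyré et al.; the call signature follows recent POT documentation and should be checked against the installed version.

```python
# Minimal baseline sketch: a GW barycenter of two toy point clouds via POT.
# This is NOT the paper's tangential fixpoint method; it uses the
# block-coordinate GW averaging shipped with POT, for illustration only.
import numpy as np
import ot  # Python Optimal Transport (POT)

rng = np.random.default_rng(0)

# Two "shapes" represented only by their intra-space (squared Euclidean) cost matrices.
X1 = rng.normal(size=(50, 3))
X2 = rng.normal(size=(60, 3))
Cs = [ot.dist(X1, X1), ot.dist(X2, X2)]
ps = [ot.unif(C.shape[0]) for C in Cs]   # uniform weights on each input shape

n_bary = 40                              # chosen support size of the barycenter
C_bary = ot.gromov.gromov_barycenters(
    n_bary, Cs, ps, ot.unif(n_bary),
    lambdas=[0.5, 0.5], loss_fun='square_loss',
)
print(C_bary.shape)  # (40, 40): pairwise-cost matrix of the barycenter space
```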

