
Identifying latent distances with Finslerian geometry (2212.10010v2)

Published 20 Dec 2022 in cs.LG

Abstract: Riemannian geometry provides powerful tools for exploring the latent space of generative models while preserving the underlying structure of the data. The latent space can be equipped with a Riemannian metric pulled back from the data manifold. With this metric, we can systematically navigate the space along geodesics, the shortest curves between two points. Generative models are often stochastic, which makes the data space, the Riemannian metric, and the geodesics stochastic as well. Stochastic objects are at best impractical, and at worst impossible, to manipulate. A common solution is to approximate the stochastic pullback metric by its expectation. But the geodesics derived from this expected Riemannian metric do not correspond to the expected length-minimising curves. In this work, we propose another metric whose geodesics explicitly minimise the expected length under the pullback metric. We show that this metric is a Finsler metric, and we compare it with the expected Riemannian metric. In high dimensions, we prove that the two metrics converge to each other at a rate of $O\left(\frac{1}{D}\right)$. This implies that the established expected Riemannian metric is an accurate approximation of the theoretically better-grounded Finsler metric, which justifies using the expected Riemannian metric in practical implementations.
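The gap between the two metrics can be illustrated numerically. Below is a toy sketch (not from the paper; the random-Jacobian model, dimensions, and variable names are illustrative): for a stochastic decoder with random Jacobian $J$, the expected Riemannian norm of a tangent vector $v$ is $\sqrt{v^\top \mathbb{E}[J^\top J]\, v}$, while the Finsler norm is $\mathbb{E}[\lVert J v \rVert]$. By Jensen's inequality the Finsler norm is never larger, and the gap shrinks as the data dimension $D$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)

d, D, S = 2, 500, 2000          # latent dim, data dim, Monte Carlo samples
v = rng.normal(size=d)          # a tangent vector in latent space

# Stochastic decoder: its Jacobian is random; here we use i.i.d. Gaussian
# entries scaled so the pushed-forward norm stays O(1) as D grows.
Js = rng.normal(size=(S, D, d)) / np.sqrt(D)

# Expected Riemannian norm: sqrt(v^T E[J^T J] v)
G_bar = np.mean(np.einsum("sij,sik->sjk", Js, Js), axis=0)
riem = np.sqrt(v @ G_bar @ v)

# Finsler norm: E[ ||J v|| ], the expected length of the pushed-forward vector
fins = np.mean(np.linalg.norm(Js @ v, axis=1))

print(f"expected-Riemannian norm: {riem:.4f}")
print(f"Finsler norm:             {fins:.4f}")
print(f"Jensen gap:               {riem - fins:.2e}")
```

In this setup the gap is of order $1/D$, consistent with the $O(1/D)$ convergence rate stated in the abstract; rerunning with a smaller `D` makes the discrepancy visibly larger.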

