
Bundle-based similarity measurement for positive semidefinite matrices (2312.13721v1)

Published 21 Dec 2023 in math.NA and cs.NA

Abstract: Positive semidefinite (PSD) matrices are indispensable in many fields of science. A similarity measurement for such matrices is usually an essential ingredient in the mathematical modelling of a scientific problem. This paper proposes a unified framework to construct similarity measurements for PSD matrices. The framework is obtained by exploring the fiber bundle structure of the cone of PSD matrices and generalizing the idea of the point-set distance previously developed for linear subspaces and positive definite (PD) matrices. The framework demonstrates both theoretical advantages and computational convenience: (1) We prove that the similarity measurement constructed by the framework can be recognized either as the cost of a parallel transport or as the length of a quasi-geodesic curve. (2) We extend commonly used divergences for equidimensional PD matrices to the non-equidimensional case. Examples include the Kullback-Leibler divergence, the Bhattacharyya divergence and the Rényi divergence. We prove that these extensions enjoy the same consistency property as their counterpart for the geodesic distance. (3) We apply our geometric framework to further extend those in (2) to similarity measurements for arbitrary PSD matrices. We also provide simple formulae to compute these similarity measurements in most situations.
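To make the starting point of item (2) concrete, the following sketch computes the classical Kullback-Leibler divergence between two equidimensional PD matrices, viewed as covariances of zero-mean Gaussians. This is the standard equidimensional divergence that the paper generalizes to the non-equidimensional and PSD cases; the function name and the example matrices here are illustrative, not taken from the paper.

```python
import numpy as np

def kl_divergence_pd(A, B):
    """Kullback-Leibler divergence between two n x n PD matrices,
    interpreted as covariances of zero-mean Gaussians:
        D_KL(A || B) = 0.5 * (tr(B^{-1} A) - log det(B^{-1} A) - n)
    """
    n = A.shape[0]
    # Solve B X = A instead of forming B^{-1} explicitly.
    Binv_A = np.linalg.solve(B, A)
    sign, logdet = np.linalg.slogdet(Binv_A)
    return 0.5 * (np.trace(Binv_A) - logdet - n)

# Example: divergence is zero iff the matrices coincide.
A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.eye(2)
print(kl_divergence_pd(A, A))  # 0.0
print(kl_divergence_pd(A, B))  # strictly positive
```

Note that this formula is asymmetric in its arguments and is only defined when both matrices are strictly positive definite, which is precisely the limitation the paper's bundle-based framework is designed to remove.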
