A kernel-based analysis of Laplacian Eigenmaps (2402.16481v1)
Published 26 Feb 2024 in math.ST, math.PR, math.SP, stat.ML, and stat.TH
Abstract: Given i.i.d. observations uniformly distributed on a closed manifold $\mathcal{M}\subseteq \mathbb{R}^p$, we study the spectral properties of the associated empirical graph Laplacian based on a Gaussian kernel. Our main results are non-asymptotic error bounds, showing that the eigenvalues and eigenspaces of the empirical graph Laplacian are close to the eigenvalues and eigenspaces of the Laplace-Beltrami operator of $\mathcal{M}$. In our analysis, we connect the empirical graph Laplacian to kernel principal component analysis, and consider the heat kernel of $\mathcal{M}$ as a reproducing kernel feature map. This leads to novel points of view and allows us to leverage results for empirical covariance operators in infinite dimensions.
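The construction studied in the abstract can be sketched numerically on the simplest closed manifold, the unit circle $S^1 \subseteq \mathbb{R}^2$, whose Laplace-Beltrami eigenvalues are known in closed form: $0, 1, 1, 4, 4, 9, 9, \dots$ (each nonzero eigenvalue with multiplicity two). The sketch below is illustrative only; the sample size, bandwidth, and normalization are my own choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, h = 2000, 0.02  # sample size and Gaussian kernel bandwidth (illustrative)

# i.i.d. observations uniformly distributed on the unit circle S^1 in R^2
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
X = np.column_stack([np.cos(theta), np.sin(theta)])

# Gaussian kernel matrix W_ij = exp(-|x_i - x_j|^2 / (4h))
sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
W = np.exp(-sq / (4.0 * h))

# Unnormalized empirical graph Laplacian L = D - W
L = np.diag(W.sum(axis=1)) - W

# Rescaled by the first nonzero eigenvalue, the bottom of the spectrum
# should approach the Laplace-Beltrami spectrum of S^1 as h -> 0, n -> inf:
# 0, 1, 1, 4, 4, 9, 9, ...
evals = np.linalg.eigvalsh(L)[:7]
ratios = evals / evals[1]
print(np.round(ratios, 2))
```

Dividing by the first nonzero eigenvalue sidesteps the bandwidth-dependent normalization constant that relates the graph Laplacian to the Laplace-Beltrami operator; for fixed $h > 0$ the ratios carry a small $O(h)$ bias, which shrinks as the bandwidth decreases.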
- F. Bach. Information theory with kernel methods. IEEE Trans. Inform. Theory, 69(2):752–775, 2023.
- A. S. Bandeira, A. Singer, and T. Strohmer. Mathematics of data science. Book in preparation.
- M. Belkin and P. Niyogi. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6):1373–1396, 2003.
- M. Belkin and P. Niyogi. Convergence of Laplacian eigenmaps. In Advances in Neural Information Processing Systems, 2006.
- M. Belkin and P. Niyogi. Towards a theoretical foundation for Laplacian-based manifold methods. J. Comput. System Sci., 74(8):1289–1308, 2008.
- R. Bhatia. Matrix analysis. Springer-Verlag, New York, 1997.
- G. Blanchard and J.-B. Fermanian. Nonasymptotic one- and two-sample tests in high dimension with unknown covariance structure. In Foundations of modern statistics, volume 425 of Springer Proc. Math. Stat., pages 121–162. Springer, Cham, 2023.
- S. Boucheron, G. Lugosi, and P. Massart. Concentration inequalities: a nonasymptotic theory of independence. Oxford University Press, Oxford, 2013.
- D. Burago, S. Ivanov, and Y. Kurylev. A graph discretization of the Laplace-Beltrami operator. J. Spectr. Theory, 4(4):675–714, 2014.
- J. Calder and N. García Trillos. Improved spectral convergence rates for graph Laplacians on ε-graphs and k-NN graphs. Appl. Comput. Harmon. Anal., 60:123–175, 2022.
- F. Chatelin. Spectral approximation of linear operators. Society for Industrial and Applied Mathematics (SIAM), Philadelphia, PA, 2011.
- I. Chavel. Eigenvalues in Riemannian geometry. Academic Press, Inc., Orlando, FL, 1984.
- X. Cheng and N. Wu. Eigen-convergence of Gaussian kernelized graph Laplacian by manifold heat interpolation. Appl. Comput. Harmon. Anal., 61:132–190, 2022.
- F. R. K. Chung. Spectral graph theory. Published for the Conference Board of the Mathematical Sciences, Washington, DC; by the American Mathematical Society, Providence, RI, 1997.
- R. R. Coifman and S. Lafon. Diffusion maps. Appl. Comput. Harmon. Anal., 21(1):5–30, 2006.
- E. B. Davies. Heat kernels and spectral theory. Cambridge University Press, Cambridge, 1990.
- L. H. Dicker, D. P. Foster, and D. Hsu. Kernel ridge vs. principal component regression: minimax bounds and the qualification of regularization operators. Electron. J. Stat., 11(1):1022–1047, 2017.
- N. García Trillos, M. Gerlach, M. Hein, and D. Slepčev. Error estimates for spectral convergence of the graph Laplacian on random geometric graphs toward the Laplace-Beltrami operator. Found. Comput. Math., 20(4):827–887, 2020.
- E. Giné and V. Koltchinskii. Empirical graph Laplacian approximation of Laplace-Beltrami operators: large sample results. In High dimensional probability, volume 51 of IMS Lecture Notes Monogr. Ser., pages 238–259. Inst. Math. Statist., Beachwood, OH, 2006.
- E. Giné, R. Latała, and J. Zinn. Exponential and moment inequalities for U-statistics. In High dimensional probability, II (Seattle, WA, 1999), volume 47 of Progr. Probab., pages 13–38. Birkhäuser Boston, Boston, MA, 2000.
- A. Grigor’yan. Heat kernel and analysis on manifolds. American Mathematical Society, Providence, RI; International Press, Boston, MA, 2009.
- A. Grigor’yan. Estimates of heat kernels on Riemannian manifolds, pages 140–225. London Mathematical Society Lecture Note Series. Cambridge University Press, 1999.
- T. Hsing and R. Eubank. Theoretical foundations of functional data analysis, with an introduction to linear operators. John Wiley & Sons, Ltd., Chichester, 2015.
- L. Hucker and M. Wahl. A note on the prediction error of principal component regression in high dimensions. Theory Probab. Math. Statist., 109:37–53, 2023.
- I. C. F. Ipsen. Relative perturbation results for matrix eigenvalues and singular values. In Acta numerica, 1998, volume 7 of Acta Numer., pages 151–201. Cambridge Univ. Press, Cambridge, 1998.
- M. Jirak and M. Wahl. Quantitative limit theorems and bootstrap for empirical spectral projectors. Available at https://arxiv.org/abs/2208.12871.
- M. Jirak and M. Wahl. Perturbation bounds for eigenspaces under a relative gap condition. Proc. Amer. Math. Soc., 148(2):479–494, 2020.
- M. Jirak and M. Wahl. Relative perturbation bounds with applications to empirical covariance operators. Adv. Math., 412:Paper No. 108808, 59 pp, 2023.
- I. M. Johnstone and D. Paul. PCA in high dimensions: An orientation. Proceedings of the IEEE, 106(8):1277–1292, 2018.
- I. Koch. Analysis of multivariate and high-dimensional data. Cambridge University Press, New York, 2014.
- V. Koltchinskii and E. Giné. Random matrix approximation of spectra of integral operators. Bernoulli, 6(1):113–167, 2000.
- V. Koltchinskii, M. Löffler, and R. Nickl. Efficient estimation of linear functionals of principal components. Ann. Statist., 48(1):464–490, 2020.
- V. Koltchinskii and K. Lounici. Concentration inequalities and moment bounds for sample covariance operators. Bernoulli, 23(1):110–133, 2017.
- S. Minakshisundaram and Å. Pleijel. Some properties of the eigenfunctions of the Laplace-operator on Riemannian manifolds. Canad. J. Math., 1:242–256, 1949.
- M. Reiss and M. Wahl. Nonasymptotic upper bounds for the reconstruction error of PCA. Ann. Statist., 48(2):1098–1123, 2020.
- J. W. Robbin and D. A. Salamon. Introduction to differential geometry. Springer Spektrum, Wiesbaden, 2022.
- L. Rosasco, M. Belkin, and E. De Vito. On learning with integral operators. J. Mach. Learn. Res., 11:905–934, 2010.
- S. Rosenberg. The Laplacian on a Riemannian manifold. Cambridge University Press, Cambridge, 1997.
- B. Schölkopf, A. Smola, and K.-R. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10(5):1299–1319, 1998.
- Z. Shi. Convergence of Laplacian spectra from random samples. Available at https://arxiv.org/abs/1507.00151.
- A. Singer. From graph to manifold Laplacian: the convergence rate. Appl. Comput. Harmon. Anal., 21(1):128–134, 2006.
- I. Steinwart and A. Christmann. Support vector machines. Springer, New York, 2008.
- J. A. Tropp. An introduction to matrix concentration inequalities. Found. Trends Mach. Learn., 8(1-2):1–230, 2015.
- U. von Luxburg. A tutorial on spectral clustering. Stat. Comput., 17(4):395–416, 2007.
- U. von Luxburg, M. Belkin, and O. Bousquet. Consistency of spectral clustering. Ann. Statist., 36(2):555–586, 2008.
- V. Q. Vu and J. Lei. Minimax sparse principal subspace estimation in high dimensions. Ann. Statist., 41(6):2905–2947, 2013.
- M. Wahl. On the perturbation series for eigenvalues and eigenprojections. Available at https://arxiv.org/abs/1910.08460.
- M. J. Wainwright. High-dimensional statistics. Cambridge University Press, Cambridge, 2019.