Nonsmooth Nonparametric Regression via Fractional Laplacian Eigenmaps (2402.14985v1)

Published 22 Feb 2024 in math.ST, stat.ML, and stat.TH

Abstract: We develop nonparametric regression methods for the case when the true regression function is not necessarily smooth. More specifically, our approach uses the fractional Laplacian and is designed to handle the case when the true regression function lies in an $L_2$-fractional Sobolev space with order $s\in (0,1)$. This function class is a Hilbert space lying between the space of square-integrable functions and the first-order Sobolev space consisting of differentiable functions. It contains fractional power functions, piecewise constant or polynomial functions, and bump functions as canonical examples. For the proposed approach, we prove upper bounds on the in-sample mean-squared estimation error of order $n^{-\frac{2s}{2s+d}}$, where $d$ is the dimension, $s$ is the aforementioned order parameter and $n$ is the number of observations. We also provide preliminary empirical results validating the practical performance of the developed estimators.
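
As a rough illustration of the estimator family the abstract describes, the sketch below projects noisy responses onto the leading eigenvectors of a graph Laplacian built from the design points. The epsilon-graph construction, the unnormalized Laplacian, the projection estimator, and the heuristic cut-off K ≈ n^{d/(2s+d)} (suggested by the stated n^{-2s/(2s+d)} rate) are all assumptions made for illustration, not the paper's exact construction.

```python
import numpy as np
from scipy.spatial.distance import cdist

def graph_laplacian(X, eps):
    """Unnormalized epsilon-graph Laplacian (one common construction;
    the paper may use a different graph or normalization)."""
    D2 = cdist(X, X, metric="sqeuclidean")      # pairwise squared distances
    W = (D2 <= eps ** 2).astype(float)          # connect points within radius eps
    np.fill_diagonal(W, 0.0)                    # no self-loops
    return np.diag(W.sum(axis=1)) - W           # L = D - W

def eigenmap_regression(X, y, K, eps):
    """Least-squares fit of y on the first K Laplacian eigenvectors,
    i.e. an orthogonal projection onto the low-frequency span."""
    L = graph_laplacian(X, eps)
    _, V = np.linalg.eigh(L)                    # eigenvectors, eigenvalues ascending
    V_K = V[:, :K]
    return V_K @ (V_K.T @ y)

# Toy usage: a piecewise-constant (nonsmooth) regression function.
rng = np.random.default_rng(0)
n, d, s = 500, 1, 0.5
X = rng.uniform(size=(n, d))
f0 = (X[:, 0] > 0.5).astype(float)              # piecewise-constant truth
y = f0 + 0.1 * rng.standard_normal(n)
K = int(np.ceil(n ** (d / (2 * s + d))))        # heuristic from the n^{-2s/(2s+d)} rate
f_hat = eigenmap_regression(X, y, K, eps=0.1)
print("in-sample MSE:", np.mean((f_hat - f0) ** 2))
```

Keeping only the low-frequency eigenvectors acts as a spectral cut-off; in this sketch the fractional order s enters only through how many eigenvectors are retained, which is one simple way the stated rate can guide tuning.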
