Least squares approximations in linear statistical inverse learning problems (2211.12121v3)

Published 22 Nov 2022 in math.ST, cs.NA, math.NA, and stat.TH

Abstract: Statistical inverse learning aims at recovering an unknown function $f$ from randomly scattered and possibly noisy point evaluations of another function $g$, connected to $f$ via an ill-posed mathematical model. In this paper we blend statistical inverse learning theory with the classical regularization strategy of applying finite-dimensional projections. Our key finding is that, by coupling the number of random point evaluations with the choice of projection dimension, one can derive probabilistic convergence rates for the reconstruction error of the maximum likelihood (ML) estimator. Convergence rates in expectation are derived for an ML estimator complemented by a norm-based cut-off operation. Moreover, we prove that the obtained rates are minimax optimal.
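The setting admits a small numerical illustration. In the sketch below the forward map $A$ is taken to be the integration operator $(Af)(x) = \int_0^x f(t)\,dt$, a standard mildly ill-posed example; this operator, the cosine basis, the noise level, and the cut-off radius $R$ are all illustrative assumptions, not choices made in the paper. Under Gaussian noise the ML estimator over an $m$-dimensional projection reduces to ordinary least squares, and the norm-based cut-off simply discards estimates whose coefficient norm exceeds a threshold.

```python
import numpy as np

# Minimal sketch of projected least squares for a statistical inverse
# learning problem. All concrete choices (integration operator, cosine
# basis, noise level, cut-off radius) are illustrative assumptions.
rng = np.random.default_rng(0)
n, m, sigma = 500, 8, 0.01       # sample size, projection dimension, noise level

f_true = lambda t: np.sin(2 * np.pi * t)                      # unknown f
g_true = lambda x: (1 - np.cos(2 * np.pi * x)) / (2 * np.pi)  # g = Af in closed form

x = rng.uniform(0.0, 1.0, size=n)               # randomly scattered design points
y = g_true(x) + sigma * rng.standard_normal(n)  # noisy point evaluations of g

# Images of the cosine basis phi_k(t) = cos(k*pi*t) under the forward map A:
# A phi_0 = x, and A phi_k(x) = sin(k*pi*x) / (k*pi) for k >= 1.
def A_phi(k, x):
    return x if k == 0 else np.sin(k * np.pi * x) / (k * np.pi)

Phi = np.column_stack([A_phi(k, x) for k in range(m)])  # n x m design matrix

# Under Gaussian noise the ML estimator is ordinary least squares.
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Norm-based cut-off: replace the estimate by zero if its norm is too large.
R = 10.0
if np.linalg.norm(coef) > R:
    coef = np.zeros_like(coef)

# Reconstruct f on a grid and report the error.
t = np.linspace(0.0, 1.0, 200)
f_hat = sum(c * np.cos(k * np.pi * t) for k, c in enumerate(coef))
print("RMS reconstruction error:", np.sqrt(np.mean((f_hat - f_true(t)) ** 2)))
```

In the paper the projection dimension $m$ is coupled to the number of evaluations $n$ so as to balance approximation error against stochastic error, which is what yields the minimax-optimal rates; the sketch keeps $m$ fixed for simplicity.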
