Local transfer learning from one data space to another (2302.00160v2)

Published 1 Feb 2023 in cs.LG and stat.ML

Abstract: A fundamental problem in manifold learning is to approximate a functional relationship from data chosen randomly from a probability distribution supported on a low-dimensional sub-manifold of a high-dimensional ambient Euclidean space. The manifold is essentially defined by the data set itself and is typically designed so that the data is dense on the manifold in some sense. The notion of a data space is an abstraction of a manifold that encapsulates the essential properties allowing for function approximation. The problem of transfer learning (meta-learning) is to use the learning of a function on one data set to learn a similar function on a new data set. In terms of function approximation, this means lifting a function on one data space (the base data space) to another (the target data space). This viewpoint enables us to connect some inverse problems in applied mathematics (such as the inverse Radon transform) with transfer learning. In this paper we examine the question of such lifting when the data is assumed to be known only on a part of the base data space. We are interested in determining the subsets of the target data space on which the lifting can be defined, and in how the local smoothness of the function and of its lifting are related.
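
As a rough schematic of the lifting problem (the notation here is illustrative, not taken from the paper): let $\mathbb{X}$ be the base data space, $\mathbb{Y}$ the target data space, and suppose a correspondence $\Phi : \mathbb{X} \to \mathbb{Y}$ relates the two. If $f$ is known only on a subset $U \subseteq \mathbb{X}$, a lift $F$ of $f$ is a function on a subset of $\mathbb{Y}$ satisfying
$$F(\Phi(x)) = f(x), \qquad x \in U,$$
so that $F$ is determined at least on $\Phi(U)$; the questions studied are on which subsets of $\mathbb{Y}$ such a lift can be defined, and how the local smoothness of $f$ on $U$ controls that of $F$. In the inverse Radon transform example, $f$ lives on the space of projection data and $F$ on the image domain, so lifting amounts to local reconstruction from partial data.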

