
Kernel manifolds: nonlinear-augmentation dimensionality reduction using reproducing kernel Hilbert spaces (2509.00224v1)

Published 29 Aug 2025 in cs.CE, cs.NA, and math.NA

Abstract: This paper generalizes recent advances on quadratic manifold (QM) dimensionality reduction by developing kernel methods-based nonlinear-augmentation dimensionality reduction. QMs, and more generally feature map-based nonlinear corrections, augment linear dimensionality reduction with a nonlinear correction term in the reconstruction map to overcome approximation accuracy limitations of purely linear approaches. While feature map-based approaches typically learn a least-squares optimal polynomial correction term, we generalize this approach by learning an optimal nonlinear correction from a user-defined reproducing kernel Hilbert space. Our approach allows one to impose arbitrary nonlinear structure on the correction term, including polynomial structure, and includes feature map and radial basis function-based corrections as special cases. Furthermore, our method has relatively low training cost and has monotonically decreasing error as the latent space dimension increases. We compare our approach to proper orthogonal decomposition and several recent QM approaches on data from several example problems.
