Learning Beyond Euclid: Curvature-Adaptive Generalization for Neural Networks on Manifolds (2507.02999v1)
Abstract: In this work, we develop new generalization bounds for neural networks trained on data supported on Riemannian manifolds. Existing generalization theories often rely on complexity measures derived from Euclidean geometry, which fail to account for the intrinsic structure of non-Euclidean spaces. Our analysis introduces a geometric refinement: we derive covering number bounds that explicitly incorporate manifold-specific properties such as sectional curvature, volume growth, and injectivity radius. These geometric corrections lead to sharper Rademacher complexity bounds for classes of Lipschitz neural networks defined on compact manifolds. The resulting generalization guarantees recover standard Euclidean results when curvature is zero but improve substantially in settings where the data lies on curved, low-dimensional manifolds embedded in high-dimensional ambient spaces. We illustrate the tightness of our bounds in negatively curved spaces, where the exponential volume growth leads to provably higher complexity, and in positively curved spaces, where the curvature acts as a regularizing factor. This framework provides a principled understanding of how intrinsic geometry affects learning capacity, offering both theoretical insight and practical implications for deep learning on structured data domains.
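As a sketch of the mechanism the abstract describes, the chain below combines two standard ingredients rather than the paper's own theorems: a comparison-geometry covering bound and Dudley's entropy integral. The notation is assumed for illustration only: $M$ is a compact $n$-manifold with sectional curvature at most $K$ and injectivity radius at least $\varepsilon/2$, $\mathcal{F}$ is a class of $L$-Lipschitz functions on $M$ bounded by $B$, and $\omega_{n-1}$ is the volume of the unit $(n-1)$-sphere.

$$
N(M,\varepsilon) \;\le\; \frac{\operatorname{vol}(M)}{V_K(\varepsilon/2)},
\qquad
V_K(r) \;=\; \omega_{n-1}\int_0^r s_K(t)^{\,n-1}\,dt,
$$

where $s_K(t)=\sinh(\sqrt{|K|}\,t)/\sqrt{|K|}$ for $K<0$, $s_K(t)=t$ for $K=0$, and $s_K(t)=\sin(\sqrt{K}\,t)/\sqrt{K}$ for $K>0$; Günther's inequality supplies the small-ball volume lower bound in the denominator. Combining a Kolmogorov–Tikhomirov-type entropy estimate for the Lipschitz class, $\log N(\mathcal{F},\|\cdot\|_\infty,\varepsilon) \lesssim N(M,\varepsilon/(2L))\,\log(B/\varepsilon)$, with Dudley's entropy integral then controls the empirical Rademacher complexity:

$$
\widehat{\mathfrak{R}}_n(\mathcal{F})
\;\lesssim\;
\inf_{\alpha>0}\left(
\alpha \;+\; \frac{1}{\sqrt{n}}\int_\alpha^{B}
\sqrt{\log N(\mathcal{F},\|\cdot\|_\infty,\varepsilon)}\;\,d\varepsilon
\right).
$$

This chain makes both regimes named in the abstract visible: for $K<0$ the model ball volume $V_K(r)$ grows like $e^{(n-1)\sqrt{|K|}\,r}$, so a manifold of diameter $D$ can pack on the order of $e^{(n-1)\sqrt{|K|}\,D}$ disjoint $\varepsilon$-balls, inflating the covering number and hence the complexity; for $K>0$ (with a matching lower curvature bound) Bishop's inequality caps $\operatorname{vol}(M)$ by the volume of the constant-curvature model sphere, which is the regularizing effect of positive curvature.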