
Surrogate to Poincaré inequalities on manifolds for dimension reduction in nonlinear feature spaces

Published 3 May 2025 in math.NA, cs.LG, and cs.NA (arXiv:2505.01807v2)

Abstract: We aim to approximate a continuously differentiable function $u:\mathbb{R}^d \rightarrow \mathbb{R}$ by a composition of functions $f\circ g$, where $g:\mathbb{R}^d \rightarrow \mathbb{R}^m$, $m\leq d$, and $f : \mathbb{R}^m \rightarrow \mathbb{R}$ are built in a two-stage procedure. For a fixed $g$, we build $f$ using classical regression methods, involving evaluations of $u$. Recent works proposed to build a nonlinear $g$ by minimizing a loss function $\mathcal{J}(g)$ derived from Poincaré inequalities on manifolds, involving evaluations of the gradient of $u$. Minimizing $\mathcal{J}$, however, can be a challenging task. Hence, in this work we introduce new convex surrogates to $\mathcal{J}$. Leveraging concentration inequalities, we provide sub-optimality results for a class of functions $g$, including polynomials, and a wide class of input probability measures. We investigate performance on different benchmarks for various training sample sizes. We show that our approach outperforms standard iterative methods for minimizing the training Poincaré-inequality-based loss, often resulting in better approximation errors, especially for rather small training sets and $m=1$.
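The two-stage procedure described in the abstract can be illustrated in its simplest special case: a *linear* feature map $g(x) = Gx$ with $m=1$ (the paper's convex surrogates cover nonlinear, e.g. polynomial, $g$; the sketch below is not the authors' method). For linear $g$, minimizing a gradient-based Poincaré-type loss reduces to an eigenproblem on the expected gradient outer product $H = \mathbb{E}[\nabla u(X)\,\nabla u(X)^\top]$, whose dominant eigenvector gives the most informative direction; $f$ is then fit by ordinary regression on the reduced variable $z = g(x)$. All names and the toy target below are hypothetical.

```python
import numpy as np

# Stage 1 (simplified to linear g): estimate the dominant direction from
# gradient samples. Stage 2: regress f on the reduced variable z = g(x).
rng = np.random.default_rng(0)
d, m, n = 5, 1, 200

# Toy target: u(x) = sin(a . x) varies only along the direction a.
a = rng.standard_normal(d)
a /= np.linalg.norm(a)
u = lambda X: np.sin(X @ a)
grad_u = lambda X: np.cos(X @ a)[:, None] * a   # gradients, shape (n, d)

X = rng.standard_normal((n, d))                 # training inputs
H = grad_u(X).T @ grad_u(X) / n                 # empirical gradient second moment
eigvals, eigvecs = np.linalg.eigh(H)            # ascending eigenvalues
G = eigvecs[:, -m:].T                           # stage 1: top direction(s)

# Stage 2: fit f on z = g(x) by (here) one-dimensional polynomial regression.
z = (X @ G.T).ravel()
coeffs = np.polyfit(z, u(X), deg=7)
f = lambda t: np.polyval(coeffs, t)

# The recovered direction aligns with a (up to sign).
alignment = abs(G @ a)[0]
print(alignment)
```

In this idealized example $H$ has rank one, so the eigenvector recovers $a$ exactly up to noise; the paper's contribution is precisely that for *nonlinear* $g$ the analogous training loss is hard to minimize directly, motivating convex surrogates with sub-optimality guarantees.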

Authors (2)
