Tangent-Space Optimizer

Updated 6 August 2025
  • Tangent-Space Optimizer is a manifold-based method that combines local tangent space mapping with the Fréchet mean to yield scalable, geometrically faithful function approximations.
  • Its building block, the single tangent space model (STSM), maps data to a tangent space via the logarithmic map, performs regression there, and recovers predictions with the exponential map.
  • The method aggregates multiple local STSM predictions through a weighted Fréchet mean, significantly reducing computational overhead while retaining intrinsic geometry.

Approximating manifold-valued functions from data requires careful handling of the intrinsic geometry of the target manifold. The approach described in "Manifold-valued function approximation from multiple tangent spaces" (Wang et al., 17 Apr 2025) centers on leveraging multiple tangent spaces and the Riemannian Fréchet mean to provide scalable, accurate, and geometrically faithful function approximations. This synthesis details the key mathematical constructions, algorithms, and computational procedures underpinning the method, with a focus on its use of tangent-space pullbacks, the Fréchet mean, and their integration in both single and multiple tangent space models.

1. The Fréchet Mean on Riemannian Manifolds

The Fréchet mean, also referred to as the Karcher mean, generalizes the Euclidean (weighted) average to Riemannian manifolds. Given data points $y_1, \dots, y_d$ in a Riemannian manifold $M$ and weights $\Phi = (\phi_1, \dots, \phi_d)$ in the standard simplex $\Delta^{d-1}$, the Fréchet mean is defined as

$$\mathrm{avg}_M(Y, \Phi) = \underset{p \in M}{\arg\min}\ \sum_{i=1}^d \phi_i\, \mathrm{dist}_M(p, y_i)^2$$

where $\mathrm{dist}_M(\cdot, \cdot)$ denotes the manifold's geodesic distance function. Under the condition that all points $y_i$ are contained in a sufficiently small geodesic ball (with radius below a threshold determined by the injectivity radius and local sectional curvature), this objective is strictly convex, ensuring uniqueness of the Fréchet mean. This property is essential for well-posed aggregation of multiple manifold-valued estimates in subsequent methodologies.
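
As a concrete illustration, the sketch below computes a weighted Fréchet mean on the unit sphere by Riemannian gradient descent. The choice of the sphere, the unit step size, and the helper names (sphere_exp, sphere_log, frechet_mean) are illustrative assumptions; the construction itself applies to any Riemannian manifold with exponential and logarithmic maps.

```python
import numpy as np

# Example geometry: the unit sphere. Any manifold exposing exp/log maps and a
# geodesic distance could be substituted.

def sphere_exp(p, v):
    """Exponential map at p: follow the geodesic with initial velocity v."""
    nrm = np.linalg.norm(v)
    if nrm < 1e-12:
        return p
    return np.cos(nrm) * p + np.sin(nrm) * v / nrm

def sphere_log(p, q):
    """Logarithmic map at p: the tangent vector at p pointing towards q."""
    cos_theta = np.clip(p @ q, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-12:
        return np.zeros_like(p)
    u = q - cos_theta * p                      # component of q orthogonal to p
    return theta * u / np.linalg.norm(u)

def frechet_mean(Y, weights, n_iter=100, tol=1e-10):
    """Weighted Fréchet mean of the rows of Y via Riemannian gradient descent."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    p = Y[int(np.argmax(w))].copy()            # start at the most heavily weighted point
    for _ in range(n_iter):
        # Riemannian gradient of sum_i w_i dist(p, y_i)^2 is -2 sum_i w_i log_p(y_i)
        step = sum(wi * sphere_log(p, yi) for wi, yi in zip(w, Y))
        p = sphere_exp(p, step)
        if np.linalg.norm(step) < tol:
            break
    return p
```

With normalized weights and points confined to a small geodesic ball, the unit step size turns the gradient iteration into the standard fixed-point scheme for the Fréchet mean.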

2. Single Tangent Space Model (STSM) and Pullback Approximation

A foundational approach for manifold-valued function approximation is to choose a reference point $p_*$ on $M$ and work in the tangent space $T_{p_*}M$. Function values $y_i = f(x_i)$ are mapped to $T_{p_*}M$ using the logarithmic map, $v_i = \log_{p_*}(y_i)$. A regression or interpolation is then performed in the vector space $T_{p_*}M$ using the input–output pairs $(x_i, v_i)$. The resulting prediction $g(x) \in T_{p_*}M$ for an unseen input $x$ is mapped back to the manifold by the exponential map,

$$\hat{f}(x) = \exp_{p_*}(g(x))$$

This scheme is efficient and exploits the manifold's local linearity, but its validity and accuracy degrade as data move farther from the base point $p_*$.
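
Building on the sphere helpers from the previous sketch, the STSM pipeline can be written compactly. The affine least-squares regressor standing in for $g$ is one simple choice of tangent-space model and, like the names fit_stsm and predict_stsm, an assumption made for illustration.

```python
import numpy as np

def fit_stsm(X, Y, p_star):
    """Fit a single tangent space model at base point p_star.
    X: (n, m) array of inputs; Y: (n, dim) array of manifold-valued outputs."""
    V = np.array([sphere_log(p_star, y) for y in Y])   # pull outputs into T_{p*}M
    A = np.hstack([X, np.ones((X.shape[0], 1))])       # affine design matrix
    coeffs, *_ = np.linalg.lstsq(A, V, rcond=None)     # vector-space regression g
    return coeffs

def predict_stsm(coeffs, p_star, x):
    """Evaluate the STSM at x: regress in the tangent space, then map back with exp."""
    g_x = np.append(x, 1.0) @ coeffs                   # g(x) in T_{p*}M
    g_x = g_x - (g_x @ p_star) * p_star                # guard against numerical drift
    return sphere_exp(p_star, g_x)
```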

3. Riemannian Moving Least Squares (RMLS)

To address the limitations of single tangent space approximations, the Riemannian Moving Least Squares method replaces linear averaging with a weighted Fréchet mean. For each input $x$, local weights $\phi_i(x)$ are computed, typically via radial basis functions or other smooth kernels centered at the input locations. The RMLS approximant is implicitly defined as

$$(\mathcal{R}f)(x) = \underset{p \in M}{\arg\min}\ \sum_{i=1}^d \phi_i(x)\, \mathrm{dist}_M(p, y_i)^2$$

By averaging on $M$ itself, RMLS provides a bias-free, geometrically natural interpolant. However, the computational burden is significant: for each evaluation, a weighted Fréchet mean must be computed over the entire dataset.
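
A direct RMLS evaluation then reuses frechet_mean from the first sketch. The Gaussian kernel and its bandwidth are illustrative choices of weight function; the essential point is that every query triggers a Fréchet-mean optimization over all $d$ training outputs.

```python
import numpy as np

def rmls_evaluate(x, X, Y, bandwidth=1.0):
    """RMLS approximant at input x: Gaussian weights in input space, followed by
    a weighted Fréchet mean over ALL training outputs (the expensive step)."""
    phi = np.exp(-np.sum((X - x) ** 2, axis=1) / (2.0 * bandwidth ** 2))
    return frechet_mean(Y, phi)
```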

4. Multiple Tangent Space Model (MTSM): Model and Algorithm

The Multiple Tangent Space Model (MTSM) addresses the scalability issue of RMLS by localizing approximation to a small set of anchor points. The process is as follows:

  1. Anchor selection: Choose a small collection of anchor points $\{p_j^*\}$ on $M$.
  2. Local STSM fitting: For each anchor $p_j^*$, fit a Single Tangent Space Model with its local data, producing a family of predictions

$$\hat{f}_j(x) = \exp_{p_j^*}(g_j(x))$$

where $g_j(x)$ is the local regression in $T_{p_j^*}M$.

  3. Aggregation via Fréchet Mean: Instead of fusing all training outputs, only the STSM predictions for a given input $x$ are combined using the weighted Fréchet mean

$$(\mathcal{M}f)(x) = \underset{p \in M}{\arg\min}\ \sum_j \phi_j(\hat{f}_j(x))\, \mathrm{dist}_M(p, \hat{f}_j(x))^2$$

Here, $\phi_j$ are weights assigned to each anchor/model according to, e.g., the proximity of $x$ to the anchor's region of validity. This drastically reduces the number of points over which the mean must be computed from $d$ (the size of the dataset) to the number of models (often 2–3), enabling fast online evaluation. A sketch of this fusion step appears after this list.
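
A minimal sketch of the MTSM workflow, again on the sphere and reusing the helpers above: fit one STSM per anchor on its assigned data, then fuse the few local predictions with a weighted Fréchet mean. The nearest-anchor partition of the data, the proximity weights measured against a representative input per anchor, and the function names are all illustrative assumptions.

```python
import numpy as np

def fit_mtsm(X, Y, anchors, assignments):
    """Fit one STSM per anchor point p_j^*. assignments[j] lists the indices of
    the training pairs attributed to anchor j (e.g. by nearest-anchor clustering)."""
    return [fit_stsm(X[idx], Y[idx], p_j)
            for p_j, idx in zip(anchors, assignments)]

def mtsm_evaluate(x, models, anchors, anchor_inputs, bandwidth=1.0):
    """Evaluate every local STSM at x, then fuse the handful of predictions with
    a weighted Fréchet mean. Weights reflect how close x is to each anchor's
    region of validity, here via a representative input location per anchor."""
    preds = np.array([predict_stsm(c, p_j, x) for c, p_j in zip(models, anchors)])
    phi = np.exp(-np.sum((np.asarray(anchor_inputs) - x) ** 2, axis=1)
                 / (2.0 * bandwidth ** 2))
    return frechet_mean(preds, phi)    # mean over a few points, not the whole dataset
```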

5. Computational Aspects and Optimization Procedures

The Fréchet mean typically lacks a closed-form solution and is computed using iterative Riemannian optimization techniques. In practice, this optimization is performed by Riemannian gradient descent or trust-region methods, efficiently implemented in packages such as Manopt. In the MTSM and RMLS, these solvers are invoked at each evaluation or fusion step. The computational advantage of MTSM arises from restricting this operation to a small number of (STSM-based) predictions, allowing deployment in real-time or high-throughput applications. Uniqueness and stability of the mean are guaranteed as long as the involved points remain within a convex geodesic ball.
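
Since the whole construction touches the manifold only through its exponential map, logarithmic map, and distance, the Fréchet-mean solver can be written in a manifold-agnostic way and swapped for a library solver such as Manopt where available. The sketch below is a plain fixed-point/gradient-descent iteration with the maps passed in as callables; it is an assumption made for illustration, not the paper's implementation.

```python
import numpy as np

def frechet_mean_generic(points, weights, exp_map, log_map, n_iter=50, tol=1e-10):
    """Weighted Fréchet mean on an arbitrary manifold, given exp/log callables.
    In the MTSM this runs over only a handful of local predictions per query,
    which is what keeps online evaluation cheap relative to full-data RMLS."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    p = points[int(np.argmax(w))]
    for _ in range(n_iter):
        step = sum(wi * log_map(p, y) for wi, y in zip(w, points))
        p = exp_map(p, step)
        if np.linalg.norm(step) < tol:
            break
    return p
```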

6. Applications and Experimental Effectiveness

The described multiple tangent space scheme is especially effective in contexts where the manifold structure is pronounced and computational efficiency is critical. In the model problems from parametric model order reduction presented in the paper, the MTSM enables accurate low-dimensional approximations of system responses while respecting the nonlinear geometry intrinsic to the solution manifold. The hierarchical combination—STSM local prediction followed by Fréchet mean fusion—yields lower approximation error than single tangent methods, and improved scalability relative to direct RMLS.

7. Theoretical and Practical Significance

By integrating tangent-space pullback with local averaging via the Fréchet mean, the method unifies two key principles in manifold approximation: exploiting local linearity (for efficiency and scalability) and ensuring global geometric fidelity (via intrinsic averaging). The approach avoids the linearization bias inherent in single chart methods, while circumventing the computational bottleneck of full-sample RMLS. Its reliance on foundational Riemannian concepts ensures broad applicability to function learning tasks where outputs are manifold-valued and the preservation of intrinsic geometry is paramount.

References (1)