
Spectral Beltrami Network (SBN)

Updated 9 February 2026
  • Spectral Beltrami Network (SBN) is a neural architecture that unifies spectral theory and deep learning to perform differentiable, bijective quasiconformal mappings.
  • It employs truncated spectral expansions of Beltrami coefficients using spherical harmonics and Laplacian eigenfunctions to reduce dimensionality and control smoothness.
  • The architecture integrates multiscale GNN layers with a U-Net framework, achieving state-of-the-art performance in geometry processing, medical imaging, and fluid dynamics simulations.

The Spectral Beltrami Network (SBN) is a neural architecture designed to provide a fully differentiable, spectrally informed solution to quasiconformal mapping and diffeomorphic surface registration problems, with rigorous control over conformal distortion and bijectivity. SBN unifies spectral theory, mesh-based geometry processing, and deep learning surrogates for variational partial differential equation (PDE) solvers, offering state-of-the-art performance for complex registration and parameterization tasks on planar, spherical, and cylindrical domains (Xu et al., 2 Feb 2026, Xu et al., 12 Nov 2025, Fré, 9 Dec 2025).

1. Mathematical Foundations: Beltrami Coefficient and Quasiconformal Mapping

SBN rests on the Beltrami equation, a fundamental tool for diffeomorphic mappings in complex geometry: $$\partial_{\bar z} f(z) = \mu(z)\, \partial_z f(z), \qquad \|\mu\|_\infty < 1,$$ where $\mu$ is the Beltrami coefficient, a complex-valued field measuring local conformal distortion. Surface parameterizations and self-maps constrained by $\mu$ can control angular distortion and local scale change while enforcing bijectivity.
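As a concrete illustration of the definition above, the following NumPy sketch (not from the cited papers) estimates $\mu = \partial_{\bar z} f / \partial_z f$ for a simple planar map via finite differences; the test map and grid size are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: recover the Beltrami coefficient of a planar map on a grid
# using finite-difference Wirtinger derivatives.
n = 64
x = np.linspace(-1.0, 1.0, n)
X, Yc = np.meshgrid(x, x)               # default 'xy' indexing: rows vary in y
Z = X + 1j * Yc

# Affine stretch f(z) = z + 0.3 * conj(z); its exact Beltrami coefficient is 0.3.
f = Z + 0.3 * np.conj(Z)

h = x[1] - x[0]
f_y, f_x = np.gradient(f, h, h)         # axis 0 = y, axis 1 = x
f_z = 0.5 * (f_x - 1j * f_y)            # Wirtinger derivative d/dz
f_zbar = 0.5 * (f_x + 1j * f_y)         # Wirtinger derivative d/dzbar
mu = f_zbar / f_z

# |mu| < 1 everywhere, so the map is orientation-preserving (quasiconformal).
print(np.abs(mu).max())                 # 0.3 up to floating-point error
```

Since $f$ is affine, the finite differences are exact and the recovered field is the constant $0.3$ everywhere.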

The Least-Squares Quasiconformal (LSQC) energy formalizes this into a variational problem: $$E_{\rm LSQC}(f, \mu) = \frac{1}{2} \int_\Omega \| P \nabla u + J P \nabla v \|^2 \, dx \, dy,$$ with $f = (u, v)$, $P$ an explicit function of $\mu$, and $J$ the standard symplectic matrix. The minimizer, subject to pinning two points, yields a unique, similarity-invariant mapping provided $\|\mu\|_\infty < 1$ (Xu et al., 12 Nov 2025).

On genus-0 (spherical) surfaces, a canonical two-chart stereographic atlas is used: the "north" chart $P_N$ and "south" chart $P_S$, each with its own Beltrami field $(\mu_N, \mu_S)$, related by a transformation on their overlap. The Measurable Riemann Mapping Theorem ensures that every such pair corresponds to a unique homeomorphism $f^{\boldsymbol{\mu}}: \mathbb{S}^2 \to \mathbb{S}^2$ up to Möbius post-composition (Xu et al., 2 Feb 2026).

2. Spectral Parametrization of Beltrami Fields

SBN encodes the Beltrami coefficient using a truncated spectral expansion. On smooth spheres or disks, the Beltrami field is expanded in spherical harmonics or Laplace–Beltrami mesh eigenfunctions. Explicitly, for the sphere: $$\mu(\theta, \phi) = \sum_{n=0}^{N} \sum_{m=-n}^{n} c_{nm}\, Y_n^m(\theta, \phi),$$ where $Y_n^m$ are spherical harmonics. For planar or surface meshes, the first $K$ nonzero cotangent-Laplacian eigenfunctions $\{\varphi_k\}$ provide the basis. For cylindrical or tubular domains appearing in fluid dynamics, SBN leverages complete orthogonal sets of vector-valued eigenmodes (e.g., Bessel/trigonometric functions for axisymmetric flows), decomposing fields into Beltrami, anti-Beltrami, and closed harmonic forms (Fré, 9 Dec 2025).

Truncating the spectral expansion reduces parameter dimensionality, allows for smoothness control, and enables efficient neural optimization in the latent coefficient space.
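The truncated expansion above can be sketched numerically. The following is a hedged illustration (not the papers' code): it synthesizes a smooth Beltrami field on the sphere from random spherical-harmonic coefficients with an assumed polynomial decay, then squashes the result so that $\|\mu\|_\infty < 1$.

```python
import numpy as np

# SciPy changed its spherical-harmonic API in 1.15; support both signatures.
try:
    from scipy.special import sph_harm_y
    def Y(m, n, az, polar):              # sph_harm_y signature: (n, m, polar, azimuth)
        return sph_harm_y(n, m, polar, az)
except ImportError:
    from scipy.special import sph_harm
    def Y(m, n, az, polar):              # sph_harm signature: (m, n, azimuth, polar)
        return sph_harm(m, n, az, polar)

N = 8                                    # truncation degree (illustrative; small)
rng = np.random.default_rng(0)

az = np.linspace(0, 2 * np.pi, 64)       # azimuthal angle
polar = np.linspace(0, np.pi, 32)        # polar angle
A, P = np.meshgrid(az, polar)            # grids of shape (32, 64)

mu = np.zeros_like(A, dtype=complex)
for n in range(N + 1):
    decay = 1.0 / (1.0 + n) ** 2         # spectral decay controls smoothness
    for m in range(-n, n + 1):
        c = decay * (rng.standard_normal() + 1j * rng.standard_normal())
        mu += c * Y(m, n, A, P)

# Soft bound ||mu||_inf < 1, mirroring the tanh parameterization of Section 4.
mu = np.tanh(np.abs(mu)) * np.exp(1j * np.angle(mu))
```

The $1/(1+n)^2$ decay weights are an assumption chosen to illustrate how truncation plus coefficient decay yields a smooth, low-dimensional parameterization.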

3. SBN Neural Architecture and Differentiable Surrogate

The core SBN architecture builds a neural surrogate for the LSQC variational solution:

  • Input: Discrete Beltrami field $\{\mu_v\}_{v\in V}$ at mesh vertices or spectral coefficients $\{c_{nm}\}$, and two "pinned" points to fix the conformal freedom.
  • Spectral layers: Project features into mesh eigenbases $\Phi$, with spectral mixing by trainable tensors $R$. Each spectral band (low, medium, high frequency) uses separate filters.
  • Multiscale message passing: Hierarchical Graph Neural Network (GNN) layers operating over multiple mesh resolutions, combined with pooling/unpooling, propagate local and global geometric context.
  • U-Net hierarchy: Deep hierarchical architecture alternates spectral and message-passing modules across resolutions.
  • Output: Approximates the planar (or spherical/stereographic) quasiconformal map $f$ satisfying the discrete LSQC equation and pinning constraints.

The network is trained end-to-end by minimizing the LSQC loss, vertex/edge/region alignment terms, and regularization losses (angle distortion, smoothness, and seam consistency for two-chart representations) (Xu et al., 2 Feb 2026, Xu et al., 12 Nov 2025). The LSQC loss implements a differentiable, locally-resolved surrogate to the classical solver, suitable for back-propagation in learning or optimization contexts.
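The spectral-mixing step in the architecture above can be sketched as follows. This is a hedged NumPy stand-in, not the actual implementation: a random orthonormal basis replaces the mesh eigenbasis $\Phi$, and the per-band filters $R$ are untrained random matrices; only the names $\Phi$ and $R$ follow the text.

```python
import numpy as np

# Hedged sketch of one spectral-layer forward pass: project vertex features into
# an eigenbasis, mix each frequency band with its own filter, project back.
rng = np.random.default_rng(1)
V, K, d = 200, 30, 8                                 # vertices, eigenmodes, channels

Phi = np.linalg.qr(rng.standard_normal((V, K)))[0]   # stand-in orthonormal eigenbasis
X = rng.standard_normal((V, d))                      # per-vertex input features

# Split eigenmodes into low / medium / high frequency bands, per the text.
bands = np.array_split(np.arange(K), 3)
X_hat = Phi.T @ X                                    # into the spectral domain
for idx in bands:
    R_b = rng.standard_normal((d, d)) / np.sqrt(d)   # per-band spectral mixing tensor
    X_hat[idx] = X_hat[idx] @ R_b                    # separate filter per band
Y = Phi @ X_hat                                      # back to the vertex domain

print(Y.shape)                                       # (200, 8)
```

In the full architecture this operation alternates with the multiscale GNN message passing; here it is isolated to show the band-wise filtering structure.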

4. Optimization Frameworks: BOOST and SBN-Opt

SBN enables plug-in optimization frameworks for task-driven diffeomorphic mapping via gradient descent over Beltrami parameters:

  • BOOST (Beltrami Optimization On Stereographic Two-charts) targets genus-0 spherical parameterization by optimizing two Beltrami fields on overlapping hemispheres. Constraints include seam consistency, folding penalties (Jacobian positivity), and global coordinate alignment. Loss terms aggregate landmark alignment, intensity (feature) alignment, and distortion regularization (Xu et al., 2 Feb 2026).
  • SBN-Opt provides a general scheme for free-boundary diffeomorphism problems with unconstrained domain boundaries. It iteratively adjusts the Beltrami field, pinning locations, and post-similarity transformation to minimize task-specific objectives (e.g., density-equalizing measures, region correspondence, geometric feature alignment) with explicit angle distortion and smoothness regularization (Xu et al., 12 Nov 2025).

Both frameworks use the pretrained SBN as a frozen differentiable forward map; parameters are optimized via Adam over roughly 200–500 iterations. The bijectivity constraint $\|\mu\|_\infty < 1$ is softly enforced by parameterizing $\mu = \tanh(\tilde{\mu})$ or by introducing a log-barrier.
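The $\mu = \tanh(\tilde{\mu})$ parameterization can be sketched with a toy problem. This is a hedged illustration only: a quadratic objective stands in for the task loss, and plain gradient descent stands in for Adam; the point is that gradients flow through the unconstrained $\tilde{\mu}$ while the bound $|\mu| < 1$ holds automatically.

```python
import numpy as np

# Hedged toy sketch of optimizing a tanh-parameterized Beltrami field.
rng = np.random.default_rng(2)
mu_tilde = rng.standard_normal(16)       # unconstrained latent Beltrami parameters
target = 0.4 * np.ones(16)               # toy target field with |target| < 1

lr = 0.1
for _ in range(500):
    mu = np.tanh(mu_tilde)
    grad_mu = 2 * (mu - target)                  # d/dmu of ||mu - target||^2
    mu_tilde -= lr * grad_mu * (1 - mu ** 2)     # chain rule through tanh

# The bound |mu| < 1 is structural: it holds at every iterate, not just at convergence.
print(np.abs(np.tanh(mu_tilde) - target).max())  # small residual
```

In the real frameworks the objective is the composition of task losses with the frozen SBN forward map; the parameterization trick is the same.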

5. Domain-Specific Extensions: Fluid Dynamics and Spectral PINNs

For axisymmetric fluid-dynamic problems, SBN can be instantiated as a spectral PINN (Physics-Informed Neural Network), with spectral coefficients as its primary variables (Fré, 9 Dec 2025). The domain is discretized with respect to cylindrical harmonics (Bessel/trigonometric modes), and the time-dependent Navier–Stokes system is projected mode-by-mode, yielding a coupled quadratic ODE hierarchy. The neural architecture outputs time-dependent coefficient trajectories, while the physics layer computes residuals of the PDE in the spectral basis. Training loss aggregates PDE residuals, initial/boundary conditions, and spectral regularization.

Pre-computation of mode coupling tensors (structure constants), exact inner products, and eigenvalues offers efficiency gains as nonlinearity evaluation is confined to spectral space. This paradigm drastically reduces computational complexity for highly symmetric or spectrally sparse flows, extending naturally to general bounded domains through basis adaptation.
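The resulting coupled quadratic ODE hierarchy can be sketched in miniature. This is a hedged illustration: random stand-ins replace the precomputed Bessel/trigonometric eigenvalues and structure constants, and explicit Euler replaces whatever integrator or PINN residual scheme is actually used.

```python
import numpy as np

# Hedged sketch: after Galerkin projection, each mode obeys
#   da_k/dt = -nu * lam_k * a_k + sum_ij C[k,i,j] * a_i * a_j,
# with lam_k and the coupling tensor C precomputed once.
rng = np.random.default_rng(3)
K, nu, dt = 6, 1.0, 1e-3

lam = np.arange(1, K + 1) ** 2           # Laplacian-like spectrum (stand-in)
C = 0.05 * rng.standard_normal((K, K, K))  # stand-in mode-coupling structure constants
a = rng.standard_normal(K)               # initial spectral coefficients
a0 = a.copy()

def rhs(a):
    # Nonlinearity is evaluated entirely in spectral space via the tensor C.
    return -nu * lam * a + np.einsum("kij,i,j->k", C, a, a)

for _ in range(2000):                    # explicit Euler time stepping to t = 2
    a = a + dt * rhs(a)

print(np.linalg.norm(a), np.linalg.norm(a0))
```

With these dissipation-dominated stand-in coefficients, the modal energy decays; the structural point is that the entire time integration touches only the $K$ coefficients, never a spatial grid.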

6. Empirical Results and Performance Analysis

SBN and its frameworks yield state-of-the-art empirical outcomes across geometry processing, medical imaging, and graphics:

| Task | Metric | SBN / BOOST | Baseline |
|---|---|---|---|
| Landmark registration (cortex) | Chamfer error | 0.0005–0.004 | 0.0017–0.0083 (FLASH) |
| Angle distortion | $\|\mu\|$ | $\approx 0.10$ | $\approx 0.12$ |
| Sulcal depth | Pearson's $r$ | $\approx 0.884$ | |
| Parcellation | Dice similarity | $\approx 0.968$ | |
| Density-equalizing parameterization | Variance reduction $\Delta\rho$ | $> 98\%$ | |
| Inconsistent registration | Avg. $|\mu|$ reduction | $-11\%$ to $-39\%$ | |
| Bijectivity | Jacobian foldings | 0 | |

All mappings maintain bijectivity (zero foldings), biologically plausible surface deformations, and concentrated distortion near features of interest. In ablation studies, removal of spectral layers or message-passing networks doubles error, confirming the synergistic roles of local and global operators (Xu et al., 12 Nov 2025, Xu et al., 2 Feb 2026).

7. Implementation Practices, Hyperparameters, and Limitations

Best practices identified for SBN training and deployment:

  • Mesh preparation: Uniform triangulations (e.g., via DistMesh), avoidance of degenerate faces, and area normalization.
  • Spectral computation: Efficient eigenpair computation via Lanczos/ARPACK, sparse storage of spectral transforms, and caching repeated operations.
  • Hyperparameters: Typical $K = 100$–$200$ eigenmodes, latent dimension $d = 24$, three resolution levels with progressively more message-passing iterations, batch training (size 8), and learning rates around $3\times 10^{-4}$.
  • Scaling: Spectral truncation scales as $k = O(\log N)$ for large meshes. GPU batching is used for efficiency.
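The eigenpair precomputation in the list above might look like the following sketch, where a path-graph Laplacian stands in for a real mesh's cotangent Laplacian; the Lanczos iteration is provided by ARPACK through SciPy.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Hedged sketch: K smallest Laplacian eigenpairs via shift-invert Lanczos (ARPACK).
n, K = 500, 20
main = 2.0 * np.ones(n)
main[0] = main[-1] = 1.0                 # path-graph Laplacian (stand-in for cotangent L)
L = sp.diags([-np.ones(n - 1), main, -np.ones(n - 1)], [-1, 0, 1], format="csc")

# sigma just below zero makes L - sigma*I positive definite and factorizable,
# and shift-invert mode then returns the eigenvalues nearest sigma, i.e. the smallest.
vals, vecs = eigsh(L, k=K, sigma=-0.01, which="LM")

print(vals[0])                           # ~0: the constant vector spans the kernel
```

Caching `vals`/`vecs` (sparse storage of the transforms) amortizes this cost across all subsequent spectral projections, as the best-practice list suggests.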

Limitations include reliance on high-quality mesh discretizations for spectral accuracy, sensitivity to the choice of pinning points (a 15% error increase if they are omitted), and the need for careful regularization in high-frequency bands to avoid spectral blocking.

8. Significance and Outlook

SBN establishes a bridge between rigorous geometric analysis (Beltrami theory, spectral PDEs) and gradient-based learning frameworks. By rendering quasiconformal theory differentiable, it opens new avenues for task-driven, distortion-controlled surface mapping with explicit bijection guarantees. The spectral PINN variant further extends SBN to time-dependent PDEs in physics, especially for symmetric flows where modal sparsity can be exploited.

A plausible implication is the broader adoption of SBN-like architectures in geometric deep learning, scientific computing, and medical image analysis, where diffeomorphic, low-distortion mappings are essential (Xu et al., 2 Feb 2026, Fré, 9 Dec 2025, Xu et al., 12 Nov 2025).
