
GeometricKernels: Geometric Kernel Library

Updated 23 February 2026
  • GeometricKernels is a modular library offering implementations of heat and Matérn kernels for diverse non-Euclidean domains including manifolds, meshes, and graphs.
  • It leverages spectral expansion and Fourier feature approximations to enable efficient Gaussian process regression, interpolation, and uncertainty quantification on structured data.
  • The package supports automatic PSD-checking, a backend-agnostic API design, and scalable numerical methods, facilitating advanced geometric learning and operator approximations.

The GeometricKernels package is a modular, backend-agnostic software library for constructing, evaluating, and analyzing kernel functions on non-Euclidean geometric data domains—including Riemannian manifolds, discrete meshes, and general graphs. It systematically unifies classical and modern geometric kernel constructions, enabling principled and scalable Gaussian process (GP) modeling, interpolation, regression, and numerical geometry on structured data, with rigorous attention to operator-theoretic, numerical, and statistical properties across a diverse range of settings (Mostowsky et al., 2024).

1. Mathematical Principles and Kernel Constructions

At the core, GeometricKernels provides implementations of heat and Matérn kernels on arbitrary geometric domains. Let $X$ be a set such as a compact manifold, mesh, or graph, and let $k : X \times X \to \mathbb{R}$ be a kernel. A function $k$ is positive semi-definite (PSD) if its Gram matrix $[k(x_i, x_j)]$ is PSD for every finite subset $\{x_i\} \subset X$; such kernels induce reproducing kernel Hilbert spaces (RKHSs) and serve as GP covariance functions.
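The PSD property can be sanity-checked numerically on any finite sample by inspecting the smallest eigenvalue of the Gram matrix. A minimal NumPy sketch (the Euclidean RBF kernel here is just a stand-in for any candidate kernel):

```python
import numpy as np

def is_psd(gram, tol=1e-10):
    """Check positive semi-definiteness via the smallest eigenvalue."""
    sym = (gram + gram.T) / 2  # guard against floating-point asymmetry
    return bool(np.linalg.eigvalsh(sym).min() >= -tol)

# Stand-in kernel: a Euclidean RBF, which is PSD on any finite point set.
rng = np.random.default_rng(0)
x = rng.standard_normal((50, 3))
sq_dists = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
gram = np.exp(-sq_dists / 2.0)
print(is_psd(gram))  # True
```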

Heat Kernel and Matérn Kernels

On a Riemannian manifold (or a discretization thereof), the heat kernel $P(t, x, x')$ solves the heat equation $\partial_t P(t, x, x') = \Delta_x P(t, x, x')$, where $\Delta_x$ is the Laplace–Beltrami operator. Evaluating at "diffusion time" $t = \kappa^2/2$ yields

$$k_{\infty,\kappa,\sigma^2}(x, x') = \frac{\sigma^2}{C_{\infty,\kappa}} P(\kappa^2/2, x, x'),$$

where $C_{\infty,\kappa}$ normalizes the marginal variance. For spaces with discrete Laplacian spectra, $P$ admits a spectral decomposition via eigenpairs $(\lambda_n, \phi_n)$:

$$k_{\infty,\kappa,\sigma^2}(x, x') = \frac{\sigma^2}{C_{\infty,\kappa}} \sum_{n=0}^\infty \exp\!\left(-\tfrac{\kappa^2}{2}\lambda_n\right) \phi_n(x)\, \phi_n(x').$$

The Matérn family is built as a gamma-mixture of heat kernels, indexed by a smoothness parameter $\nu > 0$:

$$k_{\nu,\kappa,\sigma^2}(x, x') = \frac{\sigma^2}{C_{\nu,\kappa}} \sum_{n=0}^\infty \Phi_{\nu,\kappa}(\lambda_n)\, \phi_n(x)\, \phi_n(x'),$$

with $\Phi_{\nu,\kappa}(\lambda) = (2\nu/\kappa^2)^{\nu} \left(\lambda + 2\nu/\kappa^2\right)^{-(\nu + d/2)}$, where $d$ is the dimension of the space, ensuring analyticity and the PSD property (Mostowsky et al., 2024).
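As a concrete instance of the spectral sum, the circle $S^1$ has closed-form Laplace–Beltrami eigenpairs (eigenvalue $n^2$ with eigenfunctions $\cos nt$, $\sin nt$), so the truncated Matérn series can be written out directly. A generic NumPy sketch (not the package's implementation; normalization is handled by dividing by $k(t, t)$):

```python
import numpy as np

def matern_phi(lam, nu, kappa, d):
    """Spectral weight Phi_{nu,kappa}(lam) = (2nu/kappa^2)^nu (lam + 2nu/kappa^2)^{-(nu + d/2)}."""
    a = 2 * nu / kappa**2
    return a**nu * (lam + a) ** (-(nu + d / 2))

def matern_circle(t1, t2, nu=1.5, kappa=1.0, n_max=100):
    """Truncated spectral Matern sum on S^1 (dimension d = 1)."""
    total = matern_phi(0.0, nu, kappa, 1)  # n = 0: constant eigenfunction
    for n in range(1, n_max + 1):
        # cos(n t1) cos(n t2) + sin(n t1) sin(n t2) = cos(n (t1 - t2))
        total += 2 * matern_phi(float(n**2), nu, kappa, 1) * np.cos(n * (t1 - t2))
    return total

k0 = matern_circle(0.0, 0.0)         # normalizing constant (marginal variance)
print(matern_circle(0.0, 0.5) / k0)  # in (0, 1), decays with geodesic distance
```

The kernel depends only on the geodesic distance $|t_1 - t_2|$, reflecting the stationarity of the Matérn family on homogeneous spaces.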

Variational Interpolation, Regression, and Level-Set Geometry

GeometricKernels provides kernel-based approaches for interpolation and regression in an RKHS, with Tikhonov regularization

$$J_\lambda[f] = \frac{1}{2}\,\|f\|_{H_K}^2 + \frac{1}{2\lambda} \sum_{i=1}^N \left(f(x_i) - y_i\right)^2,$$

where $H_K$ is the RKHS induced by $K$ and $\lambda$ is the regularization parameter. The solution admits a representer-theorem expansion, leading to efficient algorithms that reconstruct hypersurfaces and compute geometric invariants (normals, curvatures, Laplace–Beltrami evaluations) directly from scattered data using level-set methods (Guidotti, 6 Feb 2026).
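By the representer theorem, the minimizer of $J_\lambda$ has the form $f_\lambda(\cdot) = \sum_i \alpha_i K(x_i, \cdot)$, with coefficients obtained from the single linear solve $(K + \lambda I)\alpha = y$. A minimal NumPy sketch with a Euclidean Gaussian kernel as a stand-in (lengthscale and data are illustrative):

```python
import numpy as np

def fit_kernel_ridge(x_train, y_train, lam, lengthscale=0.5):
    """Minimize J_lambda: solve (K + lam*I) alpha = y, return f(x) = k(x, X) @ alpha."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * lengthscale**2))
    K = k(x_train, x_train)
    alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    return lambda x_new: k(x_new, x_train) @ alpha

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, (40, 1))
y = np.sin(3 * x[:, 0]) + 0.05 * rng.standard_normal(40)
f = fit_kernel_ridge(x, y, lam=1e-3)
print(f(np.array([[0.2]])))  # close to sin(0.6)
```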

Local Kernel Framework and Metric Learning

The package's theoretical foundation includes local kernels, which, under appropriate normalization and moment expansion, discretely approximate generators of Itô processes and (symmetrically) the Laplace–Beltrami operator of an induced Riemannian geometry. This structure enables the construction of kernels that are conformally or diffeomorphism invariant and that approximate arbitrary target metrics (Berry et al., 2014).

Geodesic Distance, Flatness, and Positive-Definiteness

The design of kernels based on geodesic distances is governed by deep geometric constraints. A geodesic Gaussian kernel $k_G(x, y) = \exp(-d(x, y)^2/\sigma^2)$ is PSD for all bandwidths if and only if the underlying metric space is flat in the sense of Alexandrov (i.e., isometric to a Euclidean space). In contrast, the geodesic Laplacian kernel $k_L(x, y) = \exp(-d(x, y)/\lambda)$ is PSD on spaces whose distance is conditionally negative definite (CND), such as spheres and hyperbolic spaces (Feragen et al., 2014).
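This dichotomy is easy to verify empirically. On $S^2$ the great-circle distance is CND, so the geodesic Laplacian kernel's Gram matrix is PSD for any point set. A generic NumPy sketch (not library code; the bandwidth 0.7 is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.standard_normal((60, 3))
p /= np.linalg.norm(p, axis=1, keepdims=True)  # random points on S^2

# Great-circle (geodesic) distance matrix.
d = np.arccos(np.clip(p @ p.T, -1.0, 1.0))

# Geodesic Laplacian kernel: PSD because d is CND on the sphere.
k_laplace = np.exp(-d / 0.7)
min_eig = np.linalg.eigvalsh((k_laplace + k_laplace.T) / 2).min()
print(min_eig)  # non-negative up to floating-point error
```

Running the same check with $\exp(-d^2/\sigma^2)$ can produce negative eigenvalues for some bandwidths, which is exactly the obstruction the flatness theorem describes.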

2. Numerical Methods and Feature Approximations

GeometricKernels implements two main strategies for kernel evaluation on discrete or large-scale data:

  • Spectral Expansion: For domains with discrete Laplace spectra (compact manifolds, meshes, graphs), it computes leading eigenpairs of the Laplacian using sparse linear algebra and evaluates kernels by truncating the spectral sum. For operator-theoretic consistency, all kernels are evaluated directly in terms of the eigenbasis.
  • Fourier-/Spectral-Feature Approximations: For large $N$ or noncompact spaces, the package employs random sampling of Laplacian eigenfunctions (weighted by the spectral profile) to construct finite-dimensional explicit feature maps $\phi : X \to \mathbb{R}^\ell$ such that $k(x, x') \approx \phi(x)^\top \phi(x')$. This reduces the cost of GP sampling and kernel-matrix multiplication from $O(N^3)$ to $O(N\ell)$ without sacrificing PSD properties (Mostowsky et al., 2024).

In discrete domains (meshes, graphs), Laplacians are formed as combinatorial or cotangent-weight matrices, and graph kernels follow the same spectral machinery. All algorithms are implemented with careful normalization, variable-bandwidth extensions, and data-driven density corrections for robustness and statistical consistency (Berry et al., 2014).
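For a concrete discrete example, the following generic NumPy sketch (not the package's internals) builds the combinatorial Laplacian $L = D - A$ of a small path graph, forms a truncated spectral Matérn kernel, and checks that the equivalent explicit feature map reproduces it; dropping the $d/2$ term from the exponent is one common convention on graphs:

```python
import numpy as np

# Combinatorial Laplacian L = D - A of a path graph on 8 nodes.
n = 8
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Eigenpairs of the Laplacian (dense here; sparse Lanczos at scale).
lam, phi = np.linalg.eigh(L)

# Matern spectral weights Phi_{nu,kappa}(lambda); d/2 term dropped on a graph.
nu, kappa = 1.5, 1.0
w = (2 * nu / kappa**2) ** nu * (lam + 2 * nu / kappa**2) ** (-nu)

# Spectral kernel matrix and the equivalent explicit feature map.
K = (phi * w) @ phi.T
features = phi * np.sqrt(w)  # rows are phi(x) in R^l, so K = features @ features.T
print(np.allclose(K, features @ features.T))  # True
```

The factorization makes the PSD property manifest: any Gram matrix of the form $\Phi\Phi^\top$ is automatically PSD.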

3. Supported Spaces, Kernels, and PD Conditions

GeometricKernels exposes a broad suite of domain and kernel classes, each rigorously checking positive-definiteness conditions when constructed. Supported spaces include:

  • Riemannian manifolds (e.g., spheres, hyperbolic spaces, special orthogonal groups),
  • Discrete meshes (vertex/face),
  • Graphs (adjacency-based),
  • Product and composite domains.

Kernels include heat, Matérn, geodesic Laplacian (on CND spaces), and product/composite variants. The software enforces that the geodesic Gaussian kernel is only offered for flat spaces (Euclidean, SPD matrices under log-Euclidean metric), explicitly raising exceptions otherwise. Laplacian kernels are made available only on CND metric domains, such as all spheres, hyperbolic spaces, and certain statistical manifolds (Feragen et al., 2014).

The package provides automatic PSD-checking of Gram matrices for user-supplied data, with best-practice warnings when indefinite results arise from geometric obstructions.

4. Software Architecture, API Design, and Backend Support

GeometricKernels is implemented atop a backend-dispatch layer (LAB), supporting NumPy, JAX, PyTorch, and TensorFlow transparently. Core software modules include:

  • spaces/: e.g., Hypersphere(dim), Hyperbolic(dim), Mesh(vertices,faces), Graph(adjacency)
  • kernels/: MaternGeometricKernel(space), ProductGeometricKernel(*kernels)
  • feature_maps/: ExactFeatureMap, RandomSpectralMap
  • sampling/: sample_gp_prior(kernel, params, x, rng)
  • frontends/: integration with GPyTorch, GPJax, GPflow for end-to-end GP modeling

Classes such as MaternGeometricKernel offer methods for hyperparameter initialization (init_params), kernel evaluation (K), and feature map generation (feature_map). Differentiation is delegated to the underlying array backend, so gradients through eigen-decompositions and feature samples are usable in arbitrary ML pipelines for hyperparameter optimization and GP marginal likelihood learning (Mostowsky et al., 2024).

5. Applications, Performance, and Scalability

Principal use cases for GeometricKernels include GP regression and Bayesian optimization on non-Euclidean domains, uncertainty quantification on scientific and robotic manifolds, and geometric machine learning for surfaces and networks. The package is benchmarked for:

  • Full spectral kernel evaluation up to $10^4$ mesh vertices and $10^5$ graph nodes,
  • Efficient batch and large-scale computations via feature approximations,
  • Kernel evaluation and sampling that scale linearly in the number of features $\ell$, and quadratically in the number of eigenpairs for full kernel matrices.

Eigen-computation is conducted either densely ($O(N^3)$) or via sparse Lanczos iteration ($O(Nk^2)$, where $k$ is the number of requested eigenpairs), with all backends supporting batched kernel evaluations (Mostowsky et al., 2024).

6. Extensions: Geometry Reconstruction, Operators, and Invariance

GeometricKernels generalizes standard kernel regression to unstructured data for geometry processing. Using kernel-based level-set representations, one obtains numerically robust surface reconstruction (as zero/one level sets of $f_\lambda$), tangent vectors, principal curvatures (via the Weingarten map), and meshfree approximations to the Laplace–Beltrami operator. The algorithms leverage explicit formulas for gradients, Hessians, and projections onto tangent spaces, enabling computation of geometric PDE solutions and manifold differential operators directly from scattered or noisy data (Guidotti, 6 Feb 2026).
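A toy illustration of the level-set idea, using a Euclidean Gaussian kernel as a stand-in (generic NumPy, not the package's implementation): fit $f_\lambda$ to scattered samples of the unit circle (value 0) plus its center (value 1), then recover the surface normal at a boundary point from the analytic gradient of the interpolant.

```python
import numpy as np

# Scattered level-set data: f = 0 on the unit circle, f = 1 at the center.
t = np.linspace(0, 2 * np.pi, 16, endpoint=False)
x = np.vstack([np.column_stack([np.cos(t), np.sin(t)]), [[0.0, 0.0]]])
y = np.array([0.0] * 16 + [1.0])

sigma, lam = 0.6, 1e-6

def k(a, b):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

alpha = np.linalg.solve(k(x, x) + lam * np.eye(len(x)), y)

def grad_f(p):
    """Analytic gradient of f(p) = sum_i alpha_i exp(-|x_i - p|^2 / (2 sigma^2))."""
    diff = x - p                                     # (N, 2)
    w = np.exp(-(diff**2).sum(-1) / (2 * sigma**2))
    return (alpha * w) @ diff / sigma**2

g = grad_f(np.array([1.0, 0.0]))
normal = g / np.linalg.norm(g)  # points inward, toward increasing f
print(normal)
```

By the symmetry of the sample, the recovered normal at $(1, 0)$ is approximately radial, matching the geometric normal of the circle.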

The local kernel theory allows for construction of kernels that are invariant to sampling density (conformal invariance) or global geometric maps (diffeomorphisms), and gives recipes for synthesizing kernels that induce arbitrary target metrics via choice of anisotropic covariances (Berry et al., 2014).

7. Historical and Theoretical Context

The design of GeometricKernels is informed by advances in the spectral theory of diffusion and heat kernels, local kernel theory, and geometry-aware machine learning.

The library coherently integrates operator theory, statistical learning, and computational geometry, representing the confluence of geometric data analysis, uncertainty quantification, and scalable machine learning for manifold-valued data.


References:

  • (Mostowsky et al., 2024) The GeometricKernels Package: Heat and Matérn Kernels for Geometric Learning on Manifolds, Meshes, and Graphs
  • (Guidotti, 6 Feb 2026) Geometric Kernel Interpolation and Regression
  • (Feragen et al., 2014) Geodesic Exponential Kernels: When Curvature and Linearity Conflict
  • (Berry et al., 2014) Local Kernels and the Geometric Structure of Data
