GeometricKernels: Geometric Kernel Library
- GeometricKernels is a modular library offering implementations of heat and Matérn kernels for diverse non-Euclidean domains including manifolds, meshes, and graphs.
- It leverages spectral expansion and Fourier feature approximations to enable efficient Gaussian process regression, interpolation, and uncertainty quantification on structured data.
- The package supports automatic PSD-checking, a backend-agnostic API design, and scalable numerical methods, facilitating advanced geometric learning and operator approximations.
The GeometricKernels package is a modular, backend-agnostic software library for constructing, evaluating, and analyzing kernel functions on non-Euclidean geometric data domains—including Riemannian manifolds, discrete meshes, and general graphs. It systematically unifies classical and modern geometric kernel constructions, enabling principled and scalable Gaussian process (GP) modeling, interpolation, regression, and numerical geometry on structured data, with rigorous attention to operator-theoretic, numerical, and statistical properties across a diverse range of settings (Mostowsky et al., 2024).
1. Mathematical Principles and Kernel Constructions
At its core, GeometricKernels provides implementations of heat and Matérn kernels on arbitrary geometric domains. Let $X$ be a set such as a compact manifold, mesh, or graph, and let $k : X \times X \to \mathbb{R}$ be a kernel. The function $k$ is positive semi-definite (PSD) if its Gram matrix $[k(x_i, x_j)]_{i,j=1}^n$ is PSD for every finite subset $\{x_1, \dots, x_n\} \subseteq X$; such kernels induce reproducing kernel Hilbert spaces (RKHSs) and serve as GP covariance functions.
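As a concrete illustration (a plain numpy sketch, not the library's API), PSD-ness on a finite point set can be checked by inspecting the smallest eigenvalue of the Gram matrix:

```python
import numpy as np

def is_psd(gram: np.ndarray, tol: float = 1e-10) -> bool:
    """Check positive semi-definiteness via the smallest eigenvalue."""
    # Symmetrize to guard against floating-point asymmetry.
    sym = 0.5 * (gram + gram.T)
    return float(np.linalg.eigvalsh(sym).min()) >= -tol

# Gaussian kernel on scattered points in R^2: PSD by Bochner's theorem.
rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))
sq_dists = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
gram = np.exp(-sq_dists / 2.0)
print(is_psd(gram))  # True
```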
Heat Kernel and Matérn Kernels
On a Riemannian manifold $M$ (or a discretization thereof), the heat kernel $k_t(x, y)$ solves the heat equation $\partial_t k_t = \Delta k_t$, where $\Delta$ is the Laplace–Beltrami operator. Evaluating the heat kernel at "diffusion time" $t$ yields the covariance
$$k(x, y) = \frac{\sigma^2}{C}\, k_t(x, y),$$
where $C$ normalizes the marginal variance. For spaces with a discrete Laplacian spectrum, $k_t$ admits a spectral decomposition via the eigenpairs $(\lambda_n, f_n)$ of $-\Delta$ as
$$k_t(x, y) = \sum_n e^{-\lambda_n t}\, f_n(x)\, f_n(y).$$
The Matérn family is built as a gamma-mixture of heat kernels, indexed by a smoothness parameter $\nu > 0$:
$$k_\nu(x, y) \propto \int_0^\infty t^{\nu - 1}\, e^{-\frac{2\nu}{\kappa^2} t}\, k_t(x, y)\, dt,$$
with length scale $\kappa > 0$, ensuring analytic and PSD properties (Mostowsky et al., 2024).
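On a discrete domain these constructions reduce to spectral filters of the combinatorial Laplacian $L = D - A$. A minimal numpy sketch on a small cycle graph (illustrative only; the library's implementation differs):

```python
import numpy as np

# Combinatorial Laplacian of a 5-cycle graph: L = D - A.
n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.diag(A.sum(1)) - A

lam, f = np.linalg.eigh(L)  # eigenpairs (lambda_n, f_n)

def heat_kernel(t):
    """k_t = sum_n exp(-lambda_n * t) f_n f_n^T (unnormalized)."""
    return (f * np.exp(-lam * t)) @ f.T

def matern_kernel(nu, kappa):
    """Graph Matern via the spectral filter (2*nu/kappa^2 + lambda_n)^(-nu)."""
    phi = (2 * nu / kappa**2 + lam) ** (-nu)
    return (f * phi) @ f.T

K_heat = heat_kernel(t=0.5)
K_matern = matern_kernel(nu=1.5, kappa=1.0)
# Both are symmetric and PSD by construction (nonnegative spectral weights).
print(np.linalg.eigvalsh(K_heat).min() >= -1e-10)
```

At $t = 0$ the heat kernel reduces to the identity on the nodes, since the eigenvectors are orthonormal.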
Variational Interpolation, Regression, and Level-Set Geometry
GeometricKernels provides kernel-based approaches for interpolation and regression in an RKHS, via Tikhonov-regularized least squares
$$\min_{f \in \mathcal{H}_k}\ \sum_{i=1}^n \big(f(x_i) - y_i\big)^2 + \lambda \|f\|_{\mathcal{H}_k}^2,$$
where $\mathcal{H}_k$ is the RKHS induced by $k$ and $\lambda > 0$ is the regularization parameter. By the representer theorem, the solution has the form $f^\star = \sum_{i=1}^n \alpha_i\, k(\cdot, x_i)$, which leads to efficient algorithms to reconstruct hypersurfaces and compute geometric invariants (normals, curvatures, Laplace–Beltrami evaluations) directly from scattered data using level-set methods (Guidotti, 6 Feb 2026).
Local Kernel Framework and Metric Learning
The package's theoretical foundation includes local kernels, which, under appropriate normalization and moment expansion, discretely approximate generators of Itô processes and (symmetrically) the Laplace–Beltrami operator of an induced Riemannian geometry. This structure enables the construction of kernels that are conformally or diffeomorphism invariant and that approximate arbitrary target metrics (Berry et al., 2014).
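The flavor of this approximation result can be reproduced numerically: on uniform samples of the unit circle, a Markov-normalized Gaussian local kernel yields a discrete operator whose small eigenvalues approach the Laplace–Beltrami spectrum $0, 1, 1, 4, 4, \dots$. A numpy sketch under the standard diffusion-maps normalization (the bandwidth and the $4/\varepsilon$ scaling are specific to the Gaussian kernel used here; this is not the library's code):

```python
import numpy as np

# Uniform samples on the unit circle embedded in R^2.
N, eps = 400, 0.01
theta = 2 * np.pi * np.arange(N) / N
pts = np.stack([np.cos(theta), np.sin(theta)], axis=1)

# Gaussian local kernel exp(-|x - y|^2 / eps).
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / eps)

# Symmetric Markov normalization; (I - D^{-1/2} K D^{-1/2}) * 4/eps
# discretely approximates the (positive) Laplace-Beltrami operator.
d = K.sum(1)
S = K / np.sqrt(np.outer(d, d))
Lap = (np.eye(N) - S) * (4.0 / eps)

evals = np.sort(np.linalg.eigvalsh(Lap))[:5]
print(evals)  # approx [0, 1, 1, 4, 4]
```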
Geodesic Distance, Flatness, and Positive-Definiteness
Designing kernels from geodesic distances is governed by deep geometric constraints. The geodesic Gaussian kernel $k(x, y) = \exp(-d(x, y)^2 / \sigma^2)$ is PSD for all bandwidths $\sigma > 0$ if and only if the underlying space is flat in the sense of Alexandrov (isometric to Euclidean space). In contrast, the geodesic Laplacian kernel $k(x, y) = \exp(-d(x, y) / \sigma)$ is PSD on spaces where the distance is conditionally negative definite (CND), such as spheres and hyperbolic spaces (Feragen et al., 2014).
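The obstruction is easy to witness numerically on the circle, whose geodesic (arc-length) distance is CND: the geodesic Laplacian kernel stays PSD, while the geodesic Gaussian kernel acquires negative Gram eigenvalues at large bandwidth (an illustrative sketch; the point count and bandwidth are arbitrary choices):

```python
import numpy as np

# 8 evenly spaced points on the unit circle; geodesic = arc-length distance.
n = 8
theta = 2 * np.pi * np.arange(n) / n
diff = np.abs(theta[:, None] - theta[None, :])
d = np.minimum(diff, 2 * np.pi - diff)

sigma = 3.0
K_gauss = np.exp(-d**2 / sigma**2)   # geodesic Gaussian kernel
K_lap = np.exp(-d / sigma)           # geodesic Laplacian kernel

print(np.linalg.eigvalsh(K_gauss).min())  # negative: not PSD on the circle
print(np.linalg.eigvalsh(K_lap).min())    # nonnegative: PSD (distance is CND)
```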
2. Numerical Methods and Feature Approximations
GeometricKernels implements two main strategies for kernel evaluation on discrete or large-scale data:
- Spectral Expansion: For domains with discrete Laplace spectra (compact manifolds, meshes, graphs), it computes leading eigenpairs of the Laplacian using sparse linear algebra and evaluates kernels by truncating the spectral sum. For operator-theoretic consistency, all kernels are evaluated directly in terms of the eigenbasis.
- Fourier-/Spectral-Feature Approximations: For large or noncompact spaces, the package employs random sampling of Laplacian eigenfunctions (weighted by the spectral profile) to construct finite-dimensional explicit feature maps $\phi$ such that $k(x, y) \approx \langle \phi(x), \phi(y) \rangle$. This reduces the complexity of GP sampling and kernel-matrix operations from cubic in the number of data points $n$ to linear in $n$ for $m \ll n$ features, without loss of PSD properties (Mostowsky et al., 2024).
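For a truncated spectral kernel the feature map can even be exact on the nodes: stacking $\phi_n(x) = \sqrt{s(\lambda_n)}\, f_n(x)$ over the retained eigenpairs gives $k \approx \Phi \Phi^\top$, with error bounded by the dropped spectral weight. A numpy sketch of the idea (not the package's ExactFeatureMap or RandomSpectralMap):

```python
import numpy as np

# Laplacian of a star graph on 6 nodes (eigenvalues 0, 1, 1, 1, 1, 6).
n = 6
A = np.zeros((n, n))
A[0, 1:] = A[1:, 0] = 1.0
L = np.diag(A.sum(1)) - A
lam, f = np.linalg.eigh(L)

def spectral_density(l, t=0.3):
    return np.exp(-t * l)  # heat-kernel spectral profile

m = 4  # keep the m smallest eigenpairs
Phi = f[:, :m] * np.sqrt(spectral_density(lam[:m]))  # feature map, n x m

K_feat = Phi @ Phi.T                        # k(x, y) = <phi(x), phi(y)>
K_full = (f * spectral_density(lam)) @ f.T  # untruncated spectral sum
print(np.abs(K_feat - K_full).max())  # bounded by the dropped spectral weight
```

$K_{\text{feat}}$ is PSD by construction, since it is a Gram matrix of explicit features.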
In discrete domains (meshes, graphs), Laplacians are formed as combinatorial or cotangent-weight matrices, and graph kernels follow the same spectral machinery. All algorithms are implemented with careful normalization, variable-bandwidth extensions, and data-driven density corrections for robustness and statistical consistency (Berry et al., 2014).
3. Supported Spaces, Kernels, and PD Conditions
GeometricKernels exposes a broad suite of domain and kernel classes, each rigorously checking positive-definiteness conditions when constructed. Supported spaces include:
- Riemannian manifolds (e.g., spheres, hyperbolic spaces, special orthogonal groups),
- Discrete meshes (vertex/face),
- Graphs (adjacency-based),
- Product and composite domains.
Kernels include heat, Matérn, geodesic Laplacian (on CND spaces), and product/composite variants. The software enforces that the geodesic Gaussian kernel is only offered for flat spaces (Euclidean, SPD matrices under log-Euclidean metric), explicitly raising exceptions otherwise. Laplacian kernels are made available only on CND metric domains, such as all spheres, hyperbolic spaces, and certain statistical manifolds (Feragen et al., 2014).
The package provides automatic PSD-checking of Gram matrices for user-supplied data, with best-practice warnings when indefinite results arise from geometric obstructions.
4. Software Architecture, API Design, and Backend Support
GeometricKernels is implemented atop a backend-dispatch layer (LAB), supporting NumPy, JAX, PyTorch, and TensorFlow transparently. Core software modules include:
- spaces/: e.g., Hypersphere(dim), Hyperbolic(dim), Mesh(vertices, faces), Graph(adjacency)
- kernels/: MaternGeometricKernel(space), ProductGeometricKernel(*kernels)
- feature_maps/: ExactFeatureMap, RandomSpectralMap
- sampling/: sample_gp_prior(kernel, params, x, rng)
- frontends/: integration with GPyTorch, GPJax, GPflow for end-to-end GP modeling
Classes such as MaternGeometricKernel offer methods for hyperparameter initialization (init_params), kernel evaluation (K), and feature map generation (feature_map). Differentiation is delegated to the underlying array backend, so gradients through eigen-decompositions and feature samples are usable in arbitrary ML pipelines for hyperparameter optimization and GP marginal likelihood learning (Mostowsky et al., 2024).
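Because kernels are plain backend arrays, gradients of the GP log marginal likelihood with respect to hyperparameters flow through the spectral construction. A backend-free numpy sketch of the quantity being differentiated, using central finite differences in place of autodiff (the graph, data, and noise level are made-up illustrative choices):

```python
import numpy as np

# Spectral Matern kernel on a 6-cycle graph, as in Section 1.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
lam, f = np.linalg.eigh(np.diag(A.sum(1)) - A)

def matern_K(kappa, nu=1.5):
    return (f * (2 * nu / kappa**2 + lam) ** (-nu)) @ f.T

def log_marginal_likelihood(kappa, y, noise=0.1):
    """GP log evidence: -1/2 y^T K^-1 y - 1/2 log|K| - n/2 log(2 pi)."""
    K = matern_K(kappa) + noise**2 * np.eye(n)
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * y @ np.linalg.solve(K, y) - 0.5 * logdet - 0.5 * n * np.log(2 * np.pi)

rng = np.random.default_rng(2)
y = rng.normal(size=n)

# Central finite-difference gradient w.r.t. the length scale kappa;
# an autodiff backend (JAX, PyTorch, TensorFlow) would supply this exactly.
h, kappa = 1e-5, 1.0
grad = (log_marginal_likelihood(kappa + h, y)
        - log_marginal_likelihood(kappa - h, y)) / (2 * h)
print(grad)
```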
5. Applications, Performance, and Scalability
Principal use cases for GeometricKernels include GP regression and Bayesian optimization on non-Euclidean domains, uncertainty quantification on scientific and robotic manifolds, and geometric machine learning for surfaces and networks. The package is benchmarked for:
- Full spectral kernel evaluation on large meshes and graphs,
- Efficient batch and large-scale computations via feature approximations,
- Kernel evaluation and sampling scaling linearly in the number of features, and quadratically in the number of eigenpairs for full kernel matrices.
Eigen-computation is conducted either densely (at $O(n^3)$ cost in the number of vertices or nodes $n$) or via sparse Lanczos iteration (at cost roughly proportional to the number of requested eigenpairs times the number of nonzero Laplacian entries), with all backends supporting batched kernel evaluations (Mostowsky et al., 2024).
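The two eigensolver routes agree on the leading spectrum. A small sketch comparing dense eigendecomposition with shift-invert Lanczos on a sparse path-graph Laplacian (illustrative; assumes scipy is available):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Combinatorial Laplacian of a path graph on n nodes (tridiagonal, sparse).
n = 200
main = np.full(n, 2.0)
main[0] = main[-1] = 1.0
L = sp.diags([main, -np.ones(n - 1), -np.ones(n - 1)], [0, -1, 1], format="csc")

# Dense route: O(n^3) full eigendecomposition.
dense_vals = np.linalg.eigvalsh(L.toarray())[:6]

# Sparse route: Lanczos (shift-invert around sigma) for the 6 smallest eigenvalues.
lanczos_vals = np.sort(eigsh(L, k=6, sigma=-0.01, return_eigenvectors=False))

print(np.max(np.abs(dense_vals - lanczos_vals)))  # agreement to solver tolerance
```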
6. Extensions: Geometry Reconstruction, Operators, and Invariance
GeometricKernels generalizes standard kernel regression to unstructured data for geometry processing. Using kernel-based level-set representations, one obtains numerically robust surface reconstruction (as zero or other level sets of the fitted kernel interpolant $f$), tangent vectors, principal curvatures (Weingarten map), and meshfree approximations to the Laplace–Beltrami operator. The algorithms leverage explicit formulas for gradients, Hessians, and projections onto tangent spaces, enabling computation of geometric PDE solutions and manifold differential operators directly from scattered or noisy data (Guidotti, 6 Feb 2026).
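The level-set idea can be sketched in a few lines: interpolate a scalar function that is negative inside, zero on, and positive outside the sampled surface, then read the surface off as the zero level set. A toy numpy sketch with a circle as the "surface" (the anchor values, bandwidth, and regularization are arbitrary illustrative choices, not the algorithm of (Guidotti, 6 Feb 2026)):

```python
import numpy as np

def gram(x, z, sigma=0.35):
    """Gaussian kernel Gram matrix between 2-D point sets x and z."""
    d2 = ((x[:, None, :] - z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

# Samples of the unit circle (zero level), plus inside/outside anchors.
t = 2 * np.pi * np.arange(32) / 32
circle = np.stack([np.cos(t), np.sin(t)], 1)
inside = np.zeros((1, 2))                                        # center: f = -1
outside = 2.0 * np.stack([np.cos(t[::4]), np.sin(t[::4])], 1)    # f = +1
X = np.vstack([circle, inside, outside])
y = np.concatenate([np.zeros(32), [-1.0], np.ones(8)])

# Regularized RKHS interpolant via the representer theorem.
alpha = np.linalg.solve(gram(X, X) + 1e-7 * np.eye(len(X)), y)

def level_fn(p):
    """Implicit function whose zero level set approximates the circle."""
    return gram(np.atleast_2d(p), X) @ alpha

# A held-out point on the circle should lie near the zero level set.
p = np.array([[np.cos(0.1), np.sin(0.1)]])
print(level_fn(p))  # near 0
```

Normals and curvatures would then come from the gradient and Hessian of the implicit function, evaluated analytically from the kernel expansion.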
The local kernel theory allows for construction of kernels that are invariant to sampling density (conformal invariance) or global geometric maps (diffeomorphisms), and gives recipes for synthesizing kernels that induce arbitrary target metrics via choice of anisotropic covariances (Berry et al., 2014).
7. Historical and Theoretical Context
The design of GeometricKernels is informed by advances in spectral theory of diffusion and heat kernels, local kernel theory, and geometry-aware machine learning:
- Diffusion maps and local kernel methods (Berry et al., 2014),
- Theory of positive-definiteness for geodesic exponential kernels (Feragen et al., 2014),
- Spectral Matérn and heat kernels for geometric GP modeling (Mostowsky et al., 2024),
- Meshfree geometry and regularized interpolation using RKHSs (Guidotti, 6 Feb 2026).
The library coherently integrates operator theory, statistical learning, and computational geometry, representing the confluence of geometric data analysis, uncertainty quantification, and scalable machine learning for manifold-valued data.
References:
- (Mostowsky et al., 2024) The GeometricKernels Package: Heat and Matérn Kernels for Geometric Learning on Manifolds, Meshes, and Graphs
- (Guidotti, 6 Feb 2026) Geometric Kernel Interpolation and Regression
- (Feragen et al., 2014) Geodesic Exponential Kernels: When Curvature and Linearity Conflict
- (Berry et al., 2014) Local Kernels and the Geometric Structure of Data