RBF-Lifted Signature Kernel
- RBF-lifted signature kernel is a universal, positive definite kernel that lifts continuous paths into a reproducing kernel Hilbert space using Gaussian RBF and signature transforms.
- It employs random Fourier feature approximations to efficiently scale the computation of signature kernels while ensuring uniform error bounds under sub-Gaussian assumptions.
- Variants like Diagonal Projection and Tensor Random Projection provide effective trade-offs, making the approach practical for large-scale time-series and sequence analysis.
The RBF-lifted signature kernel is a universal and characteristic positive definite kernel on the space of continuous paths, combining the expressivity of tensor algebra signatures with the nonlinear similarity afforded by the Gaussian radial basis function (RBF). It measures path similarity by lifting Euclidean increments into a reproducing kernel Hilbert space (RKHS), applying the signature transformation, and computing the Hilbert-Schmidt inner product in the resulting tensor algebra. Recent research demonstrates both explicit constructions and scalable random-feature approximations for this kernel, enabling efficient application to large-scale sequence and time-series analysis tasks (Toth et al., 2023, Piatti et al., 29 Dec 2025).
1. Mathematical Definition and Construction
Let $x, y : [0, T] \to \mathbb{R}^d$ be continuous paths of finite $1$-variation. The full signature of $x$ over $[s, t]$ is given by
$$S(x)_{[s,t]} = \big(1,\, S^1(x)_{[s,t]},\, S^2(x)_{[s,t]},\, \dots\big),$$
where
$$S^k(x)_{[s,t]} = \int_{s < u_1 < \cdots < u_k < t} dx_{u_1} \otimes \cdots \otimes dx_{u_k} \in (\mathbb{R}^d)^{\otimes k}.$$
For the RBF-lifted signature kernel,
- The static RBF kernel is $k(u, v) = \exp\!\big(-\|u - v\|^2 / (2\sigma^2)\big)$ for $u, v \in \mathbb{R}^d$.
- Its RKHS, $\mathcal{H}_k$, is isometric to a subspace of $L^2(\Lambda)$, with $\Lambda$ being the spectral measure associated to $k$ via Bochner's theorem, $k(u, v) = \int_{\mathbb{R}^d} e^{i \omega^\top (u - v)}\, d\Lambda(\omega)$.
- The feature map is $\varphi : \mathbb{R}^d \to \mathcal{H}_k$, $u \mapsto k(u, \cdot)$.
To construct the kernel,
- Lift the path pointwise into $\mathcal{H}_k$ via $t \mapsto \varphi(x_t)$,
- Compute the signature $S(\varphi \circ x)$ in the tensor algebra $T(\mathcal{H}_k) = \prod_{m \ge 0} \mathcal{H}_k^{\otimes m}$,
- The RBF-lifted signature kernel is
$$k_{\mathrm{sig}}^{\mathrm{RBF}}(x, y) = \big\langle S(\varphi \circ x),\, S(\varphi \circ y) \big\rangle = \sum_{m \ge 0} \big\langle S^m(\varphi \circ x),\, S^m(\varphi \circ y) \big\rangle_{\mathcal{H}_k^{\otimes m}}.$$
This construction is universal and characteristic on path space and possesses invariance and stability properties inherited from both the signature and the RBF kernel (Piatti et al., 29 Dec 2025).
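For discretely sampled paths, the inner products above can be evaluated through the kernel trick, since lifted increments only enter via finite differences of the static RBF Gram matrix. The following NumPy sketch illustrates this for a level-$M$ truncation; the function names (`rbf_gram`, `rbf_sig_kernel`) and the simple level-by-level recursion are illustrative choices, not an implementation from the cited papers.

```python
import numpy as np

def rbf_gram(X, Y, sigma=1.0):
    """Gaussian RBF Gram matrix between the sample points of two sequences."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def rbf_sig_kernel(x, y, level=4, sigma=1.0):
    """Truncated RBF-lifted signature kernel between two discretised paths.

    x: (L_x, d) array, y: (L_y, d) array.  Only inner products of lifted
    increments <phi(x_{i+1}) - phi(x_i), phi(y_{j+1}) - phi(y_j)> are needed,
    and these reduce to finite differences of the static Gram matrix.
    """
    K = rbf_gram(x, y, sigma)
    M = K[1:, 1:] - K[1:, :-1] - K[:-1, 1:] + K[:-1, :-1]
    A = M.copy()               # level-1 summands
    val = 1.0 + A.sum()        # levels 0 and 1
    for _ in range(2, level + 1):
        C = np.zeros_like(A)
        C[1:, 1:] = A[:-1, :-1].cumsum(axis=0).cumsum(axis=1)  # sums over strictly smaller indices
        A = M * C              # level-m summands
        val += A.sum()
    return val
```

For two toy paths `x = np.random.randn(50, 3)` and `y = np.random.randn(60, 3)`, `rbf_sig_kernel(x, y, level=4)` returns the truncated kernel value; the recursion costs $\mathcal{O}(L_x L_y M)$ per pair of paths, which is what motivates the random-feature approximations described next.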
2. Random Fourier Signature Feature Approximations
Direct computation of $k_{\mathrm{sig}}^{\mathrm{RBF}}$ is infeasible for long paths due to the exponential growth of tensor dimensions. Random Fourier feature-based acceleration replaces the static embedding $\varphi$ by randomized finite-dimensional mappings that approximate the inner products.
- Draw i.i.d. samples $\omega_1, \dots, \omega_{\tilde D} \sim \Lambda$ and phases $b_1, \dots, b_{\tilde D} \sim \mathrm{Unif}[0, 2\pi]$.
- The RFF map is $\tilde\varphi(u) = \sqrt{2/\tilde D}\, \big(\cos(\omega_i^\top u + b_i)\big)_{i=1}^{\tilde D}$, which satisfies $\mathbb{E}\big[\langle \tilde\varphi(u), \tilde\varphi(v) \rangle\big] = k(u, v)$.
- In signature computations, replace the kernel embeddings $\varphi(x_t)$ with $\tilde\varphi(x_t)$ at each step and truncate at signature level $M$.
The resultant random Fourier signature feature (RFSF) map collects, for each level $m \le M$, iterated sums of tensor products of lifted increments $\delta\tilde\varphi(x_i) = \tilde\varphi(x_{i+1}) - \tilde\varphi(x_i)$, using independent RFF draws for the different tensor slots:
$$\tilde\Phi_m(x) = \sum_{i_1 < \cdots < i_m} \delta\tilde\varphi^{(1)}(x_{i_1}) \otimes \cdots \otimes \delta\tilde\varphi^{(m)}(x_{i_m}).$$
Its inner product yields an unbiased estimator of the truncated RBF-lifted signature kernel: with $\tilde\Phi_{\le M}(x) = (\tilde\Phi_m(x))_{m=0}^{M}$, the expected value of $\big\langle \tilde\Phi_{\le M}(x), \tilde\Phi_{\le M}(y) \big\rangle$ is exactly the truncated signature kernel $k_{\mathrm{sig}, \le M}^{\mathrm{RBF}}(x, y)$ (Toth et al., 2023).
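A minimal NumPy sketch of this feature map is given below for a discretely sampled path. The function names (`rff_map`, `rfsf`), the cosine-plus-phase form of the RFF map, and the way fresh RFF draws are added one slot per level are illustrative assumptions rather than the reference implementation of Toth et al. (2023).

```python
import numpy as np

def rff_map(X, omegas, phases):
    """Random Fourier features for the Gaussian RBF kernel (cosine form)."""
    D = omegas.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ omegas + phases)

def rfsf(x, level, D, sigma=1.0, rng=None):
    """Vanilla Random Fourier Signature Features for one path x of shape (L, d).

    Returns flattened level-m features for m = 1..level.  A fresh RFF map is
    drawn for every new tensor slot so that products of estimators remain
    unbiased for the corresponding kernel terms.
    """
    rng = np.random.default_rng(rng)
    d = x.shape[1]
    feats, prev = [], None
    for m in range(1, level + 1):
        omegas = rng.normal(scale=1.0 / sigma, size=(d, D))       # spectral samples
        phases = rng.uniform(0.0, 2.0 * np.pi, size=D)
        dphi = np.diff(rff_map(x, omegas, phases), axis=0)        # lifted increments, (L-1, D)
        if m == 1:
            cur = dphi
        else:
            # prefix sums over strictly earlier time indices enforce i_1 < ... < i_m
            prefix = np.concatenate([np.zeros_like(prev[:1]), np.cumsum(prev, axis=0)[:-1]])
            # outer product per time step: level-(m-1) tensor times the new increment
            cur = prefix[..., None] * dphi.reshape((len(dphi),) + (1,) * (m - 1) + (D,))
        feats.append(cur.sum(axis=0).ravel())                     # level-m feature, dim D**m
        prev = cur
    return feats
```

The inner product of the level-$m$ outputs for two paths (computed with shared random draws) estimates the level-$m$ term of the kernel; the flattened dimension $\tilde D^m$ is exactly what the DP and TRP variants below reduce.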
3. Uniform Approximation Guarantees
Concentration inequalities bound the uniform error of the RFF-accelerated kernel over compact path spaces.
For a compact, convex state space $K \subset \mathbb{R}^d$ and paths of bounded $1$-variation, under sub-Gaussian moment bounds on the RFF distribution, there exist constants such that for each signature level $M$ and error threshold $\varepsilon > 0$ the probability that the RFSF estimator deviates from the truncated kernel by more than $\varepsilon$, uniformly over $K$, decays exponentially in the number of random features $\tilde D$; the precise bounds are stated in Theorem 3.1 and equation (3.13) of (Toth et al., 2023). Consequently, to guarantee uniform error $\varepsilon$ with probability at least $1 - \delta$, a number of RFF draws growing polynomially in $1/\varepsilon$ and logarithmically in $1/\delta$ suffices.
4. Computational Complexity and Scalable Variants
The exact kernel-trick computation (e.g., as in Király–Oberhauser, 2019) incurs $\mathcal{O}(N^2 L^2 M)$ time for the Gram matrix of $N$ paths of length $L$ at truncation level $M$. RBF-lifted signature kernels with RFF approximation instead scale linearly in both $N$ and $L$ (up to factors depending on the feature dimension $\tilde D$ and truncation $M$), eliminating the quadratic dependence on both dataset size and sequence length (Toth et al., 2023).
To further improve scalability, two variants are introduced:
| Variant | Feature Dimension | Time Complexity |
|---|---|---|
| Diagonal Projection (DP) | $\tilde D$ per signature level (element-wise products) | linear in $N$, $L$, $\tilde D$, and $M$ |
| Tensor Random Projection (TRP) | one coordinate per rank-1 projection, per level | linear in $N$ and $L$, with additional cost in the number of projections and $M$ |
- DP averages only the diagonal of the RFF tensor product, reducing features at the expense of slower concentration.
- TRP sketches each tensor by a CP-rank-1 Gaussian map, yielding sub-exponential convergence with respect to the number of projections.
Empirical results demonstrate negligible accuracy loss for moderate $\tilde D$ and $M$, with both DP and TRP matching the full signature kernel on time-series benchmarks and scaling efficiently to long sequences and large datasets (Toth et al., 2023).
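The DP variant admits a particularly compact implementation, since the diagonal of a tensor power reduces to element-wise products. Below is a minimal NumPy sketch for a single path; the function name `rfsf_dp` is illustrative, and this simplified version reuses a single RFF map across tensor slots (the construction in Toth et al. (2023) draws independent copies and averages), so it should be read as a structural illustration rather than the reference algorithm.

```python
import numpy as np

def rfsf_dp(x, level, D, sigma=1.0, rng=None):
    """Diagonal-Projection Random Fourier Signature Features (RFSF-DP), sketch.

    Keeps only the diagonal of each tensor product, so every signature level
    contributes D coordinates instead of D**m.  x: (L, d) discretised path;
    returns the concatenation of levels 1..level.
    """
    rng = np.random.default_rng(rng)
    d = x.shape[1]
    omegas = rng.normal(scale=1.0 / sigma, size=(d, D))
    phases = rng.uniform(0.0, 2.0 * np.pi, size=D)
    phi = np.sqrt(2.0 / D) * np.cos(x @ omegas + phases)   # RFF lift, (L, D)
    u = np.diff(phi, axis=0)                               # lifted increments, (L-1, D)
    levels, running = [], u.copy()
    levels.append(running.sum(axis=0))
    for _ in range(2, level + 1):
        # strict prefix sums enforce the ordering i_1 < ... < i_m
        prefix = np.vstack([np.zeros((1, D)), np.cumsum(running, axis=0)[:-1]])
        running = prefix * u                               # element-wise (diagonal) product
        levels.append(running.sum(axis=0))
    return np.concatenate(levels)                          # feature vector of size level * D
```

Dot products of such vectors, computed with shared random draws for all paths, approximate the truncated RBF-lifted signature kernel at a cost linear in sequence length, with feature dimension $M \tilde D$ instead of $\sum_m \tilde D^m$.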
5. Dynamic Lifting via RF-CDE
A complementary approach uses dynamic random-feature reservoirs via random Fourier controlled differential equations (RF-CDEs):
- At each time $t$, map the state $x_t$ to RFFs as $\tilde\varphi(x_t) = \sqrt{2/\tilde D}\, \big(\cos(\omega_i^\top x_t + b_i)\big)_{i=1}^{\tilde D}$.
- Feed this lifted signal into a random linear CDE,
$$dZ_t = \sum_{k=1}^{\tilde D} A_k\, Z_t\, d\tilde\varphi_k(x_t), \qquad Z_0 = z_0,$$
with randomly drawn matrices $A_k$ and initial state $z_0$, yielding a feature vector $Z_T$. Only a linear readout on top of $Z_T$ is trained.
In the infinite-width limit, as the number of random features and the reservoir dimension tend to infinity, the (suitably normalised) inner product of reservoir states converges, $\langle Z_T(x), Z_T(y) \rangle \to k_{\mathrm{sig}}^{\mathrm{RBF}}(x, y)$, rigorously establishing that RF-CDEs realize the RBF-lifted signature kernel as their limiting covariance (Piatti et al., 29 Dec 2025).
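The following NumPy sketch shows one way such a reservoir could be simulated, via an Euler discretisation of the linear CDE above. The function name `rf_cde_reservoir`, the single shared RFF map, and the $1/\sqrt{\cdot}$ scalings of the random matrices and initial state are illustrative assumptions, not the construction of Piatti et al. (29 Dec 2025).

```python
import numpy as np

def rf_cde_reservoir(x, width, D, sigma=1.0, rng=None):
    """Euler discretisation of a random linear CDE driven by RFF-lifted increments.

    x: (L, d) discretised path.  Returns the terminal reservoir state Z_T in
    R^width; a linear readout on such states would be the only trained part.
    """
    rng = np.random.default_rng(rng)
    d = x.shape[1]
    omegas = rng.normal(scale=1.0 / sigma, size=(d, D))
    phases = rng.uniform(0.0, 2.0 * np.pi, size=D)
    phi = np.sqrt(2.0 / D) * np.cos(x @ omegas + phases)       # lifted signal, (L, D)
    dphi = np.diff(phi, axis=0)                                # its increments, (L-1, D)
    # one random coupling matrix per lifted channel, scaled for numerical stability
    A = rng.normal(scale=1.0 / np.sqrt(width * D), size=(D, width, width))
    Z = rng.normal(size=width) / np.sqrt(width)                # random initial state
    for du in dphi:                                            # Euler step: dZ = sum_k A_k Z du_k
        Z = Z + np.einsum('k,kij,j->i', du, A, Z)
    return Z
```

Collecting $Z_T$ for every training path and fitting only a linear readout on top mirrors the reservoir-computing usage described above.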
6. Theoretical Properties
$k_{\mathrm{sig}}^{\mathrm{RBF}}$ is positive definite (a Mercer kernel), universal, and characteristic. Universality and characteristicness are inherited from the classical signature kernel and from the RBF kernel on $\mathbb{R}^d$ (Piatti et al., 29 Dec 2025).
- Reparameterization invariance follows from the invariance of path signatures under time reparameterization (see the display after this list).
- Stability under $1$-variation: small perturbations of the path in $1$-variation norm yield small changes in $k_{\mathrm{sig}}^{\mathrm{RBF}}$.
- Invariance with respect to translations and rotations of the state space is inherited from the isotropic RBF base kernel.
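As an illustration of the first property (stated here as a direct consequence of the definitions above, not quoted from the cited papers): for any nondecreasing, surjective time changes $\tau, \rho : [0, T] \to [0, T]$,
$$S(\varphi \circ x \circ \tau) = S(\varphi \circ x) \quad\Longrightarrow\quad k_{\mathrm{sig}}^{\mathrm{RBF}}(x \circ \tau,\, y \circ \rho) = k_{\mathrm{sig}}^{\mathrm{RBF}}(x, y).$$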
This kernel provides a continuous-time, non-Euclidean analogue of RBF feature maps for sequence learning.
7. Practical Guidelines and Related Architectures
Selection of hyperparameters follows empirical trade-offs:
- A moderate RFF dimension $\tilde D$ already yields small kernel approximation error in practice; increase $\tilde D$ as the error tolerance shrinks.
- Use the Gaussian spectral measure for sampling frequencies, or variance-reduced alternatives such as quasi-Monte Carlo or orthogonal random features.
- A low truncation level $M$ captures the principal path interactions; the cost of the vanilla construction grows exponentially beyond that.
- For high-dimensional data and small truncation levels, DP is recommended; for moderate $\tilde D$ and $M$, TRP reduces memory; the vanilla RFSF has the sharpest concentration but its feature dimension balloons exponentially in $M$.
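As a usage illustration of these guidelines, the sketch below builds on the hypothetical `rfsf_dp` function from Section 4: it embeds a batch of paths with a shared random feature map and fits a closed-form ridge readout. The function name `fit_ridge_on_rfsf` and all hyperparameter defaults are illustrative assumptions.

```python
import numpy as np

def fit_ridge_on_rfsf(paths, targets, level=4, D=256, sigma=1.0, lam=1e-3, seed=0):
    """Embed paths with RFSF-DP features and fit a ridge-regression readout.

    paths: list of (L_i, d) arrays (lengths may differ); targets: (N,) array.
    The same seed is reused so every path is embedded with the same random map.
    Assumes the rfsf_dp sketch from Section 4 is in scope.
    """
    feats = np.stack([rfsf_dp(p, level=level, D=D, sigma=sigma, rng=seed) for p in paths])
    # closed-form ridge solution: w = (F^T F + lam * I)^{-1} F^T y
    gram = feats.T @ feats + lam * np.eye(feats.shape[1])
    weights = np.linalg.solve(gram, feats.T @ targets)
    return weights, feats
```

Predictions for new paths reuse the same seed (hence the same random features) followed by a dot product with `weights`; $\sigma$, $\tilde D$, $M$, and the ridge parameter are typically tuned by cross-validation.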
The rough signature kernel is the limiting case when no nonlinear RBF warping is applied, corresponding to linear (log-)signature propagation in a random reservoir (Piatti et al., 29 Dec 2025). RF-CDE and R-RDE offer two complementary methods whose infinite-width limits recover the RBF-lifted and rough signature kernels, respectively, unifying perspectives on random-feature reservoirs and continuous-time deep sequence models.
References
- "Random Fourier Signature Features" (Toth et al., 2023)
- "Random Controlled Differential Equations" (Piatti et al., 29 Dec 2025)