Convergence of Lipschitz constants for random feature maps

Determine whether the Lipschitz constant Lip(θ_N) of the N-dimensional random feature map θ_N(x) = (1/√N)[φ(ω_1, x), …, φ(ω_N, x)], where the ω_i are i.i.d. with distribution P and k(x, x') = E_P[φ(ω, x) φ(ω, x')], converges, in probability or almost surely as N → ∞, to the Lipschitz constant Lip(ϕ) of the infinite-dimensional RKHS feature map ϕ: x ↦ k(·, x).
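One standard identity, not stated in this excerpt but a routine consequence of the definitions above, clarifies why the question is nontrivial:

```latex
\|\phi(x)-\phi(x')\|_{\mathcal H}^2 = k(x,x) - 2\,k(x,x') + k(x',x'),
\qquad
\|\theta_N(x)-\theta_N(x')\|_2^2 = k_N(x,x) - 2\,k_N(x,x') + k_N(x',x'),
```

where k_N(x, x') = (1/N) Σ_{i=1}^N φ(ω_i, x) φ(ω_i, x') is the empirical kernel. Pointwise convergence k_N → k therefore yields pointwise convergence of the feature-map distances, but Lip(θ_N) and Lip(ϕ) are suprema of difference quotients over all pairs x ≠ x', and convergence of suprema does not follow from pointwise convergence alone.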

Background

The paper analyzes exact Lipschitz constants of feature maps associated with integral kernels, providing closed-form expressions and necessary-and-sufficient conditions under differentiability assumptions. In practice, such kernels are often approximated by finite random features, yielding an empirical kernel k_N that converges to the integral kernel k.

To study robustness of finite-width approximations, the authors define the random feature map θ_N built from i.i.d. samples ω_i~P and consider whether its Lipschitz constant approaches that of the infinite-dimensional feature map ϕ(x)=k(·,x). Establishing this convergence would link the robustness of finite random-feature models to that of the limiting kernel representation. The paper presents numerical evidence for Gaussian random Fourier features, ReLU neural networks, and Matérn kernels, motivating the formal resolution of this convergence question.
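The Gaussian random Fourier feature experiment can be reproduced in spirit with a short numerical sketch. This is not the paper's code: the kernel bandwidth sigma, the domain, the grid resolution, and N are all illustrative choices, and the grid-based estimates are only lower bounds on the true Lipschitz constants. For the Gaussian kernel k(x, x') = exp(-(x-x')²/(2σ²)), Lip(ϕ) = 1/σ, attained in the limit |x - x'| → 0.

```python
import numpy as np

# Illustrative sketch (assumptions: sigma, domain, grid size, N, seed).
rng = np.random.default_rng(0)
sigma = 1.0

def theta_N(x, omegas, bs):
    """Random Fourier feature map theta_N(x) = sqrt(2/N) cos(omega_i x + b_i).

    Its Euclidean inner products give an empirical kernel k_N that
    converges to k(x, x') = exp(-(x - x')^2 / (2 sigma^2)) as N grows.
    """
    N = omegas.size
    return np.sqrt(2.0 / N) * np.cos(np.outer(x, omegas) + bs)

def lip_from_distances(D, grid):
    """Lower bound on the Lipschitz constant: max over grid pairs of
    ||f(x) - f(x')|| / |x - x'|."""
    dx = np.abs(grid[:, None] - grid[None, :])
    mask = dx > 0
    return np.max(D[mask] / dx[mask])

grid = np.linspace(-3.0, 3.0, 400)

# Exact RKHS feature map phi(x) = k(., x): pairwise distances via the
# kernel trick, ||phi(x) - phi(x')||_H^2 = 2 - 2 k(x, x').
K = np.exp(-((grid[:, None] - grid[None, :]) ** 2) / (2.0 * sigma**2))
D_phi = np.sqrt(np.maximum(2.0 - 2.0 * K, 0.0))
lip_phi = lip_from_distances(D_phi, grid)  # ~ 1/sigma for fine grids

# Finite random feature map: pairwise Euclidean distances via the Gram
# matrix G = F F^T (avoids building a (400, 400, N) tensor).
N = 2000
omegas = rng.normal(0.0, 1.0 / sigma, size=N)  # spectral measure of k
bs = rng.uniform(0.0, 2.0 * np.pi, size=N)
F = theta_N(grid, omegas, bs)                  # shape (len(grid), N)
G = F @ F.T
sq = np.diag(G)
D_theta = np.sqrt(np.maximum(sq[:, None] + sq[None, :] - 2.0 * G, 0.0))
lip_theta = lip_from_distances(D_theta, grid)

print(f"Lip(phi)     ~ {lip_phi:.3f}  (exact value 1/sigma = {1/sigma:.3f})")
print(f"Lip(theta_N) ~ {lip_theta:.3f}  for N = {N}")
```

Rerunning the last block for increasing N (e.g. N = 10, 100, 1000, …) reproduces the kind of numerical evidence the paper reports: the estimates of Lip(θ_N) appear to stabilize near Lip(ϕ), which is precisely what the open question asks to prove.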

Open question. Does Lip(θ_N) converge to Lip(ϕ), in probability or almost surely, as N→∞?

References

Lipschitz bounds for integral kernels (2604.02887 - Reverdi et al., 3 Apr 2026), Section 5, "Numerical illustration and an open question on finite random features"