Theory and Applications of Kernel Stein's Method on Riemannian Manifolds (2501.00695v2)
Abstract: Distributional comparison is a fundamental problem in statistical data analysis with numerous applications in a variety of scientific and engineering fields. Many methods exist for distributional comparison, but kernel Stein's method has gained significant popularity in recent times. In this paper, we first present a novel, mathematically rigorous and consistent generalization of the Stein operator to Riemannian manifolds. We then show that the kernel Stein discrepancy (KSD) defined via this operator is nearly as strong as the KSD in the Euclidean setting in terms of distinguishing a target distribution from the reference. We investigate the asymptotic properties of the minimum kernel Stein discrepancy estimator (MKSDE), apply it to goodness-of-fit testing, and compare it experimentally to the maximum likelihood estimator (MLE). We present several examples of our theory applied to Riemannian manifolds commonly encountered in practice, namely the n-sphere, the Grassmann and Stiefel manifolds, the manifold of symmetric positive definite matrices, and other Riemannian homogeneous spaces. On the aforementioned manifolds, we consider a variety of distributions with intractable normalization constants and derive closed-form expressions for the KSD and MKSDE.
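To make the Euclidean baseline referenced in the abstract concrete, the following is a minimal NumPy sketch, not the paper's Riemannian construction: a V-statistic estimate of the squared KSD using the standard Langevin Stein operator and a Gaussian RBF kernel. The function name `ksd_vstat` and the fixed bandwidth `h` are illustrative assumptions; the key point is that the estimator needs only the score (gradient of the log-density), so the intractable normalization constant cancels.

```python
import numpy as np

def ksd_vstat(X, score, h=1.0):
    """V-statistic estimate of the squared kernel Stein discrepancy.

    Uses the Langevin Stein operator with an RBF kernel
    k(x, y) = exp(-||x - y||^2 / (2 h^2)).
    X: (n, d) array of samples; score(X): (n, d) array of
    grad log p evaluated at each row of X.
    """
    n, d = X.shape
    S = score(X)                               # score at each sample
    diff = X[:, None, :] - X[None, :, :]       # (n, n, d) pairwise x_i - x_j
    sq = np.sum(diff**2, axis=-1)              # (n, n) squared distances
    K = np.exp(-sq / (2 * h**2))               # kernel matrix
    # Four terms of the Stein kernel u_p(x, y):
    t1 = (S @ S.T) * K                                    # s(x)^T s(y) k
    t2 = np.einsum('id,ijd->ij', S, diff) / h**2 * K      # s(x)^T grad_y k
    t3 = -np.einsum('jd,ijd->ij', S, diff) / h**2 * K     # s(y)^T grad_x k
    t4 = (d / h**2 - sq / h**4) * K                       # tr(grad_x grad_y k)
    return np.mean(t1 + t2 + t3 + t4)
```

As a sanity check, with the standard Gaussian score `score = lambda X: -X`, samples drawn from N(0, I) yield a smaller estimate than samples from a shifted Gaussian, reflecting the discrepancy's ability to distinguish the target from a mismatched reference.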