Learning Theory for Estimation of Animal Motion Submanifolds (2003.13811v2)
Abstract: This paper describes the formulation and experimental testing of a novel method for the estimation and approximation of submanifold models of animal motion. It is assumed that the animal motion is supported on a configuration manifold $Q$ that is a smooth, connected, regularly embedded Riemannian submanifold of Euclidean space $X\approx \mathbb{R}^d$ for some $d>0$, and that the manifold $Q$ is homeomorphic to a known smooth, Riemannian manifold $S$. Estimation of the manifold is achieved by finding an unknown mapping $\gamma:S\rightarrow Q\subset X$ that maps the manifold $S$ into $Q$. The overall problem is cast as a distribution-free learning problem over the manifold of measurements $\mathbb{Z}=S\times X$. That is, it is assumed that experiments generate a finite set $\{(s_i,x_i)\}_{i=1}^m\subset \mathbb{Z}^m$ of samples that are generated according to an unknown probability density $\mu$ on $\mathbb{Z}$. This paper derives approximations $\gamma_{n,m}$ of $\gamma$ that are based on the $m$ samples and are contained in an $N(n)$-dimensional space of approximants. The paper defines sufficient conditions that show that the rates of convergence in $L^2_\mu(S)$ correspond to those known for classical distribution-free learning theory over Euclidean space. Specifically, the paper derives sufficient conditions that guarantee rates of convergence of the form $$\mathbb{E}\left(\|\gamma^j_\mu-\gamma^j_{n,m}\|^2_{L^2_\mu(S)}\right)\leq C_1 N(n)^{-r} + C_2 \frac{N(n)\log(N(n))}{m}$$ for constants $C_1,C_2$, with $\gamma_\mu:=\{\gamma^1_\mu,\ldots,\gamma^d_\mu\}$ the regressor function $\gamma_\mu:S\rightarrow Q\subset X$ and $\gamma_{n,m}:=\{\gamma^1_{n,m},\ldots,\gamma^d_{n,m}\}$.
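The sketch below is a minimal, hypothetical illustration of the setup in the abstract, not the paper's implementation: it assumes the known manifold $S$ is the unit circle $S^1$, takes a specific closed curve in $\mathbb{R}^2$ as the unknown image $Q$, uses a truncated Fourier basis as the $N(n)$-dimensional space of approximants, and fits each coordinate function $\gamma^j$ to $m$ noisy samples by least squares. The Monte Carlo estimate at the end stands in for the $L^2_\mu(S)$ error appearing in the convergence bound; the curve, noise model, and basis are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def gamma_true(s):
    """Hypothetical ground-truth map gamma: S^1 -> Q subset of R^2 (illustrative choice)."""
    r = 2.0 + np.cos(3 * s)
    return np.column_stack([r * np.cos(s), r * np.sin(s)])

def fourier_basis(s, n):
    """Truncated Fourier basis on S^1: an N(n) = 2n+1 dimensional space of approximants."""
    cols = [np.ones_like(s)]
    for k in range(1, n + 1):
        cols += [np.cos(k * s), np.sin(k * s)]
    return np.column_stack(cols)

def fit_gamma(s_samples, x_samples, n):
    """Least-squares estimate gamma_{n,m}: one coefficient vector per coordinate gamma^j."""
    Phi = fourier_basis(s_samples, n)
    coeffs, *_ = np.linalg.lstsq(Phi, x_samples, rcond=None)
    return coeffs  # shape (N(n), d)

# m noisy samples {(s_i, x_i)} from Z = S x X; the sampling density mu is uniform here
m, n, noise = 2000, 8, 0.05
s_i = rng.uniform(0, 2 * np.pi, size=m)
x_i = gamma_true(s_i) + noise * rng.standard_normal((m, 2))

coeffs = fit_gamma(s_i, x_i, n)

# Monte Carlo estimate of the squared L^2_mu(S) error for each coordinate j
s_test = rng.uniform(0, 2 * np.pi, size=20000)
err = fourier_basis(s_test, n) @ coeffs - gamma_true(s_test)
print("squared L^2 errors per coordinate:", np.mean(err**2, axis=0))
```

Rerunning the sketch with larger $n$ and $m$ shows the two competing terms of the bound: enlarging $N(n)$ shrinks the approximation error $C_1 N(n)^{-r}$, while the sampling term $C_2 N(n)\log(N(n))/m$ grows unless $m$ grows as well.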