Limit distribution theory for block estimators in multiple isotonic regression (1905.12825v2)
Abstract: We study limit distributions for the tuning-free max-min block estimator originally proposed in [FLN17] in the problem of multiple isotonic regression, under both fixed lattice design and random design settings. We show that, if the regression function $f_0$ admits vanishing derivatives up to order $\alpha_k$ along the $k$-th dimension ($k=1,\ldots,d$) at a fixed point $x_0 \in (0,1)^d$, and the errors have variance $\sigma^2$, then the max-min block estimator $\hat{f}_n$ satisfies \begin{align*} (n_\ast/\sigma^2)^{\frac{1}{2+\sum_{k \in \mathcal{D}_\ast} \alpha_k^{-1}}}\big(\hat{f}_n(x_0)-f_0(x_0)\big)\rightsquigarrow \mathbb{C}(f_0,x_0). \end{align*} Here $\mathcal{D}_\ast$ and $n_\ast$, both depending on $\{\alpha_k\}$ and the design points, are the set of `effective dimensions' and the size of the `effective samples' that drive the asymptotic limit distribution, respectively. If, furthermore, either the $\{\alpha_k\}$ are relatively prime to each other or all mixed derivatives of $f_0$ of a certain critical order vanish at $x_0$, then the limit distribution admits the representation $\mathbb{C}(f_0,x_0) =_d K(f_0,x_0) \cdot \mathbb{D}_{\alpha}$, where $K(f_0,x_0)$ is a constant depending on the local structure of the regression function $f_0$ at $x_0$, and $\mathbb{D}_{\alpha}$ is a non-standard limit distribution generalizing the well-known Chernoff distribution in univariate problems. The above limit theorem is also shown to be optimal, both in the local rate of convergence and in the dependence on the unknown regression function whenever such dependence is explicit (i.e. through $K(f_0,x_0)$), for the full range of $\{\alpha_k\}$ in a local asymptotic minimax sense.
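For orientation, the abstract's mention of the Chernoff distribution can be illustrated by the classical univariate specialization ($d=1$, $\alpha_1=1$, so the rate exponent is $1/(2+\alpha_1^{-1})=1/3$). The display below is the well-known cube-root limit for univariate isotonic least squares, included here as a worked example rather than a statement taken from the paper; the identifications of $\mathcal{D}_\ast$, $n_\ast$, and $K(f_0,x_0)$ in this case are my reading of the general notation:
\begin{align*}
n^{1/3}\big(\hat{f}_n(x_0)-f_0(x_0)\big) \;\rightsquigarrow\; \big(4\sigma^2 f_0'(x_0)\big)^{1/3}\,\mathbb{Z},
\qquad \mathbb{Z} := \operatorname{argmax}_{t \in \mathbb{R}}\,\big\{W(t)-t^2\big\},
\end{align*}
where $W$ is a two-sided standard Brownian motion started at $0$ and $f_0'(x_0)>0$. The random variable $\mathbb{Z}$ follows Chernoff's distribution, so in this case the effective-dimension set is $\mathcal{D}_\ast=\{1\}$, the effective sample size is $n_\ast=n$, and (up to the chosen normalization by $\sigma^2$) the constant $K(f_0,x_0)$ carries the local dependence $\big(4\sigma^2 f_0'(x_0)\big)^{1/3}$ on the regression function.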