Minimax Optimal Estimators for Additive Scalar Functionals of Discrete Distributions (1701.06381v3)
Abstract: In this paper, we consider estimators for an additive functional of $\phi$, which is defined as $\theta(P;\phi)=\sum_{i=1}^{k}\phi(p_i)$, from $n$ i.i.d. random samples drawn from a discrete distribution $P=(p_1,...,p_k)$ with alphabet size $k$. We propose a minimax optimal estimator for this estimation problem. We reveal that the minimax optimal rate is characterized by the divergence speed of the fourth derivative of $\phi$ if the divergence speed is high. As a result, we show that there is no consistent estimator if the divergence speed of the fourth derivative of $\phi$ is larger than $p^{-4}$. Furthermore, if the divergence speed of the fourth derivative of $\phi$ is $p^{\alpha-4}$ for $\alpha \in (0,1)$, the minimax optimal rate is obtained within a universal multiplicative constant as $\frac{k^2}{(n\ln n)^{2\alpha}} + \frac{k^{2-2\alpha}}{n}$.
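As a point of reference for the problem setup only (this is not the estimator proposed in the paper), the sketch below shows the naive plug-in estimator $\hat{\theta} = \sum_i \phi(\hat{p}_i)$, instantiated with the illustrative choice $\phi(p) = p^{\alpha}$ for $\alpha \in (0,1)$, whose fourth derivative diverges at speed $p^{\alpha-4}$ as $p \to 0$. The function name, parameter values, and simulation setup are hypothetical; plug-in estimation is generally suboptimal in the large-alphabet regime studied here, which is what motivates the minimax analysis.

```python
import numpy as np

# Hypothetical illustration of the estimation problem (not the paper's
# minimax optimal estimator): the naive plug-in estimate of the additive
# functional theta(P; phi) = sum_i phi(p_i), with phi(p) = p**alpha.

def plugin_additive_functional(samples, k, phi):
    """Apply phi to the empirical frequencies and sum over the alphabet."""
    counts = np.bincount(samples, minlength=k)
    p_hat = counts / counts.sum()
    return float(np.sum(phi(p_hat)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    k, n, alpha = 1000, 10000, 0.5            # illustrative sizes only
    p = rng.dirichlet(np.ones(k))             # a random distribution on k symbols
    samples = rng.choice(k, size=n, p=p)      # n i.i.d. draws from P
    phi = lambda q: q ** alpha                # phi(0) = 0, so unseen symbols add nothing
    est = plugin_additive_functional(samples, k, phi)
    truth = float(np.sum(p ** alpha))
    print(f"plug-in estimate: {est:.4f}, true value: {truth:.4f}")
```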
- Kazuto Fukuchi (27 papers)
- Jun Sakuma (46 papers)