Nonparametric estimates of low bias (1008.0127v1)
Abstract: We consider the problem of estimating an arbitrary smooth functional of $k \geq 1$ distribution functions (d.f.s) in terms of random samples from them. The natural estimate replaces the d.f.s by their empirical d.f.s. Its bias is generally $\sim n^{-1}$, where $n$ is the minimum sample size, with a {\it $p$th order} iterative estimate of bias $\sim n^{-p}$ for any $p$. For $p \leq 4$, we give an explicit estimate in terms of the first $2p - 2$ von Mises derivatives of the functional evaluated at the empirical d.f.s. These may be used to obtain {\it unbiased} estimates, where these exist and are of known form in terms of the sample sizes; our form for such unbiased estimates is much simpler than that obtained using polykays and tables of the symmetric functions. Examples include functions of a mean vector (such as the ratio of two means and the inverse of a mean), standard deviation, correlation, return times and exceedances. These $p$th order estimates require only $\sim n$ calculations. This is in sharp contrast with computationally intensive bias reduction methods such as the $p$th order bootstrap and jackknife, which require $\sim n^p$ calculations.
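The following sketch illustrates the kind of plug-in bias correction the abstract describes, for one of its listed examples, the inverse of a mean. It uses the generic second-order (delta-method / von Mises) correction $g(\bar X) - \tfrac{1}{2} g''(\bar X)\, s^2/n$, which reduces the plug-in bias from $O(n^{-1})$ to $O(n^{-2})$; the function name and the Monte Carlo check are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def inverse_mean_estimates(x):
    """Plug-in and second-order bias-corrected estimates of 1/E[X].

    The plug-in estimate 1/xbar has bias ~ n^{-1}.  Subtracting the
    leading bias term (1/2) g''(xbar) s^2 / n for g(t) = 1/t, where
    g''(t) = 2/t^3, reduces the bias to O(n^{-2}).  This is a generic
    delta-method / von Mises sketch, not the paper's explicit formula.
    """
    n = len(x)
    xbar = x.mean()
    s2 = x.var(ddof=1)            # sample variance
    plug_in = 1.0 / xbar
    corrected = plug_in - s2 / (n * xbar**3)
    return plug_in, corrected

# Monte Carlo check: X ~ Exponential(mean = 2), so the target 1/E[X] = 0.5.
n, reps = 50, 20000
plug = np.empty(reps)
corr = np.empty(reps)
for r in range(reps):
    x = rng.exponential(scale=2.0, size=n)
    plug[r], corr[r] = inverse_mean_estimates(x)

print("plug-in   bias:", plug.mean() - 0.5)
print("corrected bias:", corr.mean() - 0.5)
```

Each corrected estimate costs $O(n)$ work (one pass for the mean and variance), in line with the abstract's contrast with $O(n^p)$ bootstrap or jackknife bias reduction.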