Generalized maximum likelihood estimation of the mean of parameters of mixtures, with applications to sampling
Abstract: Let $f(y|\theta), \; \theta \in \Omega$ be a parametric family, $\eta(\theta)$ a given function, and $G$ an unknown mixing distribution. It is desired to estimate $E_G (\eta(\theta))\equiv \eta_G$ based on independent observations $Y_1,...,Y_n$, where $Y_i \sim f(y|\theta_i)$, and $\theta_i \sim G$ are iid. We explore the Generalized Maximum Likelihood Estimators (GMLE) for this problem. Some basic properties and representations of those estimators are shown. In particular we suggest a new perspective on the weak convergence result by Kiefer and Wolfowitz (1956), with implications for a corresponding setup in which $\theta_1,...,\theta_n$ are {\it fixed} parameters. We also relate the above problem, of estimating $\eta_G$, to non-parametric empirical Bayes estimation under a squared loss. Applications of the GMLE to sampling problems are presented. The performance of the GMLE is demonstrated both in simulations and through a real data example.
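To make the setup concrete, the following is a minimal sketch (not taken from the paper) of the estimation problem the abstract describes: the Kiefer–Wolfowitz nonparametric MLE of the mixing distribution $G$ is approximated by EM over a fixed grid of support points, and $\eta_G$ is then estimated by plugging the fitted $\hat G$ into $E_{\hat G}(\eta(\theta))$. The Gaussian kernel $f(y|\theta) = N(\theta, 1)$, the choice $\eta(\theta) = \theta$, the grid, and the two-point true $G$ are all illustrative assumptions.

```python
import numpy as np

def npmle_grid(y, grid, n_iter=500):
    """Approximate the Kiefer-Wolfowitz NPMLE of the mixing distribution G
    over a fixed grid of candidate support points, via EM.
    Assumes the kernel f(y|theta) = N(theta, 1) (an illustrative choice).
    Returns the fitted mixing weights w over the grid."""
    # Likelihood matrix: L[i, j] = f(y_i | grid_j)
    L = np.exp(-0.5 * (y[:, None] - grid[None, :]) ** 2) / np.sqrt(2 * np.pi)
    w = np.full(len(grid), 1.0 / len(grid))  # uniform initial weights
    for _ in range(n_iter):
        post = L * w                              # unnormalized posteriors P(theta_j | y_i)
        post /= post.sum(axis=1, keepdims=True)   # E-step: normalize per observation
        w = post.mean(axis=0)                     # M-step: update mixing weights
    return w

rng = np.random.default_rng(0)
n = 500
theta = rng.choice([-2.0, 2.0], size=n)   # true G: equal mass at -2 and +2, so eta_G = 0
y = theta + rng.standard_normal(n)        # Y_i ~ N(theta_i, 1)

grid = np.linspace(-5.0, 5.0, 101)
w = npmle_grid(y, grid)
eta_hat = np.sum(w * grid)                # plug-in estimate of E_G(theta)
print(f"estimated eta_G = {eta_hat:.3f}  (true value 0)")
```

The same fitted weights `w` also yield the nonparametric empirical Bayes posterior means mentioned in the abstract, since each row of the EM posterior matrix gives $\hat E(\theta_i \mid Y_i)$ under $\hat G$.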