An information criterion for model selection with missing data via complete-data divergence (1509.02870v4)
Abstract: We derive an information criterion for selecting a parametric model of the complete-data distribution when only incomplete or partially observed data are available. Compared with AIC, the new criterion has an additional penalty term for missing data, expressed in terms of the Fisher information matrices of the complete data and the incomplete data. We prove that our criterion is an asymptotically unbiased estimator of the complete-data divergence, namely, the expected Kullback-Leibler divergence between the true distribution and the estimated distribution for the complete data, whereas AIC is an asymptotically unbiased estimator of the corresponding divergence for the incomplete data. The information criteria PDIO (Shimodaira 1994) and AICcd (Cavanaugh and Shumway 1998) were previously proposed to estimate the complete-data divergence, and the two share the same penalty term. The additional penalty term of our criterion for missing data turns out to be only half of that in PDIO and AICcd. The difference is attributed to the fact that our criterion is derived under a weaker assumption. A simulation study under this weaker assumption shows that our criterion is unbiased while the other two criteria are biased. In addition, we review the geometric view of the EM algorithm as alternating minimization, which plays an important role in deriving the new criterion.
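The relationship between the penalties can be made concrete with a minimal numerical sketch. It assumes the criteria take a trace form consistent with the abstract: writing I_x and I_y for the complete- and incomplete-data Fisher information matrices at the MLE, the missing-data penalty is taken as tr((I_x - I_y) I_y^{-1}), added once for the new criterion and twice for PDIO/AICcd. The function names and this exact algebraic form are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def aic(loglik_y, d):
    """Standard AIC from the incomplete-data log-likelihood at the MLE,
    with d the number of free parameters."""
    return -2.0 * loglik_y + 2.0 * d

def missing_data_penalty(I_x, I_y):
    """Assumed extra penalty tr(I_mis I_y^{-1}) with I_mis = I_x - I_y,
    i.e. the information lost to missingness relative to the observed
    information. Illustrative form only."""
    return np.trace((I_x - I_y) @ np.linalg.inv(I_y))

def new_criterion(loglik_y, I_x, I_y):
    """Hypothetical form of the proposed criterion: AIC plus ONE times
    the missing-data penalty."""
    d = I_y.shape[0]
    return aic(loglik_y, d) + missing_data_penalty(I_x, I_y)

def pdio_aiccd_like(loglik_y, I_x, I_y):
    """PDIO/AICcd-style criterion under the same assumed form: AIC plus
    TWICE the missing-data penalty."""
    d = I_y.shape[0]
    return aic(loglik_y, d) + 2.0 * missing_data_penalty(I_x, I_y)
```

Under this assumed form, a sanity check: with no missing data, I_x = I_y, the extra penalty vanishes, and both criteria reduce to AIC.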