Strongly Consistent of Kullback-Leibler Divergence Estimator and Tests for Model Selection Based on a Bias Reduced Kernel Density Estimator (1805.07088v1)
Published 18 May 2018 in stat.ME
Abstract: In this paper, we study the strong consistency of a bias reduced kernel density estimator and derive a strongly consistent Kullback-Leibler divergence (KLD) estimator. As an application, we formulate a goodness-of-fit test and an asymptotically standard normal test for model selection. Monte Carlo simulations show the effectiveness of the proposed estimation methods and statistical tests.
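To make the idea concrete, the following sketch shows a plain plug-in KLD estimator built on a classical (not bias reduced) Gaussian kernel density estimate; the paper's bias reduction and test statistics are not reproduced here, and the bandwidth rule and sample sizes below are illustrative assumptions only.

```python
import numpy as np

def gaussian_kde_pdf(sample, x, bandwidth=None):
    """Classical Gaussian kernel density estimate of f, evaluated at points x."""
    n = sample.size
    if bandwidth is None:
        # Simple rule-of-thumb bandwidth (an assumption, not the paper's choice)
        bandwidth = sample.std(ddof=1) * n ** (-1 / 5)
    z = (x[:, None] - sample[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).sum(axis=1) / (n * bandwidth * np.sqrt(2 * np.pi))

def normal_pdf(x, mu, sigma):
    """Density of the candidate parametric model g = N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 2000)   # i.i.d. sample from the unknown density f
f_hat = gaussian_kde_pdf(x, x)   # plug-in density estimate at the sample points

# Plug-in KLD estimate: (1/n) * sum_i log( f_hat(X_i) / g(X_i) )
kld_correct = np.mean(np.log(f_hat / normal_pdf(x, 0.0, 1.0)))  # well-specified g
kld_wrong   = np.mean(np.log(f_hat / normal_pdf(x, 1.0, 1.0)))  # misspecified g
```

A model selection rule in this spirit prefers the candidate with the smaller estimated KLD, so here `kld_correct` should be near zero while `kld_wrong` should be close to the true divergence KL(N(0,1) || N(1,1)) = 0.5.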