
An Information-Theoretic Approach to Nonparametric Estimation, Model Selection, and Goodness of Fit (1103.4890v1)

Published 25 Mar 2011 in math.ST, stat.ME, and stat.TH

Abstract: This paper applies the recently axiomatized Optimum Information Principle (minimize the Kullback-Leibler information subject to all relevant information) to nonparametric density estimation, providing a theoretical foundation as well as a computational algorithm for maximum entropy density estimation. The estimator, called the optimum information estimator, approximates the true density arbitrarily well. As a by-product I obtain a measure of goodness of fit of parametric models (both conditional and unconditional) and an absolute criterion for model selection, as opposed to conventional methods such as AIC and BIC, which are relative measures.
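The abstract describes the estimator only at a high level. The sketch below illustrates the general technique it builds on: minimizing Kullback-Leibler information relative to a uniform reference subject to moment constraints, which is equivalent to maximum entropy density estimation. The function name maxent_density, the choice of polynomial sample moments as the "relevant information", and the grid-based integration are illustrative assumptions, not the paper's own algorithm; the optimum information estimator may use different constraints and computations.

```python
# Minimal sketch of maximum-entropy (minimum-KL) density estimation under
# moment constraints. Assumptions: polynomial moments as constraints,
# a bounded grid for numerical integration, and an unconstrained convex
# dual solved by BFGS. Not the paper's algorithm.
import numpy as np
from scipy.optimize import minimize

def maxent_density(sample, n_moments=4, grid_size=2000):
    """Fit p(x) proportional to exp(sum_j lam_j * x**j) so that its first
    n_moments match the sample moments (numerical integration on a grid)."""
    lo, hi = sample.min() - 1.0, sample.max() + 1.0
    grid = np.linspace(lo, hi, grid_size)
    dx = grid[1] - grid[0]
    # Sufficient statistics T_j(x) = x**j for j = 1..n_moments
    T = np.vstack([grid ** j for j in range(1, n_moments + 1)])            # (m, G)
    mu = np.array([np.mean(sample ** j) for j in range(1, n_moments + 1)])  # sample moments

    def dual(lam):
        # Convex dual of the constrained KL minimization:
        # log-partition function minus lam . mu. At its minimizer the
        # fitted moments equal the sample moments.
        logits = lam @ T
        m = logits.max()
        log_z = m + np.log(np.sum(np.exp(logits - m)) * dx)
        return log_z - lam @ mu

    lam_hat = minimize(dual, np.zeros(n_moments), method="BFGS").x
    unnorm = np.exp(lam_hat @ T - (lam_hat @ T).max())
    return grid, unnorm / (unnorm.sum() * dx)

# Usage: estimate the density of a simulated N(1, 2^2) sample and check its mean.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=500)
grid, p_hat = maxent_density(x)
print("estimated mean:", np.sum(grid * p_hat) * (grid[1] - grid[0]))  # close to 1.0
```

Solving the unconstrained convex dual rather than the constrained primal is a standard design choice for maximum entropy problems; it keeps the example short, but the paper's own computational algorithm should be consulted for the estimator it proposes.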


