A simple application of FIC to model selection
Published 19 Jun 2015 in physics.data-an, cs.LG, and stat.ML | arXiv:1506.06129v1
Abstract: We have recently proposed a new information-based approach to model selection, the Frequentist Information Criterion (FIC), that reconciles information-based and frequentist inference. The purpose of the current paper is to provide a simple example of the application of this criterion and a demonstration of the natural emergence of model complexities with both AIC-like ($N^0$) and BIC-like ($\log N$) scaling with observation number $N$. The application developed is deliberately simplified to make the analysis analytically tractable.
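For context (standard definitions, not drawn from the paper itself), the scaling labels refer to the usual complexity penalties of AIC and BIC for a model with $K$ parameters fit to $N$ observations: the AIC penalty is independent of $N$ (order $N^0$), while the BIC penalty grows as $\log N$,

$$\mathrm{AIC} = -2\ln\hat{L} + 2K, \qquad \mathrm{BIC} = -2\ln\hat{L} + K\ln N,$$

where $\hat{L}$ is the maximized likelihood. The paper shows complexity terms with both types of scaling emerging naturally from the FIC framework.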