Empirical Survival Jensen-Shannon Divergence as a Goodness-of-Fit Measure for Maximum Likelihood Estimation and Curve Fitting (1809.11052v5)
Abstract: The coefficient of determination, known as $R^2$, is commonly used as a goodness-of-fit criterion for fitting linear models. $R^2$ is somewhat controversial when fitting nonlinear models, although it may be generalised on a case-by-case basis to deal with specific models such as the logistic model. Assume we are fitting a parametric distribution to a data set using, say, the maximum likelihood estimation method. A general approach to measuring the goodness-of-fit of the fitted parameters, which is advocated herein, is to use a nonparametric measure to compare the empirical distribution, comprising the raw data, with the fitted model. In particular, for this purpose we put forward the Survival Jensen-Shannon divergence ($SJS$) and its empirical counterpart (${\cal E}SJS$) as a metric that is bounded and is a natural generalisation of the Jensen-Shannon divergence. We demonstrate, via a straightforward procedure making use of the ${\cal E}SJS$, that it can be used as part of maximum likelihood estimation or curve fitting as a measure of goodness-of-fit, including the construction of a confidence interval for the fitted parametric distribution. Furthermore, we show the validity of the proposed method with simulated data and three empirical data sets.
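The approach the abstract describes, comparing the empirical distribution of the raw data against a fitted parametric model via a divergence on survival functions, can be sketched as follows. This is a minimal illustration, not the paper's exact ${\cal E}SJS$ estimator: it assumes the divergence takes the Jensen-Shannon form with survival functions in place of densities, integrated numerically over a grid, and it uses an exponential model (whose maximum likelihood scale estimate is the sample mean) purely as an example; all function names here are hypothetical.

```python
import numpy as np

def empirical_survival(data, x):
    """Empirical survival function S_n(x) = (number of observations > x) / n."""
    data = np.sort(np.asarray(data))
    return 1.0 - np.searchsorted(data, x, side="right") / data.size

def esjs_sketch(data, fitted_sf, grid):
    """Sketch of an empirical Survival Jensen-Shannon divergence between
    the empirical survival function of `data` and a fitted model's
    survival function `fitted_sf`, integrated over `grid`.

    The pointwise integrand mirrors the Jensen-Shannon divergence,
    with survival functions substituted for probability densities.
    """
    s_emp = empirical_survival(data, grid)
    s_fit = fitted_sf(grid)
    m = 0.5 * (s_emp + s_fit)  # mixture survival function
    with np.errstate(divide="ignore", invalid="ignore"):
        term_p = np.where(s_emp > 0, s_emp * np.log(s_emp / m), 0.0)
        term_q = np.where(s_fit > 0, s_fit * np.log(s_fit / m), 0.0)
    integrand = 0.5 * (term_p + term_q)  # nonnegative by convexity of x log x
    return np.trapz(integrand, grid)

# Usage: fit an exponential by maximum likelihood, then score the fit.
rng = np.random.default_rng(0)
sample = rng.exponential(scale=2.0, size=1000)
scale_mle = sample.mean()  # MLE of the exponential scale parameter
grid = np.linspace(0.0, sample.max(), 2000)
score = esjs_sketch(sample, lambda x: np.exp(-x / scale_mle), grid)
# A well-fitting model yields a score close to zero; a poorly fitting
# model yields a larger value, so the score can rank candidate fits.
```

Because the divergence is computed nonparametrically from the empirical survival function, the same scoring routine applies unchanged to any candidate parametric family, which is the generality the abstract emphasises.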