Empirical Survival Jensen-Shannon Divergence as a Goodness-of-Fit Measure for Maximum Likelihood Estimation and Curve Fitting

Published 28 Sep 2018 in stat.ME, physics.data-an, and q-fin.ST (arXiv:1809.11052v5)

Abstract: The coefficient of determination, $R^2$, is commonly used as a goodness-of-fit criterion when fitting linear models. $R^2$ is somewhat controversial when fitting nonlinear models, although it may be generalised on a case-by-case basis to deal with specific models such as the logistic model. Suppose we are fitting a parametric distribution to a data set using, say, the maximum likelihood estimation method. A general approach to measuring the goodness-of-fit of the fitted parameters, which we advocate herein, is to use a nonparametric measure to compare the empirical distribution, comprising the raw data, with the fitted model. For this purpose we put forward the Survival Jensen-Shannon divergence ($SJS$) and its empirical counterpart (${\cal E}SJS$): a bounded metric that is a natural generalisation of the Jensen-Shannon divergence. We demonstrate, via a straightforward procedure making use of the ${\cal E}SJS$, that it can be used as part of maximum likelihood estimation or curve fitting as a measure of goodness-of-fit, including the construction of a confidence interval for the fitted parametric distribution. We further validate the proposed method on simulated data and three empirical data sets.
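The workflow the abstract describes can be sketched as follows. This is an illustrative, discretised approximation rather than the paper's exact definition: it plugs survival functions in place of probability densities into the Jensen-Shannon formula and averages the pointwise terms over a grid, whereas the paper's integral form and normalisation may differ. The `esjs` helper, the evaluation grid, and the exponential toy example are our own assumptions for illustration.

```python
import numpy as np

def empirical_survival(data, grid):
    """Empirical survival function S(x) = P(X > x) evaluated on a grid."""
    data = np.sort(np.asarray(data))
    return 1.0 - np.searchsorted(data, grid, side="right") / data.size

def esjs(s1, s2):
    """Discretised Jensen-Shannon-style divergence between two survival
    curves: the JS terms are computed pointwise, with the survival values
    standing in for densities, and averaged over the grid (a sketch only;
    the paper's integral form and normalisation may differ)."""
    s1, s2 = np.asarray(s1, float), np.asarray(s2, float)
    m = 0.5 * (s1 + s2)  # pointwise mixture of the two survival curves

    def half(s):
        # s * log2(s / m), with the convention 0 * log(0) = 0
        with np.errstate(divide="ignore", invalid="ignore"):
            t = s * np.log2(s / m)
        return np.where(s > 0, t, 0.0)

    return 0.5 * float(np.mean(half(s1) + half(s2)))

# Toy goodness-of-fit check: exponential data versus its MLE fit.
rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=500)
grid = np.linspace(0.0, data.max(), 200)

s_emp = empirical_survival(data, grid)
scale_hat = data.mean()               # MLE of the exponential scale
s_fit = np.exp(-grid / scale_hat)     # fitted parametric survival function

score = esjs(s_emp, s_fit)            # small value indicates a good fit
```

A smaller `score` means the fitted survival curve tracks the empirical one more closely, which is the sense in which the ${\cal E}SJS$ serves as a goodness-of-fit measure; comparing this value against zero (a perfect fit) or against scores for competing fitted distributions mirrors the procedure the abstract outlines.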
