Tunable robustness in power-law inference

Published 13 Jan 2023 in stat.ME (arXiv:2301.05690v1)

Abstract: Power-law probability distributions arise often in the social and natural sciences. Statistics have been developed for estimating the exponent parameter as well as for gauging goodness-of-fit to a power law. Yet, paradoxically, many famous power laws such as the distribution of wealth and earthquake magnitudes have found little statistical support in data when tested with modern methods. We show that measurement errors such as quantization and noise bias both maximum-likelihood estimators and goodness-of-fit measures. We address this issue using logarithmic binning and the corresponding discrete reference distribution for maximum-likelihood estimators and Kolmogorov-Smirnov statistics. Using simulated errors, we validate that binning attenuates bias in parameter estimates and recalibrates goodness of fit to a power law by removing small errors from consideration. These benefits come at a modest cost in statistical power, which can be compensated for with larger sample sizes. We reanalyse three empirical cases of wealth, earthquake magnitudes, and wildfire area, and show that binning reverses the statistical conclusions, aligning them with historical and scientific expectations. Through these cases we explain how routine errors lead to incorrect conclusions and why more robust methods are necessary.
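As a concrete illustration of the bias mechanism and the binning remedy, here is a minimal Python sketch (not the authors' code; the quantization scheme, the binning ratio, and the geometric-fit form of the binned estimator are illustrative assumptions). It draws continuous power-law samples, rounds them to integers to mimic quantization error, and compares the standard continuous maximum-likelihood exponent estimate with one fit to logarithmically binned data; under a pure power law the bin indices are geometrically distributed, which gives a closed-form binned MLE.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_power_law(n, alpha, xmin=1.0):
    """Draw continuous samples with pdf proportional to x**(-alpha), x >= xmin."""
    u = rng.random(n)
    return xmin * (1.0 - u) ** (-1.0 / (alpha - 1.0))

def mle_continuous(x, xmin=1.0):
    """Standard continuous power-law MLE for the exponent (Hill-type estimator)."""
    return 1.0 + len(x) / np.sum(np.log(x / xmin))

def mle_log_binned(x, xmin=1.0, ratio=10.0):
    """Exponent MLE from logarithmically binned data.

    Bin k covers [xmin * ratio**k, xmin * ratio**(k+1)). Under a pure power
    law the bin index is geometric with q = ratio**-(alpha - 1), so fitting
    the exponent reduces to the geometric MLE on the bin indices.
    """
    k = np.floor(np.log(x / xmin) / np.log(ratio) + 1e-12)  # guard FP edges
    m = k.mean()
    q_hat = m / (1.0 + m)                  # geometric MLE for q
    return 1.0 - np.log(q_hat) / np.log(ratio)

alpha_true, n = 2.5, 50_000
x = sample_power_law(n, alpha_true)
x_q = np.maximum(np.round(x), 1.0)         # quantization: round to integers

print(f"continuous MLE, clean data : {mle_continuous(x):.3f}")
print(f"continuous MLE, quantized  : {mle_continuous(x_q):.3f}")  # biased upward
print(f"log-binned MLE, quantized  : {mle_log_binned(x_q):.3f}")  # near 2.5
```

In this toy setup, rounding collapses many observations onto the lower cutoff, shrinking the log-sum and inflating the continuous estimate, while bins spanning a full decade absorb the same errors at the cost of a larger standard error, mirroring the power-for-robustness trade-off the abstract describes. The bin ratio must be wide relative to the error scale for this attenuation to hold.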
