Tunable robustness in power-law inference (2301.05690v1)

Published 13 Jan 2023 in stat.ME

Abstract: Power-law probability distributions arise often in the social and natural sciences. Statistics have been developed for estimating the exponent parameter as well as for gauging goodness-of-fit to a power law. Yet paradoxically, many famous power laws such as the distribution of wealth and earthquake magnitudes have not found good statistical support in data by modern methods. We show that measurement errors such as quantization and noise bias both maximum-likelihood estimators and goodness-of-fit measures. We address this issue using logarithmic binning and the corresponding discrete reference distribution for maximum-likelihood estimators and Kolmogorov-Smirnov statistics. Using simulated errors, we validate that binning attenuates bias in parameter estimates and recalibrates goodness of fit to a power law by removing small errors from consideration. These benefits come at a modest cost in statistical power, which can be compensated with larger sample sizes. We reanalyse three empirical cases of wealth, earthquake magnitudes and wildfire area, and show that binning reverses the statistical conclusions and aligns them with historical and scientific expectations. We explain through these cases how routine errors lead to incorrect conclusions and why more robust methods are necessary.

Citations (2)
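
The abstract contrasts the standard continuous maximum-likelihood estimator for the power-law exponent with an estimator fit to logarithmically binned counts under the corresponding discrete reference distribution. The sketch below is an illustrative reconstruction of that contrast, not the paper's code: the function names, the bin ratio c=2, the search bounds on alpha, and the rounding-based quantization "error" are all assumptions made for the demo.

```python
# Minimal sketch (assumed implementation, not the paper's reference code) of
# the two estimators contrasted in the abstract: the standard continuous MLE
# for the power-law exponent, and a log-binned MLE that fits geometric-bin
# counts to the discrete distribution implied by the bins.
import numpy as np
from scipy.optimize import minimize_scalar

def alpha_mle_continuous(x, x_min):
    """Standard continuous MLE: alpha_hat = 1 + n / sum(ln(x_i / x_min))."""
    x = np.asarray(x, dtype=float)
    x = x[x >= x_min]
    return 1.0 + x.size / np.log(x / x_min).sum()

def alpha_mle_log_binned(x, x_min, c=2.0, n_bins=20):
    """Log-binned MLE: maximize the multinomial likelihood of the counts in
    geometric bins under the binned power-law probabilities p_k(alpha)."""
    x = np.asarray(x, dtype=float)
    x = x[x >= x_min]
    edges = x_min * c ** np.arange(n_bins + 1)   # geometric (log-spaced) edges
    counts, _ = np.histogram(x, bins=edges)

    def neg_log_lik(alpha):
        tail = (edges / x_min) ** (1.0 - alpha)  # power-law survival at edges
        p = tail[:-1] - tail[1:]                 # P(edge_k <= X < edge_{k+1})
        p /= p.sum()                             # condition on the binned range
        return -(counts * np.log(np.clip(p, 1e-300, None))).sum()

    return minimize_scalar(neg_log_lik, bounds=(1.01, 6.0), method="bounded").x

# Synthetic check: sample a pure power law by inverse transform, then mimic a
# quantization "measurement error" by rounding to one decimal place.
rng = np.random.default_rng(0)
alpha_true, x_min = 2.5, 1.0
x = x_min * rng.uniform(size=50_000) ** (-1.0 / (alpha_true - 1.0))
x_quantized = np.round(x, 1)

print("continuous MLE:", alpha_mle_continuous(x_quantized, x_min))
print("log-binned MLE:", alpha_mle_log_binned(x_quantized, x_min))
```

Comparing the two printed estimates against alpha_true illustrates the mechanism the abstract describes: quantization distorts the individual log terms entering the continuous MLE, while bins much wider than the quantization step absorb that error, at the price of discarding within-bin information (the modest loss of statistical power noted above).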
