
Optimal-order bounds on the rate of convergence to normality for maximum likelihood estimators (1601.02177v3)

Published 10 Jan 2016 in math.ST and stat.TH

Abstract: It is well known that under general regularity conditions the distribution of the maximum likelihood estimator (MLE) is asymptotically normal. Very recently, bounds of the optimal order $O(1/\sqrt n)$ on the closeness of the distribution of the MLE to normality in the so-called bounded Wasserstein distance were obtained, where $n$ is the sample size. However, the corresponding bounds on the Kolmogorov distance were only of the order $O(1/n^{1/4})$. In this note, bounds of the optimal order $O(1/\sqrt n)$ on the closeness of the distribution of the MLE to normality in the Kolmogorov distance are given, as well as their nonuniform counterparts, which work better for large deviations of the MLE. These results are based on previously obtained general optimal-order bounds on the rate of convergence to normality in the multivariate delta method. The crucial observation is that, under natural conditions, the MLE can be tightly enough bracketed between two smooth enough functions of the sum of independent random vectors, which makes the delta method applicable.
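As a rough illustration of the kind of statement involved (a generic sketch only: the notation $I(\theta)$, $\hat\theta_n$, $\Phi$, and the constant $C$ are assumed conventions, not the paper's exact formulation), an optimal-order Kolmogorov-distance bound for a scalar parameter can be written as

% Kolmogorov (sup-norm) distance between distribution functions:
%   d_K(F, G) = \sup_x |F(x) - G(x)|.
% For a regular model with Fisher information I(\theta) and MLE \hat\theta_n,
% an O(1/\sqrt n) rate of convergence to normality takes the schematic form
\[
  \sup_{x \in \mathbb{R}}
  \Bigl| \mathbb{P}\bigl( \sqrt{n\, I(\theta)}\,(\hat\theta_n - \theta) \le x \bigr) - \Phi(x) \Bigr|
  \;\le\; \frac{C}{\sqrt{n}},
\]
% where \Phi is the standard normal CDF and C does not depend on n.
% A nonuniform counterpart replaces the constant C by a factor that decays in |x|,
% which is what gives better control for large deviations of the MLE.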
