Principles of Statistical Inference in Online Problems (2209.05399v2)
Abstract: To investigate a dilemma of statistical and computational efficiency faced by long-run variance estimators, we propose a decomposition of kernel weights in a quadratic form and some online inference principles. These proposals allow us to characterize efficient online long-run variance estimators. Our asymptotic theory and simulations show that this principle-driven approach leads to online estimators with a uniformly lower mean squared error than all existing works. We also discuss practical enhancements such as mini-batch and automatic updates to handle fast streaming data and optimal parameter tuning. Beyond variance estimation, we consider the proposals in the context of online quantile regression, online change point detection, Markov chain Monte Carlo convergence diagnosis, and stochastic approximation. Substantial improvements in computational cost and finite-sample statistical properties are observed when we apply our principle-driven variance estimator to original and modified inference procedures.
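The abstract contrasts the proposed principle-driven estimator with existing online long-run variance estimators. The paper's kernel-weight decomposition is not reproduced here; as a point of reference, the sketch below implements a classical non-overlapping batch-means long-run variance estimator that can be updated as data stream in. The batch size, the `BatchMeansLRV` class name, and the simulated AR(1) stream are illustrative assumptions, not the paper's construction.

```python
# Minimal sketch (assumption: NOT the paper's estimator) of an online
# long-run variance estimator via non-overlapping batch means.
import numpy as np


class BatchMeansLRV:
    """Online batch-means estimator of the long-run variance sigma^2."""

    def __init__(self, batch_size: int):
        self.b = batch_size          # batch length (tuning parameter)
        self.n = 0                   # observations seen so far
        self.total = 0.0             # running sum of all observations
        self.batch_sum = 0.0         # sum within the current batch
        self.batch_means = []        # means of completed batches

    def update(self, x: float) -> None:
        """Consume one streaming observation in O(1) time."""
        self.n += 1
        self.total += x
        self.batch_sum += x
        if self.n % self.b == 0:     # batch complete: record its mean
            self.batch_means.append(self.batch_sum / self.b)
            self.batch_sum = 0.0

    def estimate(self) -> float:
        """Batch-means estimate: b * sample variance of batch means."""
        k = len(self.batch_means)
        if k < 2:
            return float("nan")
        grand_mean = self.total / self.n
        means = np.asarray(self.batch_means)
        return self.b * np.sum((means - grand_mean) ** 2) / (k - 1)


# Illustrative usage on a simulated AR(1) stream with phi = 0.5,
# whose long-run variance is 1 / (1 - 0.5)**2 = 4.
rng = np.random.default_rng(0)
est, x = BatchMeansLRV(batch_size=50), 0.0
for _ in range(10_000):
    x = 0.5 * x + rng.standard_normal()
    est.update(x)
print(est.estimate())   # roughly 4
```

Batch means is a convenient baseline because each observation is absorbed in constant time; the paper's contribution is to characterize estimators that improve on this kind of statistical/computational trade-off.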