
Statistical inference of high-dimensional vector autoregressive time series with non-i.i.d. innovations (2310.07364v1)

Published 11 Oct 2023 in math.ST, stat.ME, and stat.TH

Abstract: The assumption of independent or i.i.d. innovations is standard in the literature on analyzing vector time series. However, this assumption is either too restrictive for real-life time series to satisfy or hard to verify through a hypothesis test. This paper performs statistical inference on a sparse high-dimensional vector autoregressive time series, allowing its white noise innovations to be dependent and even non-stationary. To achieve this goal, it adopts a post-selection estimator to fit the vector autoregressive model and derives the asymptotic distribution of the post-selection estimator. Because the innovations are not assumed to be independent, the covariance matrices of the autoregressive coefficient estimators are complex and difficult to estimate. The work therefore develops a bootstrap algorithm so that practitioners can perform statistical inference without sophisticated calculations. Simulations and real-life data experiments demonstrate the validity of the proposed methods and theoretical results. Real-life data rarely satisfies an autoregressive model with independent or i.i.d. innovations exactly, so this work should reflect reality better than literature that assumes i.i.d. innovations.
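
A minimal illustrative sketch (not the authors' procedure): the paper concerns a sparse VAR model X_t = A_1 X_{t-1} + ... + A_p X_{t-p} + e_t with possibly dependent, non-stationary innovations e_t, fit by a post-selection estimator and accompanied by a bootstrap for inference. The Python snippet below mimics that workflow generically for p = 1, using lasso screening followed by an OLS refit as a stand-in "post-selection" estimator and a moving-block residual bootstrap as a stand-in for the paper's bootstrap algorithm; every step and tuning value here is an illustrative assumption, not the method from the paper.

# Illustrative sketch only: generic lasso-screen + OLS-refit "post-selection"
# fit of a VAR(1), with a moving-block residual bootstrap interval for one
# coefficient. NOT the paper's algorithm; all choices below are assumptions.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
T, d = 400, 8
A = 0.5 * np.eye(d)                                   # sparse true VAR(1) coefficient matrix
X = np.zeros((T, d))
for t in range(1, T):
    scale = 1.0 + 0.5 * np.sin(t / 50.0)              # slowly varying noise scale (non-stationary innovations)
    X[t] = X[t - 1] @ A.T + scale * rng.standard_normal(d)

Y, Z = X[1:], X[:-1]                                  # regress X_t on X_{t-1}, equation by equation

def post_selection_fit(y, Z):
    """Lasso variable screening, then OLS refit on the selected set."""
    sel = np.flatnonzero(Lasso(alpha=0.1).fit(Z, y).coef_)
    beta = np.zeros(Z.shape[1])
    if sel.size:
        beta[sel] = LinearRegression(fit_intercept=False).fit(Z[:, sel], y).coef_
    return beta

B_hat = np.vstack([post_selection_fit(Y[:, j], Z) for j in range(d)])
resid = Y - Z @ B_hat.T

def block_bootstrap_ci(n_boot=100, block=20):
    """Moving-block bootstrap of residuals: resample blocks, regenerate the
    series from B_hat, refit, and collect the (0, 0) coefficient."""
    n = resid.shape[0]
    starts = np.arange(n - block + 1)
    stats = []
    for _ in range(n_boot):
        picks = rng.choice(starts, n // block + 1)
        idx = np.concatenate([np.arange(s, s + block) for s in picks])[:n]
        e_star = resid[idx]
        X_star = np.zeros((n + 1, d))
        for t in range(1, n + 1):
            X_star[t] = X_star[t - 1] @ B_hat.T + e_star[t - 1]
        B_star = np.vstack([post_selection_fit(X_star[1:, j], X_star[:-1]) for j in range(d)])
        stats.append(B_star[0, 0])
    return np.percentile(stats, [2.5, 97.5])

print("Bootstrap 95% interval for A[0, 0]:", block_bootstrap_ci())

Blocks of residuals, rather than individual residuals, are resampled here so that the bootstrap can pick up short-range dependence in the innovations; the block length and the number of bootstrap replications are arbitrary placeholder values.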
