Robust Estimation of Sparse, High Dimensional Time Series with Polynomial Tails

Published 14 Nov 2022 in math.ST and stat.TH | arXiv:2211.07558v1

Abstract: High dimensional Vector Autoregressions (VARs) have received considerable recent interest due to novel applications in health, engineering, finance and the social sciences. Three issues arise when analyzing VARs: (a) the high dimensional nature of the model in the presence of many time series, which poses challenges for consistent estimation of its parameters; (b) the presence of temporal dependence, which introduces additional challenges for the theoretical analysis of estimation procedures; and (c) the presence of heavy tails in a number of applications. Recent work, e.g. [Basu and Michailidis, 2015], [Kock and Callot, 2015], has addressed consistent estimation of sparse high dimensional, stable Gaussian VAR models based on an $\ell_1$ (LASSO) procedure. Further, the rates obtained are optimal, in the sense that they match those for iid data, up to a multiplicative factor (the "price" paid) for temporal dependence. However, the third issue remains unaddressed in the extant literature. This paper extends existing results in the following important direction: it considers consistent estimation of the parameters of sparse, high dimensional VAR models driven by heavy tailed homoscedastic or heteroscedastic noise processes (which do not possess all moments). A robust penalized approach (e.g., LASSO) is adopted, for which optimal consistency rates and corresponding finite sample bounds for the underlying model parameters are obtained that match those for iid data, albeit paying a price for temporal dependence. The theoretical results are illustrated on VAR models and also on other popular time series models. Notably, the key technical tool is a single concentration bound for heavy tailed, dependent processes.
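The setting in the abstract can be illustrated numerically. The sketch below is not the paper's robust estimator: it simulates a sparse, stable VAR(1) driven by heavy-tailed (Student-t, 3 degrees of freedom) noise, then fits the transition matrix with a plain $\ell_1$-penalized least squares (Lasso) solved by ISTA (proximal gradient). All dimensions, the transition matrix, and the penalty level are illustrative choices, not values from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_var(X, Y, lam, n_iter=500):
    """Estimate B in Y ~ X B by ISTA on (1/2n)||Y - X B||_F^2 + lam * ||B||_1.
    For a VAR(1), rows of X are lagged observations and B estimates A^T."""
    n = X.shape[0]
    # Step size 1/L, where L is the Lipschitz constant of the smooth part's gradient
    L = np.linalg.eigvalsh(X.T @ X / n).max()
    eta = 1.0 / L
    B = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        grad = -X.T @ (Y - X @ B) / n   # gradient of the least-squares term
        B = soft_threshold(B - eta * grad, eta * lam)
    return B

# Sanity check on an identity design: the minimizer is soft-thresholded Y
B_check = lasso_var(np.eye(4), 2.0 * np.eye(4), lam=0.25, n_iter=10)

# Simulate a sparse, stable VAR(1) with heavy-tailed innovations (t with 3 df,
# so the noise has finite variance but not all moments)
rng = np.random.default_rng(0)
p, T = 8, 400
A_true = 0.5 * np.eye(p)             # sparse transition matrix, spectral radius 0.5
x = np.zeros((T + 1, p))
for t in range(T):
    x[t + 1] = A_true @ x[t] + rng.standard_t(3, size=p)
X_lag, Y_cur = x[:-1], x[1:]
B_hat = lasso_var(X_lag, Y_cur, lam=0.05)   # B_hat estimates A_true^T
```

Vanilla Lasso is used here only to make the estimation problem concrete; the paper's contribution is precisely a robust penalized procedure, with accompanying rates, for the case where such heavy-tailed noise breaks the sub-Gaussian assumptions behind standard Lasso analyses.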
