$\ell_1$-Regularized Generalized Least Squares (2405.10719v1)
Abstract: In this paper we propose an $\ell_1$-regularized GLS estimator for high-dimensional regressions with potentially autocorrelated errors. We establish non-asymptotic oracle inequalities for estimation accuracy in a framework that allows for highly persistent autoregressive errors. Since the whitening matrix required to implement the GLS is unknown in practice, we present a feasible estimator for this matrix, derive consistency results, and ultimately show how our proposed feasible GLS can closely recover the optimal performance of the LASSO (as if the errors were white noise). A simulation study verifies the performance of the proposed method, demonstrating that the penalized (feasible) GLS-LASSO estimator performs on par with the LASSO under white noise errors, while outperforming it in terms of sign recovery and estimation error when the errors exhibit significant correlation.
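The feasible GLS-LASSO pipeline described in the abstract (run a pilot lasso, estimate the error autocorrelation from the residuals, whiten the data, and refit) can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's implementation: it assumes AR(1) errors, uses Cochrane-Orcutt-style quasi-differencing as the whitening step, and substitutes a plain ISTA solver for whatever lasso routine the authors use; all function names and tuning values are hypothetical.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding operator, the prox of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=1000):
    """Lasso via ISTA (proximal gradient) for (1/2n)||y - Xb||^2 + lam*||b||_1.
    A stand-in for any lasso solver."""
    n, p = X.shape
    beta = np.zeros(p)
    step = n / np.linalg.norm(X, 2) ** 2  # 1/L, L = ||X||^2 / n
    for _ in range(n_iter):
        grad = -X.T @ (y - X @ beta) / n
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

def gls_lasso(X, y, lam):
    """Feasible GLS-LASSO sketch under an assumed AR(1) error structure."""
    # Step 1: pilot lasso on the raw (autocorrelated) data.
    beta0 = lasso_ista(X, y, lam)
    resid = y - X @ beta0
    # Step 2: estimate the AR(1) coefficient from the pilot residuals.
    rho = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])
    # Step 3: whiten by quasi-differencing (Cochrane-Orcutt style;
    # the first observation is dropped rather than Prais-Winsten scaled).
    y_w = y[1:] - rho * y[:-1]
    X_w = X[1:] - rho * X[:-1]
    # Step 4: lasso on the whitened data, where errors are close to white noise.
    return lasso_ista(X_w, y_w, lam), rho
```

With strongly autocorrelated errors, the refit on the quasi-differenced data faces an approximately white-noise error term, which is the mechanism by which the feasible GLS-LASSO recovers lasso-with-white-noise performance.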