VCformer: Variable Correlation Transformer with Inherent Lagged Correlation for Multivariate Time Series Forecasting (2405.11470v1)
Abstract: Multivariate time series (MTS) forecasting has been widely applied across diverse domains, such as weather prediction and energy consumption. However, current studies still rely on the vanilla point-wise self-attention mechanism to capture cross-variable dependencies, which is inadequate for extracting the intricate cross-correlations among variables. To fill this gap, we propose the Variable Correlation Transformer (VCformer), which uses a Variable Correlation Attention (VCA) module to mine correlations among variables. Specifically, grounded in stochastic process theory, VCA computes and integrates cross-correlation scores at different lags between queries and keys, enhancing its ability to uncover multivariate relationships. Additionally, inspired by Koopman dynamics theory, we develop a Koopman Temporal Detector (KTD) to better handle non-stationarity in time series. Together, these two components enable VCformer to extract both multivariate correlations and temporal dependencies. Extensive experiments on eight real-world datasets demonstrate the effectiveness of VCformer, which achieves top-tier performance compared with state-of-the-art baselines. Code is available at: https://github.com/CSyyn/VCformer.
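To make the lagged-correlation idea concrete, the following is a minimal NumPy sketch of computing normalized cross-correlation scores between a "query" series and a "key" series at several lags, the statistical quantity the VCA module is described as integrating. The function name, shapes, and normalization are illustrative assumptions, not the paper's actual implementation (which operates on learned query/key representations inside attention).

```python
import numpy as np

def lagged_cross_correlation(q, k, max_lag):
    """Normalized cross-correlation between 1-D series q and k at lags 0..max_lag-1.

    Hypothetical sketch of the lagged-correlation score described in the
    abstract; the real VCA module works on learned queries/keys per variable.
    """
    T = len(q)
    # Standardize both series so scores are comparable across lags.
    q = (q - q.mean()) / (q.std() + 1e-8)
    k = (k - k.mean()) / (k.std() + 1e-8)
    # Score at lag tau: correlation of q with k shifted (circularly) by tau.
    return np.array([np.dot(q, np.roll(k, lag)) / T for lag in range(max_lag)])

# Example: a series correlates perfectly with itself at lag 0.
t = np.linspace(0, 4 * np.pi, 64)
scores = lagged_cross_correlation(np.sin(t), np.sin(t), max_lag=8)
```

In practice one would aggregate such per-lag scores (e.g., a weighted sum over lags) into a single attention score per variable pair, which is the "integration" step the abstract alludes to.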