
MCformer: Multivariate Time Series Forecasting with Mixed-Channels Transformer (2403.09223v1)

Published 14 Mar 2024 in cs.LG and eess.SP

Abstract: The massive generation of time-series data by large-scale Internet of Things (IoT) devices necessitates the exploration of more effective models for multivariate time-series forecasting. Previous models predominantly used the Channel Dependence (CD) strategy (where each channel represents a univariate sequence). Current state-of-the-art (SOTA) models primarily rely on the Channel Independence (CI) strategy. The CI strategy treats all channels as a single channel, expanding the dataset to improve generalization performance and avoiding inter-channel correlation that disrupts long-term features. However, the CI strategy faces the challenge of inter-channel correlation forgetting. To address this issue, we propose an innovative Mixed Channels strategy, combining the data expansion advantages of the CI strategy with the ability to counteract inter-channel correlation forgetting. Based on this strategy, we introduce MCformer, a multivariate time-series forecasting model with mixed channel features. The model blends a specific number of channels, leveraging an attention mechanism to effectively capture inter-channel correlation information when modeling long-term features. Experimental results demonstrate that the Mixed Channels strategy outperforms the pure CI strategy in multivariate time-series forecasting tasks.
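The contrast between the two batching strategies can be made concrete. The sketch below illustrates, under assumptions, how a CI batch and a mixed-channels batch might be constructed from a multivariate series: the exact mixing rule used by MCformer is not specified in the abstract, so the cyclic "next m channels" grouping here is purely an illustrative choice.

```python
import numpy as np

def ci_batch(x):
    # Channel Independence (CI): each channel becomes its own
    # univariate sample, expanding the dataset C-fold.
    # x: (C, T) multivariate series -> (C, 1, T) batch
    return x[:, None, :]

def mixed_channel_batch(x, m):
    # Illustrative Mixed-Channels batch (assumed form; the paper's
    # actual mixing rule is not given in the abstract): each channel
    # is grouped with the next m channels, cyclically. This keeps
    # CI's C-fold data expansion while exposing cross-channel
    # context that an attention layer can attend over.
    C, T = x.shape
    idx = (np.arange(C)[:, None] + np.arange(m + 1)[None, :]) % C  # (C, m+1)
    return x[idx]  # (C, m+1, T): one mixed sample per channel

rng = np.random.default_rng(0)
x = rng.standard_normal((7, 96))        # 7 channels, 96 timesteps
print(ci_batch(x).shape)                # (7, 1, 96)
print(mixed_channel_batch(x, 3).shape)  # (7, 4, 96)
```

With m = 0 the mixed batch degenerates to CI, which matches the abstract's framing of Mixed Channels as a generalization that restores inter-channel information on top of CI's data-expansion benefit.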
