Adaptive Convolutional Forecasting Network Based on Time Series Feature-Driven (2405.12038v2)

Published 20 May 2024 in cs.LG and cs.IR

Abstract: Time series data in real-world scenarios contain a substantial amount of nonlinear information, which significantly interferes with the training process of models, leading to decreased prediction performance. Therefore, during the time series forecasting process, extracting the local and global time series patterns and understanding the potential nonlinear features among different time observations are highly significant. To address this challenge, we introduce multi-resolution convolution and deformable convolution operations. By enlarging the receptive field using convolution kernels with different dilation factors to capture temporal correlation information at different resolutions, and adaptively adjusting the sampling positions through additional offset vectors, we enhance the network's ability to capture potential nonlinear features among time observations. Building upon this, we propose ACNet, an adaptive convolutional network designed to effectively model the local and global temporal dependencies and the nonlinear features between observations in multivariate time series. Specifically, by extracting and fusing time series features at different resolutions, we capture both local contextual information and global patterns in the time series. The designed nonlinear feature adaptive extraction module captures the nonlinear features among different time observations in the time series. We evaluated the performance of ACNet across twelve real-world datasets. The results indicate that ACNet consistently achieves state-of-the-art performance in both short-term and long-term forecasting tasks with favorable runtime efficiency.
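The abstract combines two convolution variants: dilated convolutions, whose taps are spaced further apart to enlarge the receptive field at different resolutions, and deformable convolutions, which shift each sampling position by a learned fractional offset. The following is a minimal pure-Python sketch of these two 1-D operations, not the authors' implementation; the function names, the summing kernel, and the fixed offsets are illustrative assumptions.

```python
# Sketch (not the paper's code) of the two operations ACNet builds on:
# dilated 1-D convolution for multi-resolution receptive fields, and
# deformable sampling that shifts each tap by a fractional offset.

def dilated_conv1d(x, kernel, dilation):
    """'Valid' 1-D convolution whose taps are `dilation` steps apart."""
    k = len(kernel)
    span = (k - 1) * dilation  # receptive-field width minus one
    return [
        sum(kernel[j] * x[t + j * dilation] for j in range(k))
        for t in range(len(x) - span)
    ]

def deformable_conv1d(x, kernel, offsets):
    """Like a stride-1 conv, but tap j reads position t + j + offsets[j];
    fractional positions are linearly interpolated (clamped at edges)."""
    def sample(pos):
        pos = min(max(pos, 0.0), len(x) - 1.0)
        lo = int(pos)
        hi = min(lo + 1, len(x) - 1)
        frac = pos - lo
        return (1.0 - frac) * x[lo] + frac * x[hi]

    k = len(kernel)
    return [
        sum(kernel[j] * sample(t + j + offsets[j]) for j in range(k))
        for t in range(len(x) - (k - 1))
    ]

x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
k = [1.0, 1.0, 1.0]  # 3-tap summing kernel, for exact arithmetic

# Same kernel at two dilations -> two temporal resolutions.
local_view = dilated_conv1d(x, k, dilation=1)   # spans 3 steps
global_view = dilated_conv1d(x, k, dilation=3)  # spans 7 steps

# With all-zero offsets, deformable conv reduces to an ordinary conv.
plain = deformable_conv1d(x, k, offsets=[0.0, 0.0, 0.0])
```

In the actual network the offsets would be predicted by an auxiliary layer from the input itself, which is what lets the sampling positions adapt to nonlinear patterns between observations; here they are fixed constants purely to show the sampling mechanics.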

Authors (4)
  1. Dandan Zhang
  2. Yun Wang
  3. Zhiqiang Zhang
  4. Nanguang Chen
