Fi$^2$VTS: Time Series Forecasting Via Capturing Intra- and Inter-Variable Variations in the Frequency Domain (2407.21275v7)
Abstract: Time series forecasting (TSF) plays a crucial role in various applications, including medical monitoring and crop growth. Despite advances in deep learning methods for TSF, their capacity to predict long-term series remains constrained. This limitation arises from the failure to account for intra- and inter-variable variations simultaneously. To mitigate this challenge, we introduce the Fi$^2$VBlock, which leverages a \textbf{F}requency domain perspective to capture \textbf{i}ntra- and \textbf{i}nter-variable \textbf{V}ariations. After the input is transformed into the frequency domain via the Frequency Transform Module, a Frequency Cross Attention between the real and imaginary parts produces enhanced frequency representations and captures intra-variable variations. Furthermore, Inception blocks are employed to integrate information, thus capturing correlations across different variables. Our backbone network, Fi$^2$VTS, employs a residual architecture built by stacking multiple Fi$^2$VBlocks, thereby preventing degradation issues. Theoretically, we demonstrate that Fi$^2$VTS achieves a substantial reduction in both time and memory complexity, decreasing from $\mathcal{O}(L^2)$ to $\mathcal{O}(L)$ per Fi$^2$VBlock computation. Empirical evaluations reveal that Fi$^2$VTS outperforms other baselines on two benchmark datasets. The implementation code is accessible at \url{https://github.com/HITshenrj/Fi2VTS}.
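As a rough sketch of the frequency-domain view the Fi$^2$VBlock builds on (the paper's exact module design differs; `dft`, `cross_score`, and the per-bin interaction below are illustrative assumptions, not the authors' implementation), a series is first mapped to real and imaginary frequency components, and the interaction between those two parts is then used to form an enhanced frequency representation:

```python
import cmath

def dft(signal):
    """Naive discrete Fourier transform returning (real, imag) parts
    per frequency bin. Illustrative only: a practical implementation
    would use an FFT (O(L log L)) rather than this O(L^2) loop."""
    n = len(signal)
    real_part, imag_part = [], []
    for k in range(n):
        acc = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                  for t in range(n))
        real_part.append(acc.real)
        imag_part.append(acc.imag)
    return real_part, imag_part

def cross_score(real_part, imag_part):
    """Toy stand-in for a real/imaginary cross interaction: one weight
    per frequency bin, computed in O(L). Hypothetical, not the paper's
    Frequency Cross Attention formulation."""
    return [r * i for r, i in zip(real_part, imag_part)]

# One full cosine cycle over 4 samples: energy lands in bin 1 (and its
# conjugate bin 3), so the real part of bin 1 dominates.
signal = [1.0, 0.0, -1.0, 0.0]
re_bins, im_bins = dft(signal)
scores = cross_score(re_bins, im_bins)
```

The point of working per frequency bin is the complexity claim above: an interaction computed bin-by-bin costs $\mathcal{O}(L)$, whereas pairwise attention over all time steps costs $\mathcal{O}(L^2)$.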