WFTNet: Exploiting Global and Local Periodicity in Long-term Time Series Forecasting (2309.11319v2)
Abstract: Recent CNN- and Transformer-based models have tried to utilize frequency and periodicity information for long-term time series forecasting. However, most existing work is based on the Fourier transform, which cannot capture fine-grained, local frequency structure. In this paper, we propose a Wavelet-Fourier Transform Network (WFTNet) for long-term time series forecasting. WFTNet utilizes both Fourier and wavelet transforms to extract comprehensive temporal-frequency information from the signal, where the Fourier transform captures global periodic patterns and the wavelet transform captures local ones. Furthermore, we introduce a Periodicity-Weighted Coefficient (PWC) to adaptively balance the importance of global and local frequency patterns. Extensive experiments on various time series datasets show that WFTNet consistently outperforms other state-of-the-art baselines. Code is available at https://github.com/Hank0626/WFTNet.
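To make the global-versus-local idea concrete, below is a minimal, illustrative sketch (not the authors' implementation, which is at the GitHub link above): it extracts a global amplitude spectrum with the FFT, a local scalogram with a Morlet continuous wavelet transform, and blends them with a scalar weight. The helper names (`periodicity_weighted_coefficient`, `pwc`) and the energy-ratio formula are assumptions standing in for the paper's PWC.

```python
# Illustrative sketch (not the WFTNet code): combine a global FFT view with a
# local wavelet view of a 1-D series and blend them with a scalar weight.
# Names such as `periodicity_weighted_coefficient` are assumptions for illustration.
import numpy as np
import pywt  # PyWavelets, provides the continuous wavelet transform


def fft_global_spectrum(x: np.ndarray) -> np.ndarray:
    """Amplitude spectrum from the real FFT: captures global periodic patterns."""
    return np.abs(np.fft.rfft(x))


def wavelet_local_spectrum(x: np.ndarray, scales=None) -> np.ndarray:
    """Scalogram from a Morlet CWT: captures local, time-varying frequency content."""
    if scales is None:
        scales = np.arange(1, 64)
    coefs, _ = pywt.cwt(x, scales, "morl")
    return np.abs(coefs)  # shape: (num_scales, len(x))


def periodicity_weighted_coefficient(amp: np.ndarray) -> float:
    """Toy stand-in for the PWC: fraction of spectral energy in the strongest
    frequency bin. Near 1 -> strongly globally periodic, favor the Fourier branch;
    near 0 -> favor the wavelet branch."""
    amp = amp[1:]  # drop the DC component
    return float(amp.max() / (amp.sum() + 1e-8))


if __name__ == "__main__":
    t = np.arange(512)
    # A globally periodic sine plus a short local burst around t = 300.
    x = np.sin(2 * np.pi * t / 24) + (np.abs(t - 300) < 10) * np.sin(2 * np.pi * t / 4)

    amp = fft_global_spectrum(x)
    scalogram = wavelet_local_spectrum(x)
    pwc = periodicity_weighted_coefficient(amp)

    # A downstream forecaster could fuse the two representations as
    # pwc * fourier_features + (1 - pwc) * wavelet_features.
    print(f"PWC = {pwc:.3f}, scalogram shape = {scalogram.shape}")
```

In this toy setup, a purely sinusoidal input drives the weight toward the Fourier branch, while a signal dominated by transient bursts shifts weight toward the wavelet branch; the actual PWC in WFTNet is learned from the data rather than computed by this fixed ratio.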