WFTNet: Exploiting Global and Local Periodicity in Long-term Time Series Forecasting (2309.11319v2)

Published 20 Sep 2023 in cs.LG

Abstract: Recent CNN- and Transformer-based models have tried to exploit frequency and periodicity information for long-term time series forecasting. However, most existing work relies on the Fourier transform, which cannot capture fine-grained, local frequency structure. In this paper, we propose a Wavelet-Fourier Transform Network (WFTNet) for long-term time series forecasting. WFTNet utilizes both Fourier and wavelet transforms to extract comprehensive temporal-frequency information from the signal, where the Fourier transform captures global periodic patterns and the wavelet transform captures local ones. Furthermore, we introduce a Periodicity-Weighted Coefficient (PWC) to adaptively balance the importance of global and local frequency patterns. Extensive experiments on various time series datasets show that WFTNet consistently outperforms other state-of-the-art baselines. Code is available at https://github.com/Hank0626/WFTNet.
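
To make the abstract's core idea concrete, below is a minimal sketch of how a global (Fourier) branch and a local (wavelet) branch could be fused with an adaptive weight in the spirit of the Periodicity-Weighted Coefficient. This is not the authors' implementation; the module name `GlobalLocalFrequencyMix`, the single-level Haar transform, and the `pwc_head` weighting are illustrative assumptions.

```python
# Illustrative sketch only: blends a Fourier (global) and wavelet (local)
# representation with a learned, data-dependent coefficient.
import torch
import torch.nn as nn


def haar_dwt(x: torch.Tensor):
    """One level of a Haar wavelet transform along the last (time) axis.

    Returns (approximation, detail); the detail band carries the local,
    fine-grained frequency content.
    """
    x_even, x_odd = x[..., 0::2], x[..., 1::2]
    approx = (x_even + x_odd) / 2 ** 0.5
    detail = (x_even - x_odd) / 2 ** 0.5
    return approx, detail


class GlobalLocalFrequencyMix(nn.Module):
    """Weighted fusion of global (Fourier) and local (wavelet) features,
    loosely following the PWC idea described in the abstract."""

    def __init__(self, seq_len: int, d_model: int):
        super().__init__()
        n_freq = seq_len // 2 + 1
        self.fourier_proj = nn.Linear(n_freq, d_model)        # global periodicity
        self.wavelet_proj = nn.Linear(seq_len // 2, d_model)  # local structure
        self.pwc_head = nn.Linear(2 * d_model, 1)             # adaptive weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len), a univariate series for simplicity
        spectrum = torch.fft.rfft(x, dim=-1).abs()   # global frequency spectrum
        _, detail = haar_dwt(x)                      # local detail coefficients
        g = self.fourier_proj(spectrum)
        l = self.wavelet_proj(detail)
        alpha = torch.sigmoid(self.pwc_head(torch.cat([g, l], dim=-1)))
        return alpha * g + (1 - alpha) * l           # PWC-style weighted fusion


if __name__ == "__main__":
    x = torch.randn(8, 96)                           # batch of 8 length-96 series
    mix = GlobalLocalFrequencyMix(seq_len=96, d_model=64)
    print(mix(x).shape)                              # torch.Size([8, 64])
```

In this sketch the weight alpha is produced per sample, so series dominated by a strong global period lean on the Fourier branch while series with transient, local patterns lean on the wavelet branch; the released code at the repository above should be consulted for the actual WFTNet architecture.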
