WEITS: A Wavelet-enhanced residual framework for interpretable time series forecasting (2405.10877v1)
Abstract: Time series (TS) forecasting has become an increasingly popular problem in recent years, with ubiquitous applications in both scientific and business domains. A wide range of approaches has been introduced for time series analysis, spanning both statistical methods and deep neural networks. Although neural network approaches have demonstrated stronger representational ability than statistical methods, they struggle to provide sufficient interpretability and can be difficult to optimize. In this paper, we present WEITS, a frequency-aware deep learning framework that is highly interpretable and computationally efficient. Through multi-level wavelet decomposition, WEITS infuses frequency analysis into a deep learning framework. Combined with a forward-backward residual architecture, it enjoys both high representational capability and statistical interpretability. Extensive experiments on real-world datasets demonstrate the competitive performance of our model, along with the additional advantage of high computational efficiency. Furthermore, WEITS provides a general framework that can seamlessly integrate with state-of-the-art approaches for time series forecasting.
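The abstract describes the two ingredients of WEITS, multi-level wavelet decomposition and a forward-backward (backcast/forecast) residual architecture, without giving implementation details. The sketch below is only an illustration of how those two ideas can be combined, not the authors' code: the `db4` wavelet, the decomposition depth, the MLP widths, and the `WaveletResidualForecaster` class name are all assumptions made for the example. Each wavelet level is routed to a small block that emits a backcast (subtracted from the block input, the backward residual) and a forecast (summed into the final prediction, the forward residual).

```python
# Illustrative sketch (assumptions noted above), using PyWavelets and PyTorch.
import numpy as np
import pywt
import torch
import torch.nn as nn


def wavelet_levels(x, wavelet="db4", level=3):
    """Split a 1-D series into per-level components that sum back to x.

    Each component is reconstructed from a single set of wavelet
    coefficients, so adding all components recovers the input
    (up to floating-point error).
    """
    coeffs = pywt.wavedec(x, wavelet, level=level)
    parts = []
    for i in range(len(coeffs)):
        masked = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        parts.append(pywt.waverec(masked, wavelet)[: len(x)])
    return parts  # coarsest (trend) component first, finest detail last


class ResidualBlock(nn.Module):
    """Minimal backcast/forecast block (N-BEATS-style doubly residual unit)."""

    def __init__(self, lookback, horizon, hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(lookback, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.backcast = nn.Linear(hidden, lookback)
        self.forecast = nn.Linear(hidden, horizon)

    def forward(self, x):
        h = self.body(x)
        return self.backcast(h), self.forecast(h)


class WaveletResidualForecaster(nn.Module):
    """One block per wavelet level; forecasts are summed across levels."""

    def __init__(self, lookback, horizon, n_levels=4, hidden=64):
        super().__init__()
        self.blocks = nn.ModuleList(
            [ResidualBlock(lookback, horizon, hidden) for _ in range(n_levels)]
        )

    def forward(self, level_inputs):
        # level_inputs: list of (batch, lookback) tensors, one per wavelet level
        forecast = 0.0
        for block, x in zip(self.blocks, level_inputs):
            backcast, f = block(x)
            residual = x - backcast   # backward residual; in a deeper stack this
                                      # would feed the next block at the same level
            forecast = forecast + f   # forward residual: accumulate the forecast
        return forecast


if __name__ == "__main__":
    lookback, horizon = 96, 24
    series = np.sin(np.linspace(0, 20, lookback)) + 0.1 * np.random.randn(lookback)
    parts = wavelet_levels(series, level=3)  # 4 components (1 approx + 3 details)
    inputs = [torch.tensor(p, dtype=torch.float32).unsqueeze(0) for p in parts]
    model = WaveletResidualForecaster(lookback, horizon, n_levels=len(parts))
    print(model(inputs).shape)  # torch.Size([1, 24])
```

Because the per-level components sum back to the original series, the decomposition step adds interpretability (each block's forecast can be read as the contribution of one frequency band) without losing information, which is the property the abstract attributes to the combination of wavelet analysis and residual stacking.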