Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting (2205.14415v4)
Abstract: Transformers have shown great power in time series forecasting due to their global-range modeling ability. However, their performance can degrade severely on non-stationary real-world data, in which the joint distribution changes over time. Previous studies primarily adopt stationarization to attenuate the non-stationarity of the original series for better predictability. However, stationarized series, deprived of their inherent non-stationarity, can be less instructive for forecasting real-world bursty events. This problem, termed over-stationarization in this paper, leads Transformers to generate indistinguishable temporal attentions for different series and impedes the predictive capability of deep models. To tackle the dilemma between series predictability and model capability, we propose Non-stationary Transformers as a generic framework with two interdependent modules: Series Stationarization and De-stationary Attention. Concretely, Series Stationarization unifies the statistics of each input and converts the output with restored statistics for better predictability. To address the over-stationarization problem, De-stationary Attention is devised to recover the intrinsic non-stationary information into temporal dependencies by approximating the distinguishable attentions learned from raw series. Our Non-stationary Transformers framework consistently boosts mainstream Transformers by a large margin, reducing MSE by 49.43% on Transformer, 47.34% on Informer, and 46.89% on Reformer, making them state-of-the-art in time series forecasting. Code is available at this repository: https://github.com/thuml/Nonstationary_Transformers.
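The two modules described in the abstract can be pictured concretely as a normalize-then-restore wrapper around the model plus a rescaled attention inside it. Below is a minimal PyTorch sketch of that idea, not the repository's implementation: the function names, tensor shapes, and the way `tau` and `delta` are supplied are illustrative assumptions (in the framework they are learned from the raw series' statistics, whereas here they are simply passed in).

```python
import torch
import torch.nn.functional as F

def stationarize(x, eps=1e-5):
    # Series Stationarization (sketch): normalize each input window by its
    # own mean/std so every series the model sees has unified statistics.
    # x: (batch, length, channels)
    mu = x.mean(dim=1, keepdim=True)
    sigma = torch.sqrt(x.var(dim=1, keepdim=True, unbiased=False) + eps)
    return (x - mu) / sigma, mu, sigma

def de_stationarize(y, mu, sigma):
    # Convert the model output back with the restored statistics.
    return y * sigma + mu

def de_stationary_attention(q, k, v, tau, delta):
    # De-stationary Attention (sketch): q, k, v are computed from the
    # stationarized series; the positive scale `tau` and shift `delta`
    # re-inject the non-stationary information so that attentions over
    # different raw series stay distinguishable. Shapes below are
    # illustrative assumptions: q, k, v: (batch, heads, length, d_k);
    # tau: (batch, 1, 1, 1); delta: (batch, 1, 1, length).
    d_k = q.size(-1)
    logits = (tau * (q @ k.transpose(-2, -1)) + delta) / d_k ** 0.5
    return F.softmax(logits, dim=-1) @ v
```

A model wrapped this way would call `stationarize` on the input, run the Transformer with `de_stationary_attention` in place of vanilla attention, and finish with `de_stationarize` on the output; per the abstract, `tau` and `delta` are learned from the raw, unnormalized series, which is what "approximating distinguishable attentions learned from raw series" refers to.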
Authors: Yong Liu, Haixu Wu, Jianmin Wang, Mingsheng Long