Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting (2106.13008v5)

Published 24 Jun 2021 in cs.LG and cs.AI

Abstract: Extending the forecasting time is a critical demand for real applications, such as extreme weather early warning and long-term energy consumption planning. This paper studies the long-term forecasting problem of time series. Prior Transformer-based models adopt various self-attention mechanisms to discover the long-range dependencies. However, intricate temporal patterns of the long-term future prohibit the model from finding reliable dependencies. Also, Transformers have to adopt the sparse versions of point-wise self-attentions for long series efficiency, resulting in the information utilization bottleneck. Going beyond Transformers, we design Autoformer as a novel decomposition architecture with an Auto-Correlation mechanism. We break with the pre-processing convention of series decomposition and renovate it as a basic inner block of deep models. This design empowers Autoformer with progressive decomposition capacities for complex time series. Further, inspired by the stochastic process theory, we design the Auto-Correlation mechanism based on the series periodicity, which conducts the dependencies discovery and representation aggregation at the sub-series level. Auto-Correlation outperforms self-attention in both efficiency and accuracy. In long-term forecasting, Autoformer yields state-of-the-art accuracy, with a 38% relative improvement on six benchmarks, covering five practical applications: energy, traffic, economics, weather and disease. Code is available at this repository: \url{https://github.com/thuml/Autoformer}.

Overview of "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting"

The paper "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" introduces Autoformer, a pioneering model aimed at addressing long-term time series forecasting challenges. Traditional Transformer-based models, although adept at capturing long-range dependencies through various self-attention mechanisms, struggle with intricate temporal patterns inherent in long-term series forecasting and are computationally inefficient for lengthy input sequences due to quadratic complexity. Autoformer innovatively surpasses these limitations by embedding a decomposition architecture alongside an Auto-Correlation mechanism.

Key Contributions and Methodology

The Autoformer model transforms the conventional approach to series decomposition from a pre-processing step into an integral component of the model architecture. The primary contributions include:

  1. Progressive Decomposition Architecture: The introduction of an inner series decomposition block within the encoder-decoder structure enables the model to progressively disentangle and refine the trend-cyclical and seasonal components of the series throughout the forecasting process. This innovation enhances the model’s ability to manage complex temporal patterns by continuously smoothing the predicted hidden variables using a moving average to emphasize long-term trends.
  2. Auto-Correlation Mechanism: Inspired by stochastic process theory, the Auto-Correlation mechanism capitalizes on the periodicity of time series data. It discovers dependencies at the sub-series level based on series auto-correlation, thereby incorporating global information more efficiently than point-wise self-attention mechanisms. The mechanism achieves $\mathcal{O}(L \log L)$ complexity, a significant computational advantage over canonical self-attention that is crucial for long-term forecasting on large datasets (a simplified sketch of both components follows this list).
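
The following is a minimal, self-contained sketch of these two components, assuming a PyTorch-style implementation; names such as SeriesDecomp, auto_correlation, and top_k are illustrative, and the official code at https://github.com/thuml/Autoformer differs in details (multi-head projections, separate training/inference aggregation, and the rule for choosing k):

```python
# Simplified sketch (not the official implementation) of Autoformer's two core ideas:
# (a) a moving-average series decomposition block and
# (b) FFT-based Auto-Correlation with time-delay aggregation.
import torch
import torch.nn as nn


class SeriesDecomp(nn.Module):
    """Split a series into trend (moving average) and seasonal (residual) parts."""

    def __init__(self, kernel_size: int):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size, stride=1, padding=0)

    def forward(self, x: torch.Tensor):
        # x: (batch, length, channels)
        # Pad both ends with boundary values so the moving average keeps the length.
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        back = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        padded = torch.cat([front, x, back], dim=1)
        trend = self.avg(padded.permute(0, 2, 1)).permute(0, 2, 1)
        seasonal = x - trend
        return seasonal, trend


def auto_correlation(query: torch.Tensor, key: torch.Tensor, value: torch.Tensor,
                     top_k: int = 3) -> torch.Tensor:
    """Period-based dependency discovery and aggregation in O(L log L) via FFT.

    query, key, value: (batch, length, channels). Returns the value series
    aggregated over the top-k most correlated time delays.
    """
    L = query.size(1)
    # Series-wise correlation via the Wiener-Khinchin relation: IFFT(FFT(Q) * conj(FFT(K))).
    q_fft = torch.fft.rfft(query, dim=1)
    k_fft = torch.fft.rfft(key, dim=1)
    corr = torch.fft.irfft(q_fft * torch.conj(k_fft), n=L, dim=1)  # (batch, L, channels)
    # Score each candidate delay by its mean correlation and keep the top-k delays.
    scores = corr.mean(dim=(0, 2))               # (L,)
    weights, delays = torch.topk(scores, top_k)
    weights = torch.softmax(weights, dim=0)
    # Time-delay aggregation: roll the value series by each delay and blend.
    out = torch.zeros_like(value)
    for w, d in zip(weights, delays.tolist()):
        out = out + w * torch.roll(value, shifts=-d, dims=1)
    return out
```

In the full model, the decomposition block is applied repeatedly inside every encoder and decoder layer rather than once as a pre-processing step, and Auto-Correlation takes the place of the self-attention blocks, wrapped in multi-head query/key/value projections.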

Experimental Validation

Autoformer was evaluated comprehensively on six real-world benchmarks (ETT, Electricity, Exchange, Traffic, Weather, ILI), spanning five practical applications: energy, traffic, economics, weather, and disease forecasting. The experiments demonstrated state-of-the-art performance, with a 38% average relative improvement in MSE across long-term forecasting tasks; on some datasets the MSE reduction reached 74%. In addition, Autoformer exhibited robust long-term prediction stability, outperforming existing models such as Informer, Reformer, LogTrans, and LSTNet.

Implications and Future Directions

Autoformer's advancements hold significant implications both practically and theoretically:

  • Practical Applications: The model’s ability to provide reliable long-term forecasts can be transformative across domains requiring forward-looking decision-making and planning, such as energy consumption management, traffic optimization, economic forecasting, weather prediction, and epidemic propagation control.
  • Theoretical Development: The integration of decomposition mechanisms directly within the model architecture, coupled with the Auto-Correlation mechanism, sets a precedent for further research into efficient aggregation of long-range dependencies. This not only invites advances in time series analysis but also suggests broader applications in natural language processing and other sequential data tasks where long-term dependencies are prevalent.

Future Developments in AI

Looking forward, the research opens several avenues for continued exploration in AI:

  • Enhanced Decomposition Techniques: Further refining the decomposition process and integrating other decomposition algorithms within deep models could yield even more precise forecasts, particularly in datasets with weaker periodic signals.
  • Generalization Across Domains: Extending the principles of Autoformer to other data types and domains beyond time series forecasting could validate its versatility and robustness in capturing long-term dependencies.
  • Scalability and Optimization: Continued focus on optimizing computational efficiency and scaling model architectures to handle increasingly larger datasets without compromising on prediction accuracy will be crucial in making these advancements viable for real-world applications.

In conclusion, Autoformer represents a significant step forward in the field of long-term series forecasting. By intelligently embedding decomposition as an intrinsic model component and leveraging periodicity through Auto-Correlation, it addresses the critical challenges of complexity and efficiency faced by traditional Transformer-based models. The state-of-the-art results across diverse benchmarks underscore its potential to reshape forecasting methodologies in various scientific and industrial fields.

Authors (4)
  1. Haixu Wu (26 papers)
  2. Jiehui Xu (4 papers)
  3. Jianmin Wang (119 papers)
  4. Mingsheng Long (110 papers)
Citations (1,598)