Towards Efficient Large Scale Spatial-Temporal Time Series Forecasting via Improved Inverted Transformers (2503.10858v1)

Published 13 Mar 2025 in cs.LG

Abstract: Time series forecasting at scale presents significant challenges for modern prediction systems, particularly when dealing with large sets of synchronized series, such as in a global payment network. In such systems, three key challenges must be overcome for accurate and scalable predictions: 1) emergence of new entities, 2) disappearance of existing entities, and 3) the large number of entities present in the data. The recently proposed Inverted Transformer (iTransformer) architecture has shown promising results by effectively handling variable entities. However, its practical application in large-scale settings is limited by quadratic time and space complexity ($O(N^2)$) with respect to the number of entities $N$. In this paper, we introduce EiFormer, an improved inverted transformer architecture that maintains the adaptive capabilities of iTransformer while reducing computational complexity to linear scale ($O(N)$). Our key innovation lies in restructuring the attention mechanism to eliminate redundant computations without sacrificing model expressiveness. Additionally, we incorporate a random projection mechanism that not only enhances efficiency but also improves prediction accuracy through better feature representation. Extensive experiments on the public LargeST benchmark dataset and a proprietary large-scale time series dataset demonstrate that EiFormer significantly outperforms existing methods in both computational efficiency and forecasting accuracy. Our approach enables practical deployment of transformer-based forecasting in industrial applications where handling time series at scale is essential.
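The abstract's central idea — replacing the $O(N^2)$ entity-to-entity attention of iTransformer with an $O(N)$ mechanism built on random projections — can be illustrated with a minimal sketch. The code below compresses the $N$ per-entity tokens into a fixed number $k$ of landmark tokens via a random projection before attending, so the score matrix is $N \times k$ rather than $N \times N$. This is a hypothetical stand-in for EiFormer's mechanism (the function name, the choice of $k$, and the use of a Gaussian projection are assumptions, not details from the paper):

```python
import numpy as np

def linear_entity_attention(X, k=64, seed=0):
    """Attention over N entity tokens with O(N * k) cost.

    X: (N, d) array of per-entity embeddings, i.e. one
    iTransformer-style "inverted" token per series. Keys and
    values are first compressed to k landmark tokens through a
    fixed random projection, so the score matrix is (N, k)
    instead of (N, N). A sketch only; EiFormer's exact
    restructuring may differ.
    """
    N, d = X.shape
    rng = np.random.default_rng(seed)
    # Random projection from N entities down to k landmarks.
    P = rng.standard_normal((k, N)) / np.sqrt(N)
    K = P @ X                                # (k, d) compressed keys
    V = P @ X                                # (k, d) compressed values
    scores = (X @ K.T) / np.sqrt(d)          # (N, k), linear in N
    # Numerically stable softmax over the k landmarks.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V                       # (N, d) updated tokens
```

Because both the projection and the attention scores scale linearly in $N$ for fixed $k$, new or disappearing entities only change the number of rows in `X`, which matches the variable-entity setting the abstract describes.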

Authors (13)
  1. Jiarui Sun (13 papers)
  2. Chin-Chia Michael Yeh (43 papers)
  3. Yujie Fan (25 papers)
  4. Xin Dai (27 papers)
  5. Xiran Fan (7 papers)
  6. Zhimeng Jiang (33 papers)
  7. Uday Singh Saini (12 papers)
  8. Vivian Lai (28 papers)
  9. Junpeng Wang (53 papers)
  10. Huiyuan Chen (43 papers)
  11. Zhongfang Zhuang (32 papers)
  12. Yan Zheng (102 papers)
  13. Girish Chowdhary (69 papers)
