Long-Range Transformers for Dynamic Spatiotemporal Forecasting (2109.12218v3)

Published 24 Sep 2021 in cs.LG and stat.ML

Abstract: Multivariate time series forecasting focuses on predicting future values based on historical context. State-of-the-art sequence-to-sequence models rely on neural attention between timesteps, which allows for temporal learning but fails to consider distinct spatial relationships between variables. In contrast, methods based on graph neural networks explicitly model variable relationships. However, these methods often rely on predefined graphs that cannot change over time and perform separate spatial and temporal updates without establishing direct connections between each variable at every timestep. Our work addresses these problems by translating multivariate forecasting into a "spatiotemporal sequence" formulation where each Transformer input token represents the value of a single variable at a given time. Long-Range Transformers can then learn interactions between space, time, and value information jointly along this extended sequence. Our method, which we call Spacetimeformer, achieves competitive results on benchmarks from traffic forecasting to electricity demand and weather prediction while learning spatiotemporal relationships purely from data.
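The core idea is the tokenization: instead of one token per timestep carrying all N variables, the series is flattened so that each token carries a single variable's value at a single time, tagged with time and variable embeddings. Below is a minimal sketch of such a spatiotemporal embedding in PyTorch; the class name, dimensions, and the simple sum of the three embeddings are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class SpatiotemporalEmbedding(nn.Module):
    """Flatten a multivariate series (batch, T, N) into T*N tokens,
    each combining value, time, and variable embeddings.
    Names and dimensions are illustrative, not the paper's exact API."""

    def __init__(self, n_vars: int, max_len: int, d_model: int):
        super().__init__()
        self.value_proj = nn.Linear(1, d_model)        # embed the scalar value
        self.time_emb = nn.Embedding(max_len, d_model) # which timestep
        self.var_emb = nn.Embedding(n_vars, d_model)   # which variable

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, T, N) -> tokens: (batch, T*N, d_model)
        b, t, n = x.shape
        values = self.value_proj(x.reshape(b, t * n, 1))
        # Repeat each time embedding n times: token order is time-major.
        times = self.time_emb(torch.arange(t, device=x.device)).repeat_interleave(n, dim=0)
        # Cycle through the variable embeddings within each timestep.
        variables = self.var_emb(torch.arange(n, device=x.device)).repeat(t, 1)
        return values + times + variables  # (t*n, d) broadcasts over batch

# Usage: 3 variables over 12 timesteps -> a 36-token sequence that a
# standard Transformer can attend over jointly in space and time.
emb = SpatiotemporalEmbedding(n_vars=3, max_len=12, d_model=64)
tokens = emb(torch.randn(8, 12, 3))
print(tokens.shape)  # torch.Size([8, 36, 64])
```

With attention over the full T*N sequence, every variable at every timestep can attend to every other, which is why the abstract pairs this formulation with Long-Range (efficient-attention) Transformers: the extended sequence makes the quadratic cost of standard attention the main scaling concern.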

Authors (4)
  1. Jake Grigsby (17 papers)
  2. Zhe Wang (574 papers)
  3. Nam Nguyen (46 papers)
  4. Yanjun Qi (68 papers)
Citations (75)
