
Multi-scale Attention Flow for Probabilistic Time Series Forecasting (2205.07493v3)

Published 16 May 2022 in cs.LG

Abstract: Probabilistic forecasting of multivariate time series is a notoriously challenging yet practical task. On the one hand, the challenge is how to effectively capture the cross-series correlations between interacting time series in order to achieve accurate distribution modeling. On the other hand, we must consider how to capture the contextual information within each time series more accurately to model the multivariate temporal dynamics. In this work, we propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF), which integrates multi-scale attention with relative position information and represents the multivariate data distribution with a conditioned normalizing flow. Additionally, compared with autoregressive modeling methods, our model avoids the influence of cumulative error and does not increase time complexity. Extensive experiments demonstrate that our model achieves state-of-the-art performance on many popular multivariate datasets.
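The core idea of a conditioned normalizing flow, as the abstract describes it, is that the distribution of the observations is modeled by an invertible transform whose parameters depend on a context vector. The sketch below is a hypothetical minimal illustration, not the paper's architecture: a single conditional affine layer with linear conditioners (`W_mu`, `W_s` are assumed names); in MANF the context would instead come from the multi-scale attention encoder.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # dimensionality of one multivariate observation (illustrative)

# Hypothetical conditioner: linear maps from the context vector c to
# a shift mu(c) and a log-scale s(c) for the affine flow layer.
W_mu = rng.normal(size=(D, D)) * 0.1
W_s = rng.normal(size=(D, D)) * 0.1

def log_prob(x, c):
    """log p(x | c) under one conditional affine flow layer.

    Transform: z = (x - mu(c)) * exp(-s(c)), base density N(0, I).
    The change-of-variables (log-det-Jacobian) term is -sum(s(c)).
    """
    mu = W_mu @ c
    s = W_s @ c                      # log-scale, so exp(s) > 0
    z = (x - mu) * np.exp(-s)
    base = -0.5 * np.sum(z**2) - 0.5 * D * np.log(2 * np.pi)
    return base + np.sum(-s)

x = rng.normal(size=D)
c = rng.normal(size=D)
print(log_prob(x, c))  # exact log-likelihood of x given context c
```

Because the transform is invertible with a tractable Jacobian, the exact log-likelihood can be evaluated and maximized directly; stacking many such conditioned layers yields the flexible multivariate distributions the paper targets.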

Authors (7)
  1. Shibo Feng (5 papers)
  2. Chunyan Miao (145 papers)
  3. Ke Xu (309 papers)
  4. Jiaxiang Wu (27 papers)
  5. Pengcheng Wu (25 papers)
  6. Yang Zhang (1129 papers)
  7. Peilin Zhao (127 papers)
Citations (16)
