
Parsimony or Capability? Decomposition Delivers Both in Long-term Time Series Forecasting (2401.11929v4)

Published 22 Jan 2024 in cs.LG

Abstract: Long-term time series forecasting (LTSF) represents a critical frontier in time series analysis, characterized by extensive input sequences, as opposed to the shorter spans typical of traditional approaches. While longer sequences inherently offer richer information for enhanced predictive precision, prevailing studies often respond by escalating model complexity. These intricate models can inflate into millions of parameters, resulting in prohibitive parameter scales. Our study demonstrates, through both analytical and empirical evidence, that decomposition is key to containing excessive model inflation while achieving uniformly superior and robust results across various datasets. Remarkably, by tailoring decomposition to the intrinsic dynamics of time series data, our proposed model outperforms existing benchmarks, using over 99% fewer parameters than the majority of competing methods. Through this work, we aim to unleash the power of a restricted set of parameters by capitalizing on domain characteristics, a timely reminder that in the realm of LTSF, bigger is not invariably better.
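The abstract names decomposition as the mechanism that keeps the model small but does not spell out the architecture here. As a rough illustration of the general idea only (a hypothetical sketch, not the paper's actual model), the snippet below splits the input window into a moving-average trend and a seasonal remainder and forecasts each component with a single linear map, so the parameter count stays near 2 x input_len x horizon:

```python
# Hypothetical decomposition-based linear forecaster, sketched in the spirit
# of the abstract; class and argument names are illustrative, not the paper's.
import torch
import torch.nn as nn

class DecompLinearForecaster(nn.Module):
    def __init__(self, input_len: int, horizon: int, kernel_size: int = 25):
        super().__init__()
        # Moving average extracts the trend; padding keeps the length fixed.
        self.trend_pool = nn.AvgPool1d(kernel_size, stride=1,
                                       padding=kernel_size // 2,
                                       count_include_pad=False)
        # One linear head per component: 2 * (input_len * horizon + horizon)
        # parameters in total.
        self.trend_head = nn.Linear(input_len, horizon)
        self.season_head = nn.Linear(input_len, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, input_len) univariate window
        trend = self.trend_pool(x.unsqueeze(1)).squeeze(1)  # (batch, input_len)
        season = x - trend                                   # seasonal remainder
        # Forecast each component separately, then recombine.
        return self.trend_head(trend) + self.season_head(season)

model = DecompLinearForecaster(input_len=336, horizon=96)
print(sum(p.numel() for p in model.parameters()))  # 64704 parameters
```

At this scale (tens of thousands of parameters for a 336-step window and 96-step horizon, versus the millions typical of Transformer-based LTSF baselines), a reduction of over 99% is plausible; the paper's own decomposition is tailored to the data's intrinsic dynamics rather than fixed to a moving average.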

Authors (6)
  1. Jinliang Deng (13 papers)
  2. Xuan Song (61 papers)
  3. Ivor W. Tsang (109 papers)
  4. Hui Xiong (244 papers)
  5. Feiyang Ye (17 papers)
  6. Du Yin (7 papers)
Citations (1)
