Discrete Graph Structure Learning for Forecasting Multiple Time Series (2101.06861v3)

Published 18 Jan 2021 in cs.LG and stat.ML

Abstract: Time series forecasting is an extensively studied subject in statistics, economics, and computer science. Exploration of the correlation and causation among the variables in a multivariate time series shows promise in enhancing the performance of a time series model. When using deep neural networks as forecasting models, we hypothesize that exploiting the pairwise information among multiple (multivariate) time series also improves their forecast. If an explicit graph structure is known, graph neural networks (GNNs) have been demonstrated as powerful tools to exploit the structure. In this work, we propose learning the structure simultaneously with the GNN if the graph is unknown. We cast the problem as learning a probabilistic graph model through optimizing the mean performance over the graph distribution. The distribution is parameterized by a neural network so that discrete graphs can be sampled differentiably through reparameterization. Empirical evaluations show that our method is simpler, more efficient, and better performing than a recently proposed bilevel learning approach for graph structure learning, as well as a broad array of forecasting models, either deep or non-deep learning based, and graph or non-graph based.

Authors (3)
  1. Chao Shang (24 papers)
  2. Jie Chen (602 papers)
  3. Jinbo Bi (28 papers)
Citations (208)

Summary

Overview of Discrete Graph Structure Learning for Forecasting Multiple Time Series

This paper addresses the challenge of improving the forecasting of multiple time series by integrating graph structure learning with graph neural networks (GNNs). The authors propose a novel approach, termed Graph for Time Series (GTS), where both the graph structure among the series and the GNN for forecasting are learned jointly. This dual learning process offers advantages in forecast accuracy, particularly when explicit graph structures are unavailable or incomplete.

Key Contributions

  1. Unified Graph and Model Learning: The paper introduces a framework to learn graph structures concurrently with the GNN, circumventing issues faced by prior methods that treated the graph as an ad-hoc hyperparameter through bilevel optimization (e.g., LDS). By leveraging a unilevel optimization strategy, GTS integrates the graph learning directly within the model's training process.
  2. Graph Parameterization via Neural Networks: A significant methodological innovation in the GTS method is the parameterization of the graph through a neural network. This design allows for a probabilistic graph model where the discrete graph can be sampled differentiably, enabling seamless integration with the neural model's training.
  3. Reparameterization and Efficiency: Applying the Gumbel-Softmax (binary concrete) trick, the authors sample discrete edges differentiably from per-edge Bernoulli distributions, making gradient-based optimization over probabilistic graph structures feasible. This unilevel approach also reduces computational overhead relative to bilevel methods such as LDS, whose nested optimization is computationally intensive.
  4. Empirical Validation: The empirical results reveal that the GTS method outperforms a variety of baseline models, including traditional time series methods and other GNN-based approaches. Notably, GTS demonstrates superior forecasting accuracy on several datasets, including traffic and power grid data.
  5. Impact of Regularization: The model supports regularization based on prior knowledge (such as known sparse structures), showing robustness in learning graphs that capture meaningful interactions among time series without diminishing forecast performance.
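The sampling mechanism in points 2 and 3 can be sketched in a few lines. This is a minimal illustration, not the paper's exact architecture: the module name `EdgeSampler` and the choice of pairwise-concatenation features are assumptions for the example, while the Gumbel-Softmax (binary concrete) relaxation itself is the technique the paper describes.

```python
import torch
import torch.nn as nn

class EdgeSampler(nn.Module):
    """Sketch: an MLP maps pairwise node features to per-edge Bernoulli
    logits; a Gumbel-Softmax (binary concrete) relaxation then yields a
    differentiable, approximately binary adjacency matrix."""

    def __init__(self, feat_dim, hidden=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, z, tau=0.5):
        n = z.size(0)
        # Concatenate features of every (i, j) node pair -> edge logits
        pairs = torch.cat(
            [z.unsqueeze(1).expand(n, n, -1),
             z.unsqueeze(0).expand(n, n, -1)], dim=-1)
        logits = self.mlp(pairs).squeeze(-1)          # shape (n, n)
        # Differentiable binary-concrete sample: add logistic noise,
        # squash with a temperature-scaled sigmoid
        u = torch.rand_like(logits).clamp(1e-6, 1 - 1e-6)
        noise = torch.log(u) - torch.log1p(-u)
        return torch.sigmoid((logits + noise) / tau)  # soft {0, 1} entries

sampler = EdgeSampler(feat_dim=8)
z = torch.randn(5, 8)            # toy embeddings for 5 time series
adj = sampler(z)
print(adj.shape)                 # torch.Size([5, 5])
```

Because the sample is a deterministic, differentiable function of the logits plus independent noise, gradients of the forecasting loss flow back into the edge-probability network, which is what lets the graph be trained jointly with the GNN.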

Theoretical and Practical Implications

From a theoretical viewpoint, the proposed method advances the domain of graph-based time series forecasting by integrating graph learning within the standard machine learning pipeline. It does not merely seek to reveal causal relationships but utilizes the graph as an ancillary structure to enhance forecasting capabilities. The practical implications are significant, as the model can be applied to various domains with multiple interdependent time series data, such as traffic management and energy grid monitoring.

The comparative analysis with other graph learning models emphasizes the scalability and efficiency of the GTS framework, further accentuated by its ease of integration with standard forecasting models like DCRNN.
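The unilevel training described above can be illustrated with a toy loop. Everything here is a hedged stand-in: the linear "forecaster", the loss weights, and the synthetic data are placeholders rather than the paper's DCRNN-based model. What the sketch does show is the key structure: one optimizer updates the graph parameters and the forecaster together, with an optional cross-entropy regularizer pulling edge probabilities toward a known sparse prior (point 5 above).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy stand-ins for the two jointly trained components
edge_logits = nn.Parameter(torch.zeros(5, 5))   # per-edge Bernoulli logits
forecaster = nn.Linear(5, 5)                    # placeholder forecaster
opt = torch.optim.Adam([edge_logits, *forecaster.parameters()], lr=1e-2)

prior_adj = (torch.rand(5, 5) > 0.7).float()    # assumed known sparse prior
x, y = torch.randn(32, 5), torch.randn(32, 5)   # toy input/target windows

for _ in range(100):
    # Differentiable binary-concrete sample of the adjacency
    u = torch.rand(5, 5).clamp(1e-6, 1 - 1e-6)
    adj = torch.sigmoid((edge_logits + torch.log(u) - torch.log1p(-u)) / 0.5)
    pred = forecaster(x @ adj)                  # graph-weighted mixing
    loss = F.mse_loss(pred, y)
    # Regularize edge probabilities toward the prior graph
    loss = loss + 0.1 * F.binary_cross_entropy_with_logits(edge_logits, prior_adj)
    opt.zero_grad(); loss.backward(); opt.step()
```

A single backward pass updates both components, in contrast to bilevel schemes such as LDS, where graph parameters sit in an outer loop wrapped around an inner model-training loop.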

Future Directions

Looking forward, the model's ability to adaptively learn and refine its graph structure in changing environments could be explored, potentially integrating online learning elements. Further research might also focus on extending the model to explicitly incorporate causality, aligning with advances in causal inference within probabilistic graphical models.

The incorporation of additional structural priors (e.g., those derived from domain knowledge) and the exploration of their impact on forecasting accuracy and graph interpretability pose another promising avenue for research. As the field progresses, the fusion of graph learning techniques with traditional time series analysis will continue to evolve, driving improvements in the forecasting of complex, interconnected systems.