
Adaptive Dependency Learning Graph Neural Networks (2312.03903v1)

Published 6 Dec 2023 in cs.LG

Abstract: Graph Neural Networks (GNNs) have recently gained popularity in the forecasting domain due to their ability to model complex spatial and temporal patterns in tasks such as traffic forecasting and region-based demand forecasting. Most of these methods require a predefined graph as input, whereas in real-life multivariate time series problems a well-defined dependency graph rarely exists. This requirement makes it harder for GNNs to be utilised widely for multivariate forecasting problems in other domains such as retail or energy. In this paper, we propose a hybrid approach combining neural networks and statistical structure learning models to self-learn the dependencies and construct a dynamically changing dependency graph from multivariate data, with the aim of enabling the use of GNNs for multivariate forecasting even when a well-defined graph does not exist. Statistical structure modeling in conjunction with neural networks provides a well-principled and efficient approach by bringing in causal semantics to determine dependencies among the series. Finally, we demonstrate significantly improved performance using our proposed approach on real-world benchmark datasets without a predefined dependency graph.
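The core idea of the abstract, inferring a dependency graph from the series themselves so a GNN can consume it, can be illustrated with a minimal sketch. The paper's actual method combines neural networks with statistical structure learning (e.g. sparse inverse covariance estimation); the toy version below, with hypothetical function names, simply thresholds pairwise Pearson correlations to produce an adjacency matrix, which is the simplest stand-in for that structure-learning step.

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def dependency_graph(series, threshold=0.5):
    """Build a binary adjacency matrix by thresholding |correlation|.

    A crude proxy for statistical structure learning: edge (i, j)
    exists when series i and j co-move strongly enough.
    """
    k = len(series)
    adj = [[0] * k for _ in range(k)]
    for i in range(k):
        for j in range(k):
            if i != j and abs(pearson(series[i], series[j])) >= threshold:
                adj[i][j] = 1
    return adj

# Toy data: s2 is perfectly correlated with s1, s3 is only weakly related.
s1 = [1, 2, 3, 4, 5, 6]
s2 = [2, 4, 6, 8, 10, 12]
s3 = [5, 1, 4, 2, 6, 3]
adj = dependency_graph([s1, s2, s3], threshold=0.9)
print(adj)  # edge only between series 0 and 1
```

In the paper's setting this graph would be re-estimated as the data evolves (hence "dynamically changing") and passed to the GNN as its input structure, in place of a predefined graph.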

Authors (3)
  1. Abishek Sriramulu (5 papers)
  2. Nicolas Fourrier (4 papers)
  3. Christoph Bergmeir (50 papers)
Citations (17)

