
LAVARNET: Neural Network Modeling of Causal Variable Relationships for Multivariate Time Series Forecasting (2009.00945v1)

Published 2 Sep 2020 in cs.LG, cs.NE, and stat.ML

Abstract: Multivariate time series forecasting is of great importance to many scientific disciplines and industrial sectors. The evolution of a multivariate time series depends on the dynamics of its variables and the connectivity network of causal interrelationships among them. Most existing time series models do not account for the causal effects among the system's variables, and even if they do, they rely solely on determining the between-variables causality network. Knowing the structure of such a complex network, and more specifically knowing the exact lagged variables that contribute to the underlying process, is crucial for the task of multivariate time series forecasting. The latter is a rather unexplored source of information to leverage. In this direction, a novel neural network-based architecture is proposed here, termed LAgged VAriable Representation NETwork (LAVARNET), which intrinsically estimates the importance of lagged variables and combines high-dimensional latent representations of them to predict future values of the time series. Our model is compared with other baseline and state-of-the-art neural network architectures on one simulated data set and four real data sets from the meteorology, music, solar activity, and finance domains. The proposed architecture outperforms the competing architectures in most of the experiments.
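The paper's exact architecture is not reproduced on this page, but the core idea in the abstract, learning per-variable importance weights over lagged values and combining latent representations of them into a forecast, can be sketched in NumPy. All sizes, parameter names, and the specific weighting scheme below are illustrative assumptions, not the authors' implementation; in practice the parameters would be trained by gradient descent rather than randomly initialized.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 3 variables, 4 lags, 8-dim latent space
n_vars, n_lags, latent_dim = 3, 4, 8

# Randomly initialized parameters (stand-ins for trained weights)
lag_scores = rng.normal(size=(n_vars, n_lags))           # per-variable lag-importance logits
W_embed = rng.normal(size=(n_vars, n_lags, latent_dim))  # lag-specific latent projections
W_out = rng.normal(size=(n_vars * latent_dim, n_vars))   # readout to one-step forecast

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def forecast(window):
    """window: (n_lags, n_vars) past observations -> (n_vars,) one-step-ahead forecast."""
    # Normalized importance of each lag, per variable (rows sum to 1)
    alpha = softmax(lag_scores, axis=1)                   # (n_vars, n_lags)
    # Weight each lagged observation by its learned importance
    weighted = alpha.T[:, :, None] * window[:, :, None]   # (n_lags, n_vars, 1)
    # Project the weighted lags into latent space and pool over lags
    latent = (weighted * W_embed.transpose(1, 0, 2)).sum(axis=0)  # (n_vars, latent_dim)
    # Combine all variables' latent representations into the forecast
    return latent.reshape(-1) @ W_out                     # (n_vars,)

window = rng.normal(size=(n_lags, n_vars))
pred = forecast(window)
print(pred.shape)  # (3,)
```

The softmax over lag logits is one plausible way to make lag importance "intrinsic" to the model; inspecting the learned `alpha` after training would then reveal which lagged variables drive each forecast.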

Authors (4)
  1. Christos Koutlis (27 papers)
  2. Symeon Papadopoulos (74 papers)
  3. Manos Schinas (6 papers)
  4. Ioannis Kompatsiaris (42 papers)
Citations (15)
