Graph Neural Lasso for Dynamic Network Regression (1907.11114v2)

Published 25 Jul 2019 in cs.LG, cs.NE, and stat.ML

Abstract: Regressing multiple inter-connected sequences of data is a problem that arises in many disciplines. Formally, we refer to the regression problem over multiple inter-connected data entities as "dynamic network regression" in this paper. In problems such as stock forecasting or traffic speed prediction, we need to consider both the trends of individual entities and the relationships among them. Most existing approaches cannot capture both kinds of information together. Some approaches, such as LSTM, are designed for sequence data; others, such as GCN, use prior knowledge of the network to obtain a fixed graph structure and make predictions for unknown entities. To overcome the limitations of these methods, we propose a novel graph neural network, the Graph Neural Lasso (GNL), to address the dynamic network regression problem. GNL adopts the gated diffusive unit (GDU) as its base neuron to capture the sequential information behind each entity. Rather than relying on a fixed graph structure, GNL learns the dynamic graph structure automatically: an attention mechanism infers the dynamic relations among entities within each network snapshot. Combining these two components, GNL models the dynamic network regression problem well. Experimental results on two networked sequence datasets, Nasdaq-100 and METR-LA, show that GNL addresses the network regression problem well and is competitive with existing approaches.
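
The abstract describes two main ingredients: a gated diffusive unit (GDU) that tracks each entity's sequential state, and an attention mechanism that infers the relations among entities in each network snapshot. The sketch below is a minimal, simplified illustration of that combination in PyTorch; it is not the authors' implementation, and the class names, layer sizes, and exact gating equations are assumptions made for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicGraphAttention(nn.Module):
    """Illustrative attention layer: infers a soft adjacency matrix for one
    network snapshot from the entities' current hidden states."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h):                        # h: (num_entities, hidden_dim)
        scores = self.query(h) @ self.key(h).T   # pairwise affinity scores
        scores = scores / h.size(-1) ** 0.5      # scaled dot-product attention
        return F.softmax(scores, dim=-1)         # learned dynamic adjacency (N, N)

class GatedDiffusiveCell(nn.Module):
    """Simplified gated update that mixes each entity's own sequence state with
    information diffused from its attention-weighted neighbours."""
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.attn = DynamicGraphAttention(hidden_dim)
        self.gate = nn.Linear(input_dim + 2 * hidden_dim, hidden_dim)
        self.cand = nn.Linear(input_dim + 2 * hidden_dim, hidden_dim)

    def forward(self, x_t, h_prev):              # x_t: (N, input_dim), h_prev: (N, hidden_dim)
        adj = self.attn(h_prev)                  # snapshot-specific relations
        neigh = adj @ h_prev                     # diffuse neighbour states
        inp = torch.cat([x_t, h_prev, neigh], dim=-1)
        z = torch.sigmoid(self.gate(inp))        # update gate
        h_tilde = torch.tanh(self.cand(inp))     # candidate state
        return (1 - z) * h_prev + z * h_tilde    # gated state update

# Toy usage: 5 entities, 3 input features per step, 8-dimensional hidden state.
cell = GatedDiffusiveCell(input_dim=3, hidden_dim=8)
h = torch.zeros(5, 8)
for x_t in torch.randn(10, 5, 3):                # 10 time steps
    h = cell(x_t, h)
print(h.shape)                                   # torch.Size([5, 8])

A prediction head (e.g. a linear readout from h) would map the final hidden states to the regression targets; the paper's GDU and attention formulations differ in their details from this sketch.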

Citations (4)
