Dynamic Graph Convolutional Networks (1704.06199v1)

Published 20 Apr 2017 in cs.LG and stat.ML

Abstract: Many different classification tasks need to manage structured data, which are usually modeled as graphs. Moreover, these graphs can be dynamic, meaning that the vertices/edges of each graph may change over time. Our goal is to jointly exploit structured data and temporal information through the use of a neural network model. To the best of our knowledge, this task has not been addressed using this kind of architecture. For this reason, we propose two novel approaches, which combine Long Short-Term Memory networks and Graph Convolutional Networks to learn long short-term dependencies together with graph structure. The quality of our methods is confirmed by the promising results achieved.

Authors (3)
  1. Franco Manessi (8 papers)
  2. Alessandro Rozza (13 papers)
  3. Mario Manzo (7 papers)
Citations (325)

Summary

  • The paper introduces two novel architectures, Waterfall Dynamic-Graph Convolutional Network (WD-GCN) and Concatenate Dynamic-Graph Convolutional Network (CD-GCN), which integrate LSTM networks with GCNs to process dynamic graph data.
  • WD-GCN uses a novel waterfall layer with parameters shared across time steps, while CD-GCN concatenates graph-convolved features with the original features, which proved particularly effective for graphs with fewer vertices.
  • Empirical evaluations show that WD-GCN and CD-GCN outperform baseline models on the DBLP and CAD-120 datasets for vertex- and graph-focused tasks, respectively, demonstrating their effectiveness in handling temporal dynamics.

Dynamic Graph Convolutional Networks

The paper "Dynamic Graph Convolutional Networks" presents novel methodologies aimed at tackling the challenges of processing dynamic graphs using neural network architectures. The authors introduce two new approaches that integrate Long Short-Term Memory (LSTM) networks with Graph Convolutional Networks (GCNs), specifically targeting tasks involving dynamic graph data, which exhibit changes in their structure over time.

In traditional graph classification, most approaches are designed for static graphs and fail to capture the temporal dynamics inherent to many real-world datasets such as social networks, traffic systems, and biological networks. To address this, the authors propose architectures that exploit both the time-varying nature and the intrinsic graph structure of the data, termed the Waterfall Dynamic-Graph Convolutional Network (WD-GCN) and the Concatenate Dynamic-Graph Convolutional Network (CD-GCN).
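
To make the setting concrete, a dynamic graph can be encoded as a time-ordered sequence of adjacency and vertex-feature matrices. The NumPy sketch below is purely illustrative; the shapes and variable names are our own, not the paper's notation:

```python
import numpy as np

T, N, F = 10, 5, 3                              # time steps, vertices, features per vertex
rng = np.random.default_rng(0)
adjacency = rng.integers(0, 2, size=(T, N, N))  # A_t: edges may appear/disappear over time
features = rng.standard_normal((T, N, F))       # X_t: vertex features at step t
# A dynamic-graph model consumes the whole sequence [(A_1, X_1), ..., (A_T, X_T)].
```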

The WD-GCN employs a novel Waterfall Dynamic-Graph Convolutional layer, which performs the graph convolution step at each temporal slice using trainable parameters shared across time steps. The CD-GCN, in turn, uses a Concatenate Dynamic-Graph Convolutional layer, which extends this operation by concatenating the graph-convolved features with the original vertex features. This proved beneficial particularly for graphs of smaller vertex cardinality, as highlighted in the experimental evaluation on the CAD-120 dataset.
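
The following PyTorch sketch is our own reconstruction of the idea, not the authors' code: a single graph-convolution module is reused at every time step (shared weights), its output is concatenated with the raw features as in CD-GCN, and a per-vertex LSTM then models the temporal dependencies. The class names, the Kipf-and-Welling-style convolution, and all dimensions are assumptions:

```python
import torch
import torch.nn as nn

class GraphConv(nn.Module):
    """One graph convolution: H' = ReLU(A_hat @ H @ W), with A_hat a normalized adjacency."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        # a_hat: (N, N) normalized adjacency; h: (N, in_dim) vertex features
        return torch.relu(self.linear(a_hat @ h))

class CDGCNBlock(nn.Module):
    """Sketch of a Concatenate Dynamic-GCN block: per-step convolution with weights
    shared across time, concatenated with the raw features, then an LSTM per vertex."""
    def __init__(self, in_dim, gc_dim, lstm_dim):
        super().__init__()
        self.gc = GraphConv(in_dim, gc_dim)  # the same module is applied at every step
        self.lstm = nn.LSTM(in_dim + gc_dim, lstm_dim, batch_first=True)

    def forward(self, a_hats, xs):
        # a_hats: (T, N, N) adjacency per step; xs: (T, N, in_dim) features per step
        steps = [torch.cat([x, self.gc(a, x)], dim=-1) for a, x in zip(a_hats, xs)]
        seq = torch.stack(steps).permute(1, 0, 2)  # (N, T, in_dim + gc_dim)
        out, _ = self.lstm(seq)                    # (N, T, lstm_dim): one sequence per vertex
        return out
```

A WD-GCN-style block would differ only in feeding the convolved features alone, `self.gc(a, x)`, to the LSTM instead of the concatenation.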

Empirical results on two datasets, DBLP and CAD-120, demonstrate the effectiveness of these approaches. On the DBLP dataset, a vertex-focused task, both WD-GCN and CD-GCN outperform baseline methodologies, achieving higher accuracy and unweighted F1 measure and illustrating a superior ability to handle the temporal and structural complexities of the data compared to traditional GCNs and LSTM-based models. On the graph-focused CAD-120 task, CD-GCN outperforms both the baselines and WD-GCN, emphasizing its utility in capturing diverse graph dynamics.
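
For reference, the unweighted F1 measure reported here conventionally denotes the macro-averaged F1, which weights every class equally regardless of its frequency; a minimal illustration with hypothetical labels:

```python
from sklearn.metrics import f1_score

y_true = [0, 1, 2, 2, 1]   # hypothetical ground-truth class labels
y_pred = [0, 1, 1, 2, 1]   # hypothetical predictions
# average="macro" computes F1 per class and takes the unweighted mean.
print(f1_score(y_true, y_pred, average="macro"))
```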

These findings underscore the potential of combining LSTMs with GCNs in a dynamic context, particularly for tasks requiring a nuanced understanding of how data evolve over time. This opens possibilities for practical applications across domains and offers fertile ground for further research. Specifically, future work could extend these models with alternative recurrent units or more sophisticated graph convolutional mechanisms, and deploy them within deeper multi-layered architectures.

Overall, the methodological advancements proposed in this paper provide a strong foundation for enhancing dynamic graph-based machine learning, positioning these models as critical tools for future explorations and innovations within the domain of temporal graph analytics.