
Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks (2003.03777v5)

Published 8 Mar 2020 in cs.LG, cs.SY, eess.SY, and stat.ML

Abstract: Network data can be conveniently modeled as a graph signal, where data values are assigned to nodes of a graph that describes the underlying network topology. Successful learning from network data is built upon methods that effectively exploit this graph structure. In this work, we leverage graph signal processing to characterize the representation space of graph neural networks (GNNs). We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology. These two properties offer insight about the workings of GNNs and help explain their scalability and transferability properties which, coupled with their local and distributed nature, make GNNs powerful tools for learning in physical networks. We also introduce GNN extensions using edge-varying and autoregressive moving average graph filters and discuss their properties. Finally, we study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
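The graph convolutional filters discussed in the abstract are, in graph signal processing, polynomials of a graph shift operator applied to a graph signal. As an illustrative sketch (not code from the paper), the following assumes a symmetric shift operator `S` (e.g. an adjacency matrix), a signal `x` with one value per node, and filter taps `h`; the permutation equivariance property mentioned above follows directly from this polynomial form.

```python
import numpy as np

def graph_filter(S, x, h):
    """Polynomial graph convolutional filter: y = sum_k h[k] * S^k @ x.

    S: (N, N) graph shift operator (e.g. adjacency or Laplacian matrix)
    x: (N,) graph signal, one value per node
    h: filter taps h[0], ..., h[K]
    """
    y = np.zeros_like(x, dtype=float)
    z = x.astype(float)           # z holds S^k @ x, starting at k = 0
    for hk in h:
        y += hk * z               # accumulate h[k] * S^k @ x
        z = S @ z                 # shift once more: S^(k+1) @ x
    return y

# Example: 3-node path graph, relabeling the nodes (permutation P)
# and filtering commute -- the equivariance property from the abstract.
S = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
x = np.array([1., 2., 3.])
h = [0.5, 0.3, 0.2]

P = np.eye(3)[[2, 0, 1]]          # a permutation matrix
y = graph_filter(S, x, h)
y_perm = graph_filter(P @ S @ P.T, P @ x, h)
print(np.allclose(P @ y, y_perm))  # filtering commutes with relabeling
```

Because each term `S^k @ x` only mixes values along k-hop neighborhoods, the same polynomial structure also gives the local, distributed character the abstract attributes to GNNs.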

Authors (4)
  1. Fernando Gama (43 papers)
  2. Elvin Isufi (57 papers)
  3. Geert Leus (98 papers)
  4. Alejandro Ribeiro (281 papers)
Citations (138)
