
Sparse Implementation of Versatile Graph-Informed Layers (2403.13781v1)

Published 20 Mar 2024 in cs.LG, cs.NA, and math.NA

Abstract: Graph Neural Networks (GNNs) have emerged as effective tools for learning tasks on graph-structured data. Recently, Graph-Informed (GI) layers were introduced to address regression tasks on graph nodes, extending their applicability beyond classic GNNs. However, existing implementations of GI layers lack efficiency due to dense memory allocation. This paper presents a sparse implementation of GI layers, leveraging the sparsity of adjacency matrices to reduce memory usage significantly. Additionally, a versatile general form of GI layers is introduced, enabling their application to subsets of graph nodes. The proposed sparse implementation improves the computational efficiency and scalability of GI layers, making it possible to build deeper Graph-Informed Neural Networks (GINNs) and facilitating their scalability to larger graphs.
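The core memory-saving idea described in the abstract, storing the adjacency operator in a sparse format so that storage scales with the number of edges rather than with the square of the number of nodes, can be sketched as follows. This is a simplified, hypothetical single-feature version using SciPy's CSR matrices (SciPy is among the tools the paper references), not the paper's actual implementation; the function name `sparse_gi_layer`, the per-node weight vector `w`, and the exact layer form are illustrative assumptions.

```python
import numpy as np
import scipy.sparse as sp

def sparse_gi_layer(x, adj, w, b, activation=np.tanh):
    """Simplified single-feature sketch of a Graph-Informed layer.

    x:   (n,) node input values
    adj: (n, n) sparse adjacency matrix, here assumed to include self-loops
    w:   (n,) learnable per-node weights (illustrative assumption)
    b:   (n,) bias

    Each output node aggregates the weighted inputs of its neighbors via a
    sparse matrix-vector product, so memory scales with the edge count.
    """
    # Scale each node's input by its weight, then aggregate over the
    # sparse adjacency structure (including each node's own value).
    z = adj.dot(w * x) + b
    return activation(z)

# Toy example: a 4-node path graph 0-1-2-3, stored in CSR format.
n = 4
rows = [0, 1, 1, 2, 2, 3]
cols = [1, 0, 2, 1, 3, 2]
adj = sp.csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))
adj = adj + sp.identity(n, format="csr")  # add self-loops

x = np.array([1.0, 0.0, -1.0, 2.0])
y = sparse_gi_layer(x, adj, np.ones(n), np.zeros(n))
```

With unit weights and zero bias, node 1 aggregates 1 + 0 + (-1) = 0 from itself and its two neighbors, so its output is tanh(0) = 0; the dense n-by-n allocation the paper criticizes never appears, only the six edges plus four self-loops are stored.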


