
Graph data augmentation with Gromov-Wasserstein Barycenters (2404.08376v1)

Published 12 Apr 2024 in cs.LG and cs.AI

Abstract: Graphs are ubiquitous in various fields, and deep learning methods have been successfully applied to graph classification tasks. However, building large and diverse graph datasets for training can be expensive. While augmentation techniques exist for structured data such as images or numerical data, augmenting graph data remains challenging, primarily due to its complex and non-Euclidean nature. This paper proposes a novel augmentation strategy for graphs that operates in a non-Euclidean space. The approach leverages graphon estimation, which models the generative mechanism of network sequences. Computational results demonstrate the effectiveness of the proposed augmentation framework in improving the performance of graph classification models. Additionally, using a non-Euclidean distance, specifically the Gromov-Wasserstein distance, yields better approximations of the graphon. The framework also provides a means to validate different graphon estimation approaches, particularly in real-world scenarios where the true graphon is unknown.
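The augmentation pipeline the abstract describes can be sketched in two steps: estimate a graphon from a set of observed graphs, then sample synthetic graphs from the estimate. The sketch below is a minimal illustration only; it uses degree-sorted averaging with block smoothing as a simple stand-in for the paper's Gromov-Wasserstein barycenter estimator, and all function names are assumptions, not the authors' code.

```python
import numpy as np

def estimate_step_graphon(adjs, k=4):
    """Estimate a k x k step-function graphon from same-size adjacency
    matrices: degree-sort each graph, average the aligned matrices, then
    block-average. (A simple stand-in for the paper's GW-barycenter
    estimator, not the authors' method.)"""
    n = adjs[0].shape[0]
    aligned = []
    for A in adjs:
        order = np.argsort(-A.sum(axis=1))      # sort nodes by degree
        aligned.append(A[np.ix_(order, order)])  # reorder rows and columns
    avg = np.mean(aligned, axis=0)
    edges = np.linspace(0, n, k + 1).astype(int)
    W = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            W[i, j] = avg[edges[i]:edges[i + 1],
                          edges[j]:edges[j + 1]].mean()
    return W

def sample_graph(W, n, rng):
    """Sample an n-node simple graph from step graphon W -- the
    augmentation step: new training graphs from the same generative model."""
    k = W.shape[0]
    u = rng.uniform(size=n)                       # latent node positions
    blocks = np.minimum((u * k).astype(int), k - 1)
    P = W[np.ix_(blocks, blocks)]                 # edge probabilities
    A = (rng.uniform(size=(n, n)) < P).astype(int)
    A = np.triu(A, 1)                             # keep upper triangle
    return A + A.T                                # symmetrize, zero diagonal
```

In use, one would estimate `W` from the graphs of a single class and sample as many synthetic graphs as needed; the paper's contribution is to replace the naive alignment-and-averaging step with a Gromov-Wasserstein barycenter, which compares graphs in a non-Euclidean, permutation-invariant way.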

Authors (1)
  1. Andrea Ponti