Topo-MLP: A Simplicial Network Without Message Passing (2312.11862v1)

Published 19 Dec 2023 in cs.LG, cs.AI, cs.CV, and stat.ML

Abstract: Due to their ability to model meaningful higher-order relations among a set of entities, higher-order network models have recently emerged as a powerful alternative to graph-based network models, which can only capture binary relationships. The message-passing paradigm is still the dominant way to learn representations, even for higher-order network models. While powerful, message passing can have disadvantages during inference, particularly when the higher-order connectivity information is missing or corrupted. To overcome such limitations, we propose Topo-MLP, a purely MLP-based simplicial neural network that learns representations of the elements of a simplicial complex without explicitly relying on message passing. Our framework uses a novel Higher Order Neighborhood Contrastive (HONC) loss, which implicitly incorporates the simplicial structure into representation learning. The simplicity of our proposed model makes it faster during inference. Moreover, we show that our model is robust when faced with a missing or corrupted connectivity structure.
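
The abstract pairs a per-simplex MLP encoder with a neighborhood contrastive objective, so the simplicial connectivity is consumed only through the loss at training time and is not required at inference. Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the names SimplexMLP and neighborhood_contrastive_loss, the neighbor-mask construction, and the temperature value are illustrative assumptions standing in for the paper's HONC loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimplexMLP(nn.Module):
    """Per-simplex MLP encoder: embeds each simplex's feature vector
    independently, with no message passing between simplices."""
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.GELU(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x):                 # x: (num_simplices, in_dim)
        return self.net(x)                # (num_simplices, out_dim)

def neighborhood_contrastive_loss(z, adjacency, temperature=0.5):
    """Contrastive loss over a (hypothetical) simplicial neighborhood mask:
    embeddings of simplices marked as neighbors (adjacency[i, j] = 1, e.g.
    sharing a coface) are pulled together, all other pairs are pushed apart."""
    z = F.normalize(z, dim=1)
    sim = torch.exp(z @ z.t() / temperature)      # pairwise similarities
    sim = sim - torch.diag(torch.diag(sim))       # drop self-similarity
    pos = (sim * adjacency).sum(dim=1)            # similarity to neighbors
    denom = sim.sum(dim=1) + 1e-8
    mask = adjacency.sum(dim=1) > 0               # only simplices with neighbors
    loss = -torch.log((pos[mask] + 1e-8) / denom[mask])
    return loss.mean()

# Usage sketch: each simplex order (nodes, edges, triangles) could get its own
# encoder; toy edge features with a random neighbor mask keep the example minimal.
if __name__ == "__main__":
    num_edges, in_dim = 32, 16
    x = torch.randn(num_edges, in_dim)            # toy edge features
    adj = (torch.rand(num_edges, num_edges) > 0.8).float()
    adj = ((adj + adj.t()) > 0).float()           # symmetric neighbor mask
    adj.fill_diagonal_(0)                         # a simplex is not its own neighbor
    model = SimplexMLP(in_dim, 64, 32)
    z = model(x)
    loss = neighborhood_contrastive_loss(z, adj)
    loss.backward()
    print(f"contrastive loss: {loss.item():.4f}")
```

Because the neighbor mask appears only inside the contrastive term, inference reduces to a single forward pass through the MLP, which is consistent with the paper's claims of faster inference and robustness to missing or corrupted connectivity.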
