
Beyond Spatio-Temporal Representations: Evolving Fourier Transform for Temporal Graphs (2402.16078v2)

Published 25 Feb 2024 in cs.LG

Abstract: We present the Evolving Graph Fourier Transform (EFT), the first invertible spectral transform that captures evolving representations on temporal graphs. Our work is motivated by the inadequacy of existing methods, which fail to capture evolving graph spectra and are computationally expensive because they must handle the temporal dimension jointly with the graph vertex domain. We view the problem as an optimization over the Laplacian of the continuous-time dynamic graph. Additionally, we propose pseudo-spectrum relaxations that decompose the transformation process, making it highly computationally efficient. EFT adeptly captures the evolving graph's structural and positional properties, making it effective for downstream tasks on evolving graphs. As a reference implementation, we develop a simple neural model induced with EFT for capturing evolving graph spectra. We empirically validate our theoretical findings on a number of large-scale and standard temporal graph benchmarks and demonstrate that our model achieves state-of-the-art performance.
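For context, the classical baseline that EFT generalizes is the joint time-vertex Fourier transform (JFT) on a *static* graph: a graph Fourier transform (the Laplacian eigenbasis) over the vertex dimension combined with a DFT over the time dimension. The sketch below illustrates that baseline and its invertibility; it is not the paper's EFT, and all function names are illustrative.

```python
# Minimal sketch of the joint time-vertex Fourier transform (JFT) on a
# static graph -- the classical construction that EFT extends to graphs
# whose structure evolves over time. Names here are illustrative.
import numpy as np

def joint_time_vertex_fourier(X, A):
    """X: (N, T) signal on N vertices over T timesteps; A: (N, N) symmetric adjacency."""
    # Combinatorial Laplacian L = D - A
    L = np.diag(A.sum(axis=1)) - A
    # Graph Fourier basis: orthonormal eigenvectors of the symmetric Laplacian
    _, U = np.linalg.eigh(L)
    # GFT over the vertex axis, then DFT over the time axis
    X_hat = np.fft.fft(U.T @ X, axis=1)
    return X_hat, U

def inverse_jft(X_hat, U):
    # Invert in reverse order: IDFT over time, then inverse GFT over vertices
    return U @ np.real(np.fft.ifft(X_hat, axis=1))

# Round-trip check on a small path graph: the transform is exactly invertible
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X = rng.standard_normal((3, 4))
X_hat, U = joint_time_vertex_fourier(X, A)
print(np.allclose(X, inverse_jft(X_hat, U)))  # True
```

For a temporal graph the Laplacian changes at every timestep, so a single eigenbasis `U` no longer applies; naively re-diagonalizing per timestep is what makes the problem expensive, and the paper's pseudo-spectrum relaxations are proposed to avoid that cost while retaining invertibility.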
