Elevating Spectral GNNs through Enhanced Band-pass Filter Approximation (2404.15354v1)

Published 15 Apr 2024 in eess.SP, cs.AI, cs.LG, cs.NA, and math.NA

Abstract: Spectral Graph Neural Networks (GNNs) have attracted great attention due to their capacity to capture patterns in the frequency domain with essential graph filters. Polynomial-based spectral GNNs (poly-GNNs), which approximate graph filters with conventional or rational polynomials, are routinely adopted in practice for their strong performance on graph learning tasks. However, previous poly-GNNs aim at a lower overall approximation error across different types of filters, e.g., low-pass and high-pass, while ignoring a key question: which type of filter warrants greater attention for poly-GNNs? In this paper, we first show that a poly-GNN with a better approximation of band-pass graph filters performs better on graph learning tasks. This insight further exposes a critical issue with existing poly-GNNs: they approximate band-pass graph filters poorly, which limits their potential. To tackle this issue, we propose a novel poly-GNN named TrigoNet. TrigoNet constructs graph filters from novel trigonometric polynomials and outperforms other polynomial bases in approximating band-pass graph filters. By applying Taylor expansion and discarding nonlinearity, TrigoNet also achieves notable efficiency among baselines. Extensive experiments demonstrate the advantages of TrigoNet in both accuracy and efficiency.
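The two ideas in the abstract can be illustrated with a small sketch. The trigonometric basis {1, cos(kπλ/2), sin(kπλ/2)}, the narrow Gaussian band-pass target, and all helper names below are illustrative assumptions, not the paper's exact formulation: the first part fits a band-pass filter on the normalized-Laplacian spectrum [0, 2] with a trigonometric versus an ordinary polynomial basis of equal dimension, and the second part shows how a truncated Taylor series lets a trigonometric filter be applied to a signal with sparse matrix-vector products only, without eigendecomposition.

```python
import numpy as np

# Illustrative sketch (not the paper's exact formulation): least-squares
# fit of a narrow band-pass target on the Laplacian spectrum [0, 2]
# with a trigonometric basis versus a polynomial basis of equal size.
K = 5
lam = np.linspace(0.0, 2.0, 400)             # eigenvalue grid
target = np.exp(-50.0 * (lam - 1.0) ** 2)    # band-pass centered at lambda = 1

# Trigonometric basis: 1, cos(k*pi*lam/2), sin(k*pi*lam/2), k = 1..K
trig = [np.ones_like(lam)]
for k in range(1, K + 1):
    trig.append(np.cos(k * np.pi * lam / 2))
    trig.append(np.sin(k * np.pi * lam / 2))
A_trig = np.stack(trig, axis=1)

# Ordinary polynomial basis of the same dimension: lam**0 .. lam**(2K)
A_poly = np.stack([lam ** p for p in range(2 * K + 1)], axis=1)

def max_fit_error(A, y):
    """Max absolute residual of the least-squares fit A @ c ~ y."""
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.max(np.abs(A @ coef - y)))

err_trig = max_fit_error(A_trig, target)
err_poly = max_fit_error(A_poly, target)
print(f"max error, trigonometric basis: {err_trig:.4f}")
print(f"max error, polynomial basis:    {err_poly:.4f}")

# Efficiency angle: a truncated Taylor series reduces cos(a*L) @ x to
# sparse matrix-vector products, avoiding eigendecomposition entirely.
def cos_matvec(L_matvec, x, a, terms=10):
    """Approximate cos(a*L) @ x as sum_m (-1)^m a^(2m)/(2m)! * L^(2m) @ x."""
    out = np.zeros_like(x)
    v = x.copy()                              # holds L^(2m) @ x
    fact = 1.0                                # holds (2m)!
    for m in range(terms):
        if m > 0:
            v = L_matvec(L_matvec(v))         # two more applications of L
            fact *= (2 * m - 1) * (2 * m)     # update (2m)! incrementally
        out += ((-1.0) ** m) * (a ** (2 * m)) / fact * v
    return out
```

As a sanity check, `cos_matvec` applied with a diagonal Laplacian reproduces `np.cos` of the eigenvalues; with a sparse `L_matvec` the cost per Taylor term is two sparse products, which is the kind of eigendecomposition-free evaluation the abstract's efficiency claim refers to.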

