
How Powerful are Spectral Graph Neural Networks (2205.11172v2)

Published 23 May 2022 in cs.LG and cs.AI

Abstract: Spectral Graph Neural Network is a kind of Graph Neural Network (GNN) based on graph signal filters. Some models able to learn arbitrary spectral filters have emerged recently. However, few works analyze the expressive power of spectral GNNs. This paper studies spectral GNNs' expressive power theoretically. We first prove that even spectral GNNs without nonlinearity can produce arbitrary graph signals and give two conditions for reaching universality. They are: 1) no multiple eigenvalues of graph Laplacian, and 2) no missing frequency components in node features. We also establish a connection between the expressive power of spectral GNNs and Graph Isomorphism (GI) testing, the latter of which is often used to characterize spatial GNNs' expressive power. Moreover, we study the difference in empirical performance among different spectral GNNs with the same expressive power from an optimization perspective, and motivate the use of an orthogonal basis whose weight function corresponds to the graph signal density in the spectrum. Inspired by the analysis, we propose JacobiConv, which uses Jacobi basis due to its orthogonality and flexibility to adapt to a wide range of weight functions. JacobiConv deserts nonlinearity while outperforming all baselines on both synthetic and real-world datasets.

Authors (2)
  1. Xiyuan Wang (43 papers)
  2. Muhan Zhang (89 papers)
Citations (158)

Summary

  • The paper demonstrates that linear SGNNs can achieve universal expressiveness under specific conditions, challenging the necessity of nonlinearity.
  • The study introduces JacobiConv, which employs Jacobi polynomial bases and Polynomial Coefficient Decomposition to boost convergence and adapt to graph data.
  • Empirical results show JacobiConv outperforms competitors by up to 12%, highlighting its efficiency and potential for practical graph applications.

Analyzing the Expressive Power and Optimization of Spectral Graph Neural Networks

The paper "How Powerful are Spectral Graph Neural Networks" by Xiyuan Wang and Muhan Zhang provides a theoretical exploration into the expressive power and optimization capabilities of Spectral Graph Neural Networks (SGNNs). SGNNs have garnered attention for their ability to leverage graph signal filters in the spectral domain. This work systematically analyzes SGNNs' expressive power and addresses several underexplored aspects of their design and applicability, such as the role of nonlinearity, the impact of different polynomial bases, and optimization efficiency. Furthermore, the authors introduce a novel spectral GNN architecture, JacobiConv, which features significant improvements in expressiveness and performance on real-world datasets.

The paper starts by establishing that linear SGNNs, which omit nonlinearity entirely, can still produce arbitrary graph signals under certain conditions. By showing that linear spectral filters suffice to represent a wide range of graph signals, the paper challenges the prevailing view that nonlinearity is essential to GNN expressiveness. The authors give two conditions for this universality: the graph Laplacian has no repeated (multiple) eigenvalues, and the node features contain no missing frequency components. They also connect the expressive power of linear SGNNs to Graph Isomorphism (GI) testing, which is typically used to characterize the expressive power of spatial GNNs.
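
Both conditions are concrete enough to check numerically on a given graph. The sketch below (plain NumPy; the toy graph, feature matrix, and tolerance are illustrative choices, not from the paper) eigendecomposes the symmetric normalized Laplacian, then tests that all eigenvalues are distinct and that every eigenvector has a nonzero projection onto the node features.

```python
import numpy as np

def universality_conditions(adj: np.ndarray, x: np.ndarray, tol: float = 1e-8):
    """adj: dense symmetric adjacency (n x n); x: node features (n x d)."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    # Symmetrically normalized Laplacian L = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(lap)

    # Condition 1: no repeated eigenvalues (all spectral gaps are positive).
    no_repeated = np.all(np.diff(np.sort(eigvals)) > tol)

    # Condition 2: every eigenvector has a nonzero projection onto the
    # feature space, i.e. no frequency component of x is missing.
    projections = eigvecs.T @ x            # spectral coefficients of x
    no_missing = np.all(np.linalg.norm(projections, axis=1) > tol)
    return no_repeated, no_missing

# Toy 4-node path graph with random features (illustrative only).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 2))
print(universality_conditions(A, X))
```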

Although this universality result is theoretically compelling, the paper concedes that spectral GNNs with the same theoretical expressive power differ markedly in empirical performance. The authors analyze SGNN optimization by examining the Hessian of the loss near the global minimum, and conclude that convergence is fastest when the polynomial basis is orthogonal with respect to a weight function matching the graph signal density in the spectrum. The proposed JacobiConv model exploits this finding: Jacobi polynomial bases are orthogonal, and their two parameters accommodate a wide range of weight functions, allowing the basis to adapt to the spectral properties of the data at hand.
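
As a rough illustration of the idea, the following sketch applies a fixed Jacobi-basis polynomial filter h(Â) = Σ_k α_k P_k^{(a,b)}(Â) to node features. For clarity it evaluates the basis on the eigenvalues of the normalized adjacency via an explicit eigendecomposition; the actual JacobiConv model instead uses the three-term recurrence with sparse propagations and learns the coefficients end to end. The values of a, b, and the coefficients alpha below are placeholders, not settings from the paper.

```python
import numpy as np
from scipy.special import eval_jacobi

def jacobi_filter(adj: np.ndarray, x: np.ndarray, alpha, a=1.0, b=1.0):
    """Apply h(A_hat) X with h(lam) = sum_k alpha[k] * P_k^{(a,b)}(lam)."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    a_hat = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]   # normalized adjacency
    lam, u = np.linalg.eigh(a_hat)                             # eigenvalues in [-1, 1]
    # Evaluate the polynomial filter on the spectrum.
    h = sum(c * eval_jacobi(k, a, b, lam) for k, c in enumerate(alpha))
    return u @ (h[:, None] * (u.T @ x))

# Toy usage: degree-3 filter on a 4-node path graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 2))
print(jacobi_filter(A, X, alpha=[1.0, 0.5, 0.25, 0.125]))
```

One reason the Jacobi family is a natural fit here is that the eigenvalues of the normalized adjacency lie in [-1, 1], which is exactly the interval on which Jacobi polynomials are orthogonal, with the weight function shaped by the two parameters a and b.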

JacobiConv surpasses existing models such as GPRGNN, ARMA, ChebyNet, and BernNet in empirical studies. The authors present strong numerical results highlighting JacobiConv's superior performance, with improvements of up to 12% over competitors on certain datasets. The studies affirm that nonlinearity is not a precondition for high expressiveness in spectral GNNs. JacobiConv, which incorporates the Polynomial Coefficient Decomposition (PCD) technique, also achieves the lowest error rates in approximating filter functions on several synthetic datasets as well as strong results on real-world benchmarks.

The paper speculates that random features could circumvent the problem of missing frequency components, although this approach raises practical issues, notably computational cost on large graphs and a loss of generalization across graphs. Future work may explore combining random features with learnable parameters to balance these theoretical benefits against empirical constraints.
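
A minimal, heavily simplified version of that idea: append a few Gaussian random features to the input, which almost surely have a nonzero projection onto every Laplacian eigenvector and therefore remove the missing-frequency obstruction. The dimensionality and scale below are arbitrary choices; the paper discusses the idea only at a conceptual level.

```python
import numpy as np

def append_random_features(x: np.ndarray, extra_dims: int = 1, seed: int = 0):
    """Concatenate Gaussian random columns to the node feature matrix x (n x d)."""
    rng = np.random.default_rng(seed)
    r = rng.normal(size=(x.shape[0], extra_dims))
    return np.concatenate([x, r], axis=1)
```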

The implications of this paper extend to both practical applications—enabling spectral GNN designs that are both expressive and efficiently trainable—and theoretical explorations, where further research could examine the impact of orthogonal polynomial bases on broader classes of graph data and tasks. JacobiConv offers a promising direction for future GNN architectures, particularly in leveraging spectral properties without resorting to nonlinearity, suggesting optimized routes for GNN design in various graph-based applications.
