Spectral Convergence of Sampled Graphs
- Spectral convergence of sampled graphs is the study of how eigenvalues and eigenfunctions of sampled graphs approach those of continuous operators.
- It employs rigorous operator- and distance-based criteria, including moment methods and the point integral method (PIM), to quantify convergence rates and relate local structure to global spectral properties.
- Findings underpin practical applications in manifold learning, clustering, and graph signal processing, offering precise convergence metrics for algorithm design.
Spectral convergence of sampled graphs is the study of how the spectral properties of discrete graphs constructed from samples—be they random samples from manifolds, configuration model realizations, or large finite networks—approximate or approach those of continuous objects (such as graphons, Laplace operators, or infinite graphs) as the sampling size increases. This line of research rigorously connects local and global graph structure, eigenvalue distributions, and the validity of spectral algorithms on large, complex, or random graphs. The following sections provide a comprehensive account of modern perspectives, methodologies, key results, and the implications for theory and applications.
1. Definitions and Notions of Spectral Convergence
The spectral convergence framework generalizes the comparison of graph spectra beyond finite matrices to operator convergence in appropriate function spaces. For a sequence of sampled graphs (constructed from points sampled from a manifold, as a configuration model with given degrees, or randomly according to a graphon or group action), spectral convergence describes the limiting behavior of quantities such as:
- Normalized empirical spectral distributions, e.g., $\mu_n = \frac{1}{n}\sum_{i=1}^{n} \delta_{\lambda_i}$ over the eigenvalues $\lambda_1, \dots, \lambda_n$ of the chosen graph operator (a numerical sketch follows at the end of this section)
- Spectral measures of graph Laplacians, adjacency matrices, or local operators acting on function spaces
- Eigenvalue and eigenfunction pairs—the convergence of both quantities in operator-norm, strong, weak, or pointwise senses, as appropriate
Different sampling models and graph limits yield distinct, but sometimes overlapping, mathematical structures. These include graphons and graphlets for dense and sparse graphs (Chung, 2012), Benjamini–Schramm (BS) limits and graphonings for sparse and intermediate-density sequences (Bordenave, 11 Oct 2025, Frenkel, 2016), and convergence to Laplace–Beltrami operators in manifold learning contexts (Shi, 2015, Trillos et al., 2018, Calder et al., 2019, Dunson et al., 2019, Peoples et al., 2021).
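As a concrete instance of the first quantity, the following minimal sketch computes the empirical spectral distribution of normalized Laplacians for growing samples. The graph model (a constant graphon $W \equiv p$, i.e., Erdős–Rényi sampling) and all names are illustrative assumptions, not constructions from the cited papers:

```python
import numpy as np

def normalized_laplacian_spectrum(A):
    """Eigenvalues of L = I - D^{-1/2} A D^{-1/2}; the empirical spectral
    distribution places mass 1/n on each eigenvalue."""
    deg = A.sum(axis=1)
    with np.errstate(divide="ignore"):
        dis = np.where(deg > 0, 1.0 / np.sqrt(deg), 0.0)
    L = np.eye(len(A)) - dis[:, None] * A * dis[None, :]
    return np.sort(np.linalg.eigvalsh(L))

rng = np.random.default_rng(0)
p = 0.1                                        # constant graphon W = p (dense regime)
for n in (200, 400, 800):                      # growing sample size
    A = (rng.random((n, n)) < p).astype(float)
    A = np.triu(A, 1); A = A + A.T             # symmetric adjacency, no self-loops
    ev = normalized_laplacian_spectrum(A)
    # The nontrivial bulk concentrates at 1 with spread O(1/sqrt(np)) as n grows.
    print(n, round(ev[0], 3), round(np.median(ev), 3), round(ev[-1], 3))
```

As the printed summaries show, the spectral bulk tightens around its limit as $n$ increases, which is the elementary phenomenon the definitions above formalize.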
2. Operator and Distance-based Criteria
Spectral convergence is formalized via the behavior of associated operators:
- For sampled graphs with Laplacians $L_n$ and induced measures $\mu_n$, convergence in the spectral norm involves verifying that the $L_n$ form a Cauchy sequence, i.e., $\|(L_m - L_n) f\| \to 0$ for all integrable $f$ with normalized norms (Chung, 2012).
- Spectral metric equivalence: Convergence under operator norms is shown to be equivalent to convergence under normalized cut/cut-norm or discrepancy distances, which relate spectral information to combinatorial properties such as edge densities between vertex subsets.
- In random geometric graphs, $\varepsilon$- or $k$-NN graph connectivity parameters can be tuned (e.g., $\varepsilon$ a constant multiple of $(\log n / n)^{1/m}$ for manifold dimension $m$) so that spectral convergence matches pointwise consistency rates (Calder et al., 2019); see the circle sketch after this list.
- For sparse graphs, Benjamini–Schramm (BS) convergence of the neighborhood distributions ensures that the average spectral measure of local operators (e.g., adjacency, non-backtracking) converges to the measure of the infinite limit object (Bordenave, 11 Oct 2025).
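As referenced above, a minimal $\varepsilon$-graph sketch on the unit circle (so $m = 1$; the bandwidth constant and all names are illustrative assumptions, not the cited papers' constructions): the low graph-Laplacian eigenvalues, normalized by the first nontrivial one, should approach the Laplace–Beltrami eigenvalue ratios $1, 1, 4, 4, 9, \dots$ on $S^1$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
theta = rng.uniform(0.0, 2.0 * np.pi, n)      # uniform samples on the circle S^1

# Bandwidth: a constant multiple of the connectivity scale (log n / n)^{1/m}, m = 1.
eps = 40.0 * np.log(n) / n

# eps-graph built on geodesic circle distances
gap = np.abs(theta[:, None] - theta[None, :])
dist = np.minimum(gap, 2.0 * np.pi - gap)
W = (dist <= eps).astype(float)
np.fill_diagonal(W, 0.0)

L = np.diag(W.sum(axis=1)) - W                # unnormalized graph Laplacian
ev = np.sort(np.linalg.eigvalsh(L))

# Laplace-Beltrami on S^1 has spectrum {k^2}, each nonzero eigenvalue doubled,
# so the ratios lambda_k / lambda_1 should approach 1, 1, 4, 4, 9, 9.
print(np.round(ev[1:7] / ev[1], 2))
```

Normalizing by the first nontrivial eigenvalue sidesteps the kernel- and density-dependent scaling constant, which is exactly the constant that the bandwidth and normalization choices in the cited results pin down.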
3. Spectral Laws and Limit Objects
The limiting spectral measures that arise depend on the sampling and graph construction method:
- Random Regular Graphs: The limiting spectral law is the Kesten–McKay distribution for fixed degree $d$, and the semicircle law when the degree diverges; this is established via Chebyshev polynomials and non-backtracking walk counts (Gong et al., 9 Jun 2024). A numerical moment check follows this list.
- Graphs with Prescribed Degree Sequences: For configuration models with degree sequences satisfying $d_{\min} \to \infty$ (where $d_{\min}$ is the minimum degree), the eigenvalue distribution of the normalized Laplacian converges to the semicircle law (Wang et al., 3 Dec 2024). Moment analysis and pruning arguments provide necessary and sufficient conditions for such convergence.
- Intermediate-density and Dense Graphs: For sequences with growing degree bounds, graphonings (generalizations of graphons/graphings) capture the limit and encode spectral as well as other graph parameters (Frenkel, 2016).
- Sampling from Manifolds: For randomly sampled points on a manifold, properly constructed graph Laplacians (with bandwidth and normalization choices depending on dimension and sampling density) converge spectrally to the Laplace–Beltrami operator—strongly in operator-norm, $L^2$, or even $L^\infty$ senses (Shi, 2015, Trillos et al., 2018, Dunson et al., 2019, Cheng et al., 2021, Peoples et al., 2021).
- Graphon and Graphlet Limits: In dense graph settings, or when the sequence exhibits cut-distance convergence, the graphon or graphlet captures the limiting spectrum. For more general classes (e.g., nonuniform degree profiles), spectral distances and the associated eigenbases are determined by the limiting operator and its eigenfunctions (Chung, 2012).
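As a quick check of the random-regular item above, the spectral moments of a sampled $d$-regular graph can be compared against closed-walk counts on the $d$-regular tree, which are the Kesten–McKay moments: $m_2 = d$ and $m_4 = 2d^2 - d$. This sketch uses networkx's standard `random_regular_graph`; the parameter choices are illustrative assumptions.

```python
import numpy as np
import networkx as nx

n, d = 2000, 4
G = nx.random_regular_graph(d, n, seed=0)
ev = np.linalg.eigvalsh(nx.to_numpy_array(G))

# Empirical spectral moments m_k = (1/n) tr(A^k) = mean of ev**k; for the
# Kesten-McKay law these equal the closed-walk counts on the d-regular tree.
print("m2:", np.mean(ev**2), "tree count:", d)             # exactly d
print("m4:", np.mean(ev**4), "tree count:", 2 * d**2 - d)  # matches up to O(1/n) cycle terms
```

Rescaling the spectrum by $1/\sqrt{d-1}$ and letting $d$ grow sends these moments to the Catalan numbers, the moments of the semicircle law, matching the fixed-degree/diverging-degree crossover described above.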
4. Methodologies and Quantitative Rates
The spectral convergence program is substantiated by precise analyses:
- The moment method links cycle counts or non-backtracking walks with spectral moments. For example, in regular graphs the $k$-th spectral moment is encoded by the expected number of closed non-backtracking walks of length $k$ (Gong et al., 9 Jun 2024).
- The Point Integral Method (PIM) and Dirichlet form comparison provide unifying frameworks for degenerate, weighted, or nonuniform manifold sampling settings, allowing quantitative convergence rates and handling Neumann boundary conditions (Shi, 2015, Trillos et al., 2018).
- Variational min–max principles transfer convergence of operators to convergence of spectral data, supporting proofs even for non-compact settings or graphs with boundary (Peoples et al., 2021); a schematic display follows this list.
- Explicit rates: For geometric graphs under optimal bandwidth scaling, eigenvalue and eigenvector convergence rates are polynomial in the bandwidth (up to logarithmic factors), with exponents governed by the intrinsic rather than the ambient dimension (Calder et al., 2019, Cheng et al., 2021). Uniform (max-norm) convergence rates for eigenfunctions are likewise polynomial in $n^{-1}$ in appropriate settings (Dunson et al., 2019).
- Sampling sets for graph signal recovery are determined using Poincaré-type inequalities for the graphon Laplacian, linking spectral uniqueness sets to functional analytic constraints (Le et al., 2023).
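A schematic version of the min–max transfer mentioned above, stated via the standard Courant–Fischer characterization for orientation rather than as the cited papers' exact formulation:

```latex
\lambda_k(L_n) \;=\; \min_{\dim S = k}\; \max_{f \in S \setminus \{0\}}
  \frac{\langle L_n f, f \rangle}{\|f\|^2},
\qquad
\lambda_k(\Delta) \;=\; \min_{\dim S = k}\; \max_{f \in S \setminus \{0\}}
  \frac{\mathcal{E}(f, f)}{\|f\|^2}.
```

If the discrete and continuum Dirichlet forms agree up to an error $\delta_n$ on a common core of test functions, i.e., $|\langle L_n f, f \rangle - \mathcal{E}(f, f)| \le \delta_n \|f\|^2$ after suitable interpolation/discretization maps, then inserting near-optimal subspaces from one side into the other yields eigenvalue bounds of the form $|\lambda_k(L_n) - \lambda_k(\Delta)| \lesssim \delta_n$ for each fixed $k$.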
5. Applications and Implications
Spectral convergence theory informs both analysis and algorithm design on large sampled graphs:
- Manifold Learning and Dimensionality Reduction: The theoretical guarantee that spectral embeddings constructed from sampled graphs approximate those of underlying continuous domains (manifolds or graphons) underpins methods such as Laplacian Eigenmaps, Diffusion Maps, and spectral clustering (Singer et al., 2013, Shi, 2015, Wang, 2015).
- Community Detection and Clustering: Rank properties of limiting operators (e.g., the presence of multiple nontrivial eigenvalues) signal community structure, enabling consistent spectral clustering algorithms (Chung, 2012, Wang, 2015); a two-block sketch follows this list.
- Random Graph Model Selection: The convergence of empirical spectral densities enables model selection and parameter inference, using divergences in the spectral domain (e.g., distances between kernel-smoothed spectral densities), agnostic to the graph generation model (Santos et al., 2021).
- Graph Signal Processing: Spectral domain sampling, Laplacian pyramids, and multiscale analysis rely on predictable convergence of frequency components under graph sampling and resizing (Tanaka, 2017).
- Fast Sampling Algorithms: The spectral independence framework, extended beyond bounded degree settings, leverages spectral properties to design fast approximate sampling schemes for models such as the hard-core, matching, and Ising models (Bezáková et al., 2021).
- Generalization and Learning: For invariant graph networks (IGNs), spectral convergence and operator stability inform the consistency of learning algorithms under graphon and stochastic sampling, with strong results for the class “IGN-small” post edge-smoothing (Cai et al., 2022).
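As a concrete instance of the community-detection item above, a minimal two-block sketch (the step-graphon parameters and all names are illustrative assumptions): the sign pattern of the second eigenvector of the normalized Laplacian recovers the planted partition with accuracy tending to one.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 600
labels = np.repeat([0, 1], n // 2)            # planted two-community structure
P = np.where(labels[:, None] == labels[None, :], 0.10, 0.02)  # step graphon
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T                # symmetric adjacency, no self-loops

deg = A.sum(axis=1)
L = np.eye(n) - A / np.sqrt(deg[:, None] * deg[None, :])  # normalized Laplacian
w, V = np.linalg.eigh(L)                      # eigenvalues in ascending order
pred = (V[:, 1] > 0).astype(int)              # sign of the 2nd eigenvector

acc = max(np.mean(pred == labels), np.mean(pred != labels))  # up to label swap
print("clustering accuracy:", acc)            # -> 1 as n grows (consistency)
```

The consistency guarantee here is exactly the spectral-convergence statement: the second eigenpair of the sampled Laplacian converges to that of the limiting step-graphon operator, whose eigenfunction is piecewise constant on the two blocks.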
6. Extensions, Open Problems, and Limitations
Several open directions and caveats remain:
- Intermediate-density limit objects (“graphonings”) and their completeness as universal limit objects remain unresolved (Frenkel, 2016).
- Outliers and extreme eigenvalues in sparse settings are controlled when strong convergence holds, rather than mere convergence in distribution; in its absence, outlier phenomena can occur (Bordenave, 11 Oct 2025).
- Precise spectral convergence results for non-i.i.d. sampling, edge-weighted and directed graphs, or graphs with complex dependence structures (e.g., heavy-tailed degrees or hypergraphs) remain open.
- For signal sampling and recovery in the graphon regime, the identification of necessary and sufficient conditions for uniqueness sets and the extension to dynamic or evolving graphs is ongoing (Le et al., 2023).
- Computational bottlenecks in constructing and diagonalizing large Laplacians or adjacency matrices demand efficient approximations, which are an active area of algorithmic research (Santos et al., 2021).
7. Representative Mathematical Formulations and Tables
A subset of formulations representative of the above developments includes:
| Context | Spectral Convergence Rate | Limiting Object |
|---|---|---|
| Random geometric graph ($m$-manifold) | Polynomial in the bandwidth $\varepsilon$, up to log factors | Laplace–Beltrami operator |
| $\varepsilon$- and $k$-NN graphs | Polynomial in $n^{-1/m}$ at optimal bandwidth | Laplace–Beltrami / weighted Laplacian |
| Random regular graph, diverging degree | Weak convergence for any $d \to \infty$ | Semicircle law |
| Configuration model, $d_{\min} \to \infty$ | Moments converge to Catalan numbers | Semicircle law |
| Benjamini–Schramm (BS) convergence | Weak convergence of empirical measures | Spectral measure of the limit object |
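Two standard displays behind the table rows, included for reference (well-known formulas; the constant $C_m$ below is a schematic kernel-dependent normalization): the Kesten–McKay density for random $d$-regular graphs, and the pointwise form of graph-Laplacian consistency on an $m$-manifold.

```latex
\rho_d(x) \;=\; \frac{d\,\sqrt{4(d-1) - x^2}}{2\pi\,(d^2 - x^2)},
\qquad |x| \le 2\sqrt{d-1};

\frac{C_m}{n\,\varepsilon^{m+2}} \sum_{j=1}^{n}
  \mathbf{1}\{\mathrm{dist}(x_i, x_j) \le \varepsilon\}\,
  \bigl(f(x_i) - f(x_j)\bigr)
\;\longrightarrow\; \Delta_{\mathcal{M}} f(x_i)
\quad (n \to \infty,\ \varepsilon \to 0).
```

The second display uses the sign convention $\Delta_{\mathcal{M}} = -\mathrm{div}\,\mathrm{grad}$ and assumes uniform sampling density; nonuniform densities contribute an additional drift or weight term, as discussed in Section 3.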
Spectral convergence of sampled graphs thus provides a rigorous bridge between random, geometric, or combinatorial sampling procedures and the asymptotic spectral and functional properties of their continuous, infinite, or limiting analogues. These results underpin the reliability of spectral algorithms, reveal universality phenomena, and guide the design of sampling and inference strategies for large-scale graph-based data analysis.