Graphon-Level Bayesian Predictive Synthesis
- The paper introduces a graphon-level Bayesian predictive synthesis method that optimally aggregates network models through an L2 projection, achieving minimax optimality and oracle inequalities.
- It employs a least-squares projection to combine multiple agent graphon models, transferring estimation error explicitly to network properties like edge and triangle densities.
- The approach demonstrates a 'combination beats components' phenomenon while preserving key characteristics in heavy-tailed degree distributions and ERGM settings.
Graphon-level Bayesian Predictive Synthesis (BPS) formalizes the combination of predictive distributions from multiple agents at the level of random graph limit objects known as graphons, offering a decision-theoretic method for aggregating network models. Framed as an $L^2$ projection, graphon-level BPS achieves minimax optimality, oracle inequalities, and precise transfer of estimation error to network structural properties, while also encompassing behaviors relevant for heavy-tailed degree distributions and exponential random graph model (ERGM) families (Papamichalis et al., 21 Dec 2025).
1. Foundational Framework: Graphons, Agent Models, and Synthesis
A graphon is a symmetric, measurable function $W \colon [0,1]^2 \to [0,1]$ defining an infinite exchangeable random network via the Aldous–Hoover representation: latent variables $U_1, U_2, \ldots \overset{\text{i.i.d.}}{\sim} \mathrm{Unif}[0,1]$, with each edge $\{i,j\}$ included independently with probability $W(U_i, U_j)$. Each agent model $W_j$, $j = 1, \ldots, J$, specifies an alternative random graph law converging (in cut-distance) to $W_j$ as the number of nodes $n \to \infty$.
The synthesis objective is to construct a single synthesized graphon in the affine span of the agents,

$$W_\alpha(u,v) = \alpha_0 + \sum_{j=1}^{J} \alpha_j W_j(u,v),$$

with $\alpha = (\alpha_0, \alpha_1, \ldots, \alpha_J) \in \mathbb{R}^{J+1}$, optimal in $L^2$ distance to the unknown true graphon $W_0$:

$$\alpha^\star = \arg\min_{\alpha \in \mathbb{R}^{J+1}} \|W_\alpha - W_0\|_{L^2}^2,$$

where $\|W\|_{L^2}^2 = \int_0^1 \int_0^1 W(u,v)^2 \, du \, dv$, defining $W_{\alpha^\star}$ as the synthesized graphon.
2. Least-Squares Projection and Synthesis Algorithm
Define the $(J+1)$-dimensional feature vector

$$F(u,v) = \bigl(1,\, W_1(u,v),\, \ldots,\, W_J(u,v)\bigr)^\top.$$

Let $(U,V) \sim \mathrm{Unif}[0,1]^2$; then the population Gram matrix and cross-moment vector are

$$G = \mathbb{E}\bigl[F(U,V)\,F(U,V)^\top\bigr], \qquad b = \mathbb{E}\bigl[F(U,V)\,W_0(U,V)\bigr].$$

The risk of a linear combination $\alpha^\top F$ is

$$R(\alpha) = \mathbb{E}\bigl[\bigl(W_0(U,V) - \alpha^\top F(U,V)\bigr)^2\bigr] = \alpha^\top G \alpha - 2\,\alpha^\top b + \mathbb{E}\bigl[W_0(U,V)^2\bigr].$$

When $G$ is invertible, the unique risk minimizer is

$$\alpha^\star = G^{-1} b,$$

yielding the synthesized graphon $W_{\alpha^\star}$ as an orthogonal projection of $W_0$ onto $\mathrm{span}\{1, W_1, \ldots, W_J\}$. In empirical settings, $n$ sampled edges yield empirical Gram matrices and vectors for finite-sample least-squares estimation:

$$\hat{G}_n = \frac{1}{n} \sum_{i=1}^{n} F(U_i,V_i)\,F(U_i,V_i)^\top, \qquad \hat{b}_n = \frac{1}{n} \sum_{i=1}^{n} F(U_i,V_i)\,Y_i, \qquad \hat{\alpha}_n = \hat{G}_n^{-1} \hat{b}_n,$$

where $(U_i, V_i) \overset{\text{i.i.d.}}{\sim} \mathrm{Unif}[0,1]^2$ and $Y_i \in \{0,1\}$ is the corresponding sampled edge indicator with $\mathbb{E}[Y_i \mid U_i, V_i] = W_0(U_i, V_i)$.
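As a concrete check on the mechanics of this empirical least-squares step, the following is a minimal numerical sketch. The two agent graphons, the "true" graphon, and the sample size are illustrative assumptions for demonstration, not choices from the paper:

```python
import numpy as np

# Illustrative sketch of empirical least-squares graphon-BPS.
# Agent graphons and the "true" graphon below are made up for demonstration.
rng = np.random.default_rng(0)

W1 = lambda u, v: (u + v) / 2        # additive agent graphon
W2 = lambda u, v: u * v              # multiplicative agent graphon
W0 = lambda u, v: 0.3 * W1(u, v) + 0.7 * W2(u, v)  # truth in the affine span

n = 200_000
U, V = rng.random(n), rng.random(n)
Y = (rng.random(n) < W0(U, V)).astype(float)  # sampled edge indicators

# Feature vector F = (1, W1, W2); empirical Gram matrix and cross-moment vector.
F = np.column_stack([np.ones(n), W1(U, V), W2(U, V)])
G_hat = F.T @ F / n
b_hat = F.T @ Y / n
alpha_hat = np.linalg.solve(G_hat, b_hat)  # least-squares synthesis weights

print(alpha_hat)  # close to (0, 0.3, 0.7) up to sampling noise
```

Because the truth here lies exactly in the span of the agents, the approximation error is zero and the recovered weights approach $(0, 0.3, 0.7)$ at the parametric rate as $n$ grows.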
3. Nonasymptotic Guarantees and Minimax Optimality
Sampling $n$ i.i.d. edges, the empirical least-squares estimator $\hat{\alpha}_n$ achieves the following oracle inequality:

$$\mathbb{E}\,\|W_{\hat{\alpha}_n} - W_0\|_{L^2}^2 \le \inf_{\alpha \in \mathbb{R}^{J+1}} \|W_\alpha - W_0\|_{L^2}^2 + C\,\frac{J+1}{n},$$

where $J$ is the number of agents and $C$ is a constant under mild conditions. This separates the intrinsic approximation error from the sample-size-dependent estimation term.
For coefficient vectors $\alpha$ in a ball of radius $r$ in $\mathbb{R}^{J+1}$, the minimax rate

$$\inf_{\hat{W}} \sup_{\|\alpha\| \le r} \mathbb{E}\,\|\hat{W} - W_\alpha\|_{L^2}^2 \asymp \frac{J}{n}$$

holds, showing minimax-rate optimality for least-squares BPS.
4. Combination-Beats-Components Phenomenon
For $W_0$ lying in the convex hull $\mathrm{conv}\{W_1, \ldots, W_J\}$, any estimator that selects a single $W_j$ (single-agent selection) incurs a uniform $L^2$ error lower-bounded by some constant $c > 0$, independent of $n$, and thus does not converge. In contrast, least-squares graphon-BPS achieves error of order $(J+1)/n \to 0$, strictly outperforming all individual agent models. This result formalizes a "combination beats components" effect intrinsic to the BPS framework.
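A tiny grid-based illustration of this gap (the graphons and the discretization are illustrative assumptions, not taken from the paper): when the truth sits strictly inside the convex hull of two agents, each single agent carries a constant $L^2$ error, while the right combination has none.

```python
import numpy as np

# Evaluate graphons on a grid and compare L2 errors of single agents vs. the
# convex combination. All graphons here are illustrative.
u = np.linspace(0, 1, 400)
U, V = np.meshgrid(u, u)

W1 = np.full_like(U, 0.2)       # constant agent graphon
W2 = U * V                      # multiplicative agent graphon
W0 = 0.5 * W1 + 0.5 * W2        # truth inside conv{W1, W2}

l2 = lambda A: np.sqrt(np.mean(A ** 2))     # grid approximation of the L2 norm

err_single = min(l2(W1 - W0), l2(W2 - W0))  # best single-agent error: stays > 0
err_combo = l2(0.5 * W1 + 0.5 * W2 - W0)    # combination error: exactly 0
print(err_single, err_combo)
```

No amount of extra data fixes the single-agent error here, since it is an approximation gap rather than an estimation one.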
5. Lipschitz Transfer of Graphon Error to Network Properties
Key graphon functionals include:
- Edge density: $e(W) = \int_0^1 \int_0^1 W(u,v)\, du\, dv$
- Degree function: $d_W(u) = \int_0^1 W(u,v)\, dv$
- Triangle density: $t(W) = \int_{[0,1]^3} W(u,v)\,W(v,w)\,W(w,u)\, du\, dv\, dw$
- Wedge density: $s(W) = \int_{[0,1]^3} W(u,v)\,W(v,w)\, du\, dv\, dw$
- Clustering coefficient: $c(W) = t(W)/s(W)$ (if $s(W) > 0$)
The $L^1$-Lipschitz theorem provides, for any two graphons $W, W'$ taking values in $[0,1]$:
- $|e(W) - e(W')| \le \|W - W'\|_{L^1}$, $|t(W) - t(W')| \le 3\,\|W - W'\|_{L^1}$, and $|s(W) - s(W')| \le 2\,\|W - W'\|_{L^1}$.
- For $\min\{s(W), s(W')\} \ge s_0 > 0$, the clustering coefficients satisfy $|c(W) - c(W')| \le C(s_0)\,\|W - W'\|_{L^1}$.

A direct corollary is that $L^2$-risk control at the graphon level (via $\|\cdot\|_{L^1} \le \|\cdot\|_{L^2}$ on the probability space $[0,1]^2$) yields explicit, quantitative control of errors in network summaries such as edge density, degree distributions, clustering coefficients, and phase transition points for giant components.
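The functionals and transfer bounds above can be checked numerically on a grid discretization. This is a sketch under the assumption that graphons are represented as $m \times m$ matrices; the graphons themselves are illustrative:

```python
import numpy as np

# Grid approximations of edge, triangle, and wedge densities, plus a numerical
# check of the L1-Lipschitz transfer bounds. Graphons here are illustrative.
m = 200
u = np.linspace(0, 1, m)
U, V = np.meshgrid(u, u)

def functionals(W):
    e = W.mean()                       # edge density
    t = np.trace(W @ W @ W) / m ** 3   # triangle density
    s = (W @ W).sum() / m ** 3         # wedge density
    return e, t, s

W = (U + V) / 4 + 0.25
Wp = U * V

(eW, tW, sW), (eWp, tWp, sWp) = functionals(W), functionals(Wp)
l1 = np.abs(W - Wp).mean()             # ||W - W'||_{L1} on the grid

print(abs(eW - eWp) <= l1)             # edge-density bound: True
print(abs(tW - tWp) <= 3 * l1)         # triangle-density bound: True
print(abs(sW - sWp) <= 2 * l1)         # wedge-density bound: True
```

The matrix products exploit the fact that on a uniform grid the triangle and wedge integrals reduce to traces and sums of powers of the discretized kernel.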
6. Heavy-Tailed Degree Distributions and Entropic Tilting
Suppose some agents in the mixture possess heavy-tailed degree distributions with tail exponents $\gamma_j$, i.e. $\mathbb{P}(D_j > d) \asymp d^{-\gamma_j}$. If the BPS mixture assigns positive weight to at least one agent of minimal exponent $\gamma_{\min} = \min_j \gamma_j$, then the combined degree distribution follows

$$\mathbb{P}(D > d) \asymp d^{-\gamma_{\min}},$$

i.e., the slowest-decaying power law dominates.
For a degree-tilted edge law $p_h(d) \propto h(d)\,p_0(d)$, where the base law $p_0$ has tail exponent $\gamma$:
- If the tilt $h$ is slowly varying (index $0$), the tail exponent is preserved.
- If $h$ is regularly varying with index $\beta$, the tail exponent shifts to $\gamma - \beta$.
- For polynomially controlled tilts, $c_- d^{\beta_-} \le h(d) \le c_+ d^{\beta_+}$, the degree tail is sandwiched between power laws of exponents $\gamma - \beta_-$ and $\gamma - \beta_+$.
- Entropic tilts using bounded graph statistics, via $p_\psi(G) \propto \exp\bigl(\psi^\top T(G)\bigr)\,p(G)$ with $\sup_G \|T(G)\| < \infty$, leave the exponent unchanged: $\gamma_\psi = \gamma$.
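The dominance of the slowest-decaying power law can be seen directly in a two-component mixture of tails (exponents and weights below are illustrative numbers, not from the paper):

```python
import numpy as np

# Tail of a two-component power-law mixture:
# P(D > d) = w * d^{-g1} + (1 - w) * d^{-g2}, with g1 < g2.
g1, g2, w = 1.5, 3.0, 0.3
d = np.logspace(1, 6, 6)
tail = w * d ** -g1 + (1 - w) * d ** -g2

# Local log-log slope of the survival function; it converges to -g1,
# i.e. the minimal exponent governs the mixture tail.
slope = np.diff(np.log(tail)) / np.diff(np.log(d))
print(slope)  # approaches -1.5 as d grows
```

The heavier component dominates for large $d$ regardless of how small its mixture weight $w > 0$ is; the weight only shifts where the crossover occurs.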
7. Closure Under Log-Linear Pooling: Exponential Random Graph Models
If agent models are ERGMs with sufficient statistics $T_j(G)$ and natural parameters $\theta_j$, a log-linear BPS pool with weights $w_j$,

$$p_{\mathrm{pool}}(G) \propto \prod_{j=1}^{J} p_j(G)^{w_j} \propto \exp\Bigl(\sum_{j=1}^{J} w_j\,\theta_j^\top T_j(G)\Bigr),$$

preserves the ERGM form, now acting on the stacked statistic $T(G) = (T_1(G), \ldots, T_J(G))$ with new parameter blocks $(w_1\theta_1, \ldots, w_J\theta_J)$. This ensures compatibility and representational coherence within the log-linear Bayesian pooling inventory.
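A minimal worked instance of this closure (with hypothetical numbers): an ERGM whose only sufficient statistic is the edge count is the $G(n,p)$ model with natural parameter $\theta = \operatorname{logit}(p)$, so log-linear pooling of two such models reduces to averaging logits.

```python
import numpy as np

# Log-linear pooling of two edge-only ERGMs, i.e. G(n, p) models with natural
# parameter theta = logit(p). Weights and probabilities are illustrative.
logit = lambda p: np.log(p / (1 - p))
expit = lambda t: 1 / (1 + np.exp(-t))

p1, p2 = 0.1, 0.6
w1, w2 = 0.5, 0.5

theta_pool = w1 * logit(p1) + w2 * logit(p2)  # pooled natural parameter
p_pool = expit(theta_pool)                    # pooled edge probability

# Cross-check against geometric pooling of the Bernoulli edge laws:
# p_pool should equal p1^w1 p2^w2 normalized against (1-p1)^w1 (1-p2)^w2.
num = p1 ** w1 * p2 ** w2
den = (1 - p1) ** w1 * (1 - p2) ** w2
print(p_pool, num / (num + den))  # the two values agree
```

The same weighted-parameter structure carries over unchanged when the statistics are richer (triangles, stars, etc.); only the normalizing constant becomes intractable.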
Summary Table: Synthesis Properties and Guarantees
| Property | BPS Result | Comparative Statement |
|---|---|---|
| Risk minimization | $L^2$ projection, least-squares solution | Minimax optimal in class |
| Oracle inequality | Yes: separation of approximation and estimation error | Matches parametric rate |
| Combination vs. components | Combination strictly beats any single component | Single-agent is inconsistent |
| Structural error transfer | Explicit Lipschitz bounds linking $L^1$/$L^2$ graphon error to error in network summaries | Not available for selectors |
| Heavy-tail preservation | Mixture tail matches slowest-decaying agent | Dominance vs. components |
| Closure for ERGMs under pooling | Log-linear pooling preserves ERGM form | Ensures model class integrity |
References
All content is summarized from "Graphon-Level Bayesian Predictive Synthesis for Random Network" (Papamichalis et al., 21 Dec 2025).