
Symmetric Random Scan Gibbs Sampler

Updated 17 January 2026
  • The symmetric random scan Gibbs sampler is an MCMC method that updates one coordinate chosen uniformly at random according to its full conditional distribution.
  • It is reversible (self-adjoint in $L^2(\pi)$) and has robust mixing properties, with spectral gap solidarity ensuring convergence behavior comparable, up to polynomial factors, to systematic scans.
  • Practical implementations scale to high-dimensional Bayesian models through efficient variable selection and optimized dynamic scan orders.

The symmetric random scan Gibbs sampler is a Markov chain Monte Carlo (MCMC) algorithm designed for sampling from multivariate distributions defined on product spaces. It operates by selecting coordinates uniformly at random and updating only the chosen coordinate according to its full conditional distribution, leaving all other coordinates fixed. This scan strategy, in contrast to systematic (cyclic) scans, offers robustness and favorable theoretical properties with respect to spectral gap and mixing time. The symmetric random scan variant is especially notable for its analytical reversibility and for exhibiting "solidarity" of spectral gap with deterministic scans—if any scan type has a positive spectral gap, so do all others (Chlebicka et al., 2023).
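As a concrete illustration, here is a minimal sketch of the update rule for a bivariate Gaussian target, where both full conditionals are available in closed form; the correlation value and chain length are illustrative choices, not prescribed by any of the cited works:

```python
import numpy as np

def random_scan_gibbs(rho, n_iter, rng):
    """Symmetric random scan Gibbs sampler for a bivariate
    N(0, [[1, rho], [rho, 1]]) target.  Each step picks a coordinate
    i in {0, 1} uniformly at random and resamples it from its full
    conditional  x_i | x_{-i} ~ N(rho * x_{-i}, 1 - rho**2),
    leaving the other coordinate fixed."""
    x = np.zeros(2)
    samples = np.empty((n_iter, 2))
    for t in range(n_iter):
        i = rng.integers(2)  # uniform random coordinate choice
        x[i] = rng.normal(rho * x[1 - i], np.sqrt(1 - rho**2))
        samples[t] = x
    return samples

rng = np.random.default_rng(0)
s = random_scan_gibbs(0.8, 20000, rng)
print(s.mean(axis=0), np.corrcoef(s.T)[0, 1])
```

The empirical mean and correlation of the chain should approach those of the target (0 and 0.8, respectively) as the chain length grows.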

1. Mathematical Formulation and Operator Perspective

Given a target probability distribution $\pi$ defined on $\mathcal X = \mathcal X_1 \times \cdots \times \mathcal X_n$, the symmetric random scan Gibbs sampler is analyzed through orthogonal projections in the Hilbert space $L^2(\pi)$. Define, for each coordinate $i$, the projection operator
$$\mathsf P_i f(\mathbf x) = \mathbb E_{Y \sim \pi}\bigl[f(Y)\mid Y_{-i} = \mathbf x_{-i}\bigr].$$
Each $\mathsf P_i$ is an orthogonal projection onto the subspace of functions constant in the $i$-th coordinate. The transition operator for the symmetric random scan (sometimes called symmetric Glauber dynamics) is

$$\mathsf P_{\mathrm{GD}} = \frac{1}{n} \sum_{i=1}^n \mathsf P_i,$$

which is self-adjoint and reversible with respect to $\pi$ (Gaitonde et al., 2024).
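Reversibility of $\mathsf P_{\mathrm{GD}}$ can be checked directly on a toy discrete target; the two-coordinate binary state space and the weights below are illustrative:

```python
import numpy as np
from itertools import product

# Toy target pi on {0,1}^2 with arbitrary positive weights.
states = list(product([0, 1], repeat=2))
pi = np.array([0.1, 0.2, 0.3, 0.4])

def conditional_kernel(i):
    """Transition matrix of the i-th coordinate update: resample
    coordinate i from pi(x_i | x_{-i}), leaving x_{-i} fixed."""
    P = np.zeros((4, 4))
    for a, x in enumerate(states):
        # total pi-mass of the slice where the other coordinate is fixed
        mass = sum(pi[c] for c, z in enumerate(states)
                   if z[1 - i] == x[1 - i])
        for b, y in enumerate(states):
            if x[1 - i] == y[1 - i]:  # other coordinate unchanged
                P[a, b] = pi[b] / mass
    return P

# Symmetric random scan: uniform mixture of the coordinate updates.
P_GD = 0.5 * (conditional_kernel(0) + conditional_kernel(1))

flow = np.diag(pi) @ P_GD  # detailed-balance flows pi(x) P(x, y)
print(np.allclose(flow, flow.T))  # symmetric <=> reversible w.r.t. pi
```

Each coordinate update is itself reversible, and a uniform mixture of reversible kernels with the same stationary law is reversible, which the symmetry of the flow matrix confirms numerically.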

2. Spectral Gap Solidarity and Geometric Interpretation

A key theoretical advance is the "solidarity" property of spectral gaps: if any random scan or deterministic (cyclic) scan Gibbs sampler has a positive spectral gap in $L^2(\pi)$, then all do (Chlebicka et al., 2023). The spectral gap of the symmetric random scan is measured as

$$\mathrm{Gap}(\mathsf P_{\mathrm{GD}}) = 1 - \bigl\|\mathsf P_{\mathrm{GD}} - \Pi\bigr\|_{L^2(\pi)},$$

where $\Pi$ is the orthogonal projection onto the constant functions.

This equivalence arises via geometric analysis of the spectral contraction induced by alternating projection algorithms (von Neumann–Halperin cyclic projections) and quantification of the generalized Friedrichs angle between the subspaces of $L^2(\pi)$ associated with each coordinate. Convergence rates are determined by operator norms and inclination parameters characterizing how singular or correlated the conditionals are (Chlebicka et al., 2023).

3. Comparative Mixing Time Analysis

Mixing times, quantified in total variation distance, are polynomially related between the symmetric random scan and systematic scans. Gaitonde et al. (2024) prove a sharp bound on the spectral gap of a full systematic cycle $\mathsf P_{\sigma(n)} \cdots \mathsf P_{\sigma(1)}$, valid for any permutation $\sigma$ of the coordinates, in terms of the spectral gap of the symmetric random scan, together with a corresponding polynomial relation between the two mixing times. In particular, a fast-mixing scan order implies that the symmetric random scan mixes within a polynomial-factor loss (Gaitonde et al., 2024). The order of the polynomial factor (up to lower-order terms) is proven sharp using hard-core model examples on the complete graph due to Roberts–Rosenthal.

Counterexamples show that each scan strategy can be polynomially slower or faster than the other, depending on state-space structure or scan order, disproving folklore conjectures that the separation is at most constant or logarithmic (He et al., 2016).
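On a small example the total-variation mixing times of the two strategies can be computed exactly by powering the transition matrices. Note that one systematic sweep updates every coordinate while one random scan step updates a single coordinate, so a fair per-update comparison would rescale by the dimension; the target weights are illustrative:

```python
import numpy as np
from itertools import product

states = list(product([0, 1], repeat=2))
pi = np.array([0.1, 0.2, 0.3, 0.4])

def conditional_kernel(i):
    """Resample coordinate i from pi(x_i | x_{-i})."""
    P = np.zeros((4, 4))
    for a, x in enumerate(states):
        mass = sum(pi[c] for c, z in enumerate(states)
                   if z[1 - i] == x[1 - i])
        for b, y in enumerate(states):
            if x[1 - i] == y[1 - i]:
                P[a, b] = pi[b] / mass
    return P

K0, K1 = conditional_kernel(0), conditional_kernel(1)
P_rs, P_sys = 0.5 * (K0 + K1), K0 @ K1

def t_mix(P, eps=0.25, t_max=200):
    """Smallest t with worst-case TV distance to pi at most eps."""
    Pt = np.eye(4)
    for t in range(1, t_max + 1):
        Pt = Pt @ P
        if 0.5 * np.abs(Pt - pi).sum(axis=1).max() <= eps:
            return t
    return t_max

print(t_mix(P_rs), t_mix(P_sys))
```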

4. Hierarchical Structure and Spectral Telescope Bounds

The symmetric random scan Gibbs sampler admits a hierarchical structure: updating a subset of coordinates can be recursively interpreted as nesting conditional samplers on lower-dimensional subspaces. The "spectral telescope" formalism yields product-form lower bounds on the spectral gap of the full sampler in terms of the gaps of these nested conditional samplers. Further lower bounds exploit correlation structure, comparison to random walks, and spectral independence (influence matrices) via Wasserstein contraction, achieving order-tight bounds in canonical examples (e.g., the uniform distribution on the simplex corner) (Qin et al., 2022).

5. Convergence Guarantees and Ergodicity

Under Poincaré or log-Sobolev inequalities for the target, combined with regularity assumptions (e.g., total-variation continuity of the conditionals), the symmetric random scan Gibbs sampler exhibits mixing time polynomial in the dimension, with a constant governing conditional regularity entering the bound (Goyal et al., 27 Jun 2025). In two-coordinate settings, the symmetric random scan is geometrically ergodic when suitable Lyapunov drift conditions hold, with explicit drift constants related to birth–death chain parameters (Tan et al., 2012). The spectral gap and long-run central limit theorem variances can be computed explicitly through two-projection theory (Qin, 2022; Qin et al., 2020).

6. Practical Implementation and Variants

Recent scalable implementations leverage the symmetric random scan for variable selection in high-dimensional Bayesian models, with data-informed proposal weights concentrating updates on likely signals and a uniform component maintaining irreducibility. Storage and per-iteration computational costs can be kept low relative to the model size, allowing exact posterior sampling at scale (Chung, 10 Jan 2026). Detailed balance and aperiodicity are immediate under uniform randomness and positive proposal weights.
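A sketch of such a data-informed selection rule, assuming hypothetical per-coordinate scores (e.g., marginal screening statistics; the score values and mixing weight below are illustrative, not taken from the cited work). Mixing in a uniform component keeps every coordinate's selection probability positive, and since the selection weights are fixed (state-independent), the mixture of per-coordinate reversible updates remains reversible:

```python
import numpy as np

def select_coordinate(scores, eps, rng):
    """Choose the coordinate to update from a mixture of data-informed
    weights (proportional to `scores`) and a uniform component with
    mass eps, so every coordinate stays selectable (irreducibility)."""
    p = len(scores)
    informed = scores / scores.sum()
    weights = (1 - eps) * informed + eps * np.ones(p) / p
    return rng.choice(p, p=weights), weights

rng = np.random.default_rng(1)
scores = np.array([5.0, 0.1, 0.1, 0.1])  # hypothetical signal scores
i, w = select_coordinate(scores, 0.1, rng)
print(i, w)
```

The chosen coordinate `i` would then be resampled from its full conditional exactly as in the uniform-scan case.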

Advanced scheduling and scan optimization, such as Dobrushin-optimized Gibbs sampling (DoGS), dynamically tailor scan orders to minimize explicit total-variation bounds, guaranteeing strictly improved finite-time accuracy relative to unoptimized symmetric scans (Mitliagkas et al., 2017).

7. Methodological and Algorithmic Implications

Practitioners can reliably select the symmetric random scan for robustness against a poor choice of scan order, assured by theoretical solidarity: if any scan order mixes quickly, the symmetric random scan also mixes within polynomial bounds. Conversely, systematic scans may be preferable where locality or hardware factors dominate, though their worst-case mixing can be polynomially worse (Gaitonde et al., 2024; He et al., 2016).

The geometric perspective—alternating projections on the Hilbert space of square-integrable functions—provides foundational operator-theoretic tools for analyzing scan order effects and designing new variants with provable mixing guarantees.

