
Bayesian estimation from few samples: community detection and related problems (1710.00264v1)

Published 30 Sep 2017 in cs.DS, cs.CC, cs.LG, math.PR, and stat.ML

Abstract: We propose an efficient meta-algorithm for Bayesian estimation problems that is based on low-degree polynomials, semidefinite programming, and tensor decomposition. The algorithm is inspired by recent lower bound constructions for sum-of-squares and related to the method of moments. Our focus is on sample complexity bounds that are as tight as possible (up to additive lower-order terms) and often achieve statistical thresholds or conjectured computational thresholds. Our algorithm recovers the best known bounds for community detection in the sparse stochastic block model, a widely-studied class of estimation problems for community detection in graphs. We obtain the first recovery guarantees for the mixed-membership stochastic block model (Airoldi et al.) in constant average degree graphs---up to what we conjecture to be the computational threshold for this model. We show that our algorithm exhibits a sharp computational threshold for the stochastic block model with multiple communities beyond the Kesten--Stigum bound---giving evidence that this task may require exponential time. The basic strategy of our algorithm is strikingly simple: we compute the best-possible low-degree approximation for the moments of the posterior distribution of the parameters and use a robust tensor decomposition algorithm to recover the parameters from these approximate posterior moments.
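The final sentence describes a two-step method-of-moments pipeline: first approximate the low-order moments of the posterior distribution over the hidden parameters, then recover the parameters from that moment tensor with tensor decomposition. As a rough illustration of the second step only, the sketch below applies standard tensor power iteration with deflation to a synthetic orthogonal third-moment tensor. This is a textbook primitive (not the paper's robust decomposition algorithm), and all names, dimensions, and weights here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def tensor_power_step(T, v):
    # One power-iteration step: v <- T(I, v, v) / ||T(I, v, v)||
    w = np.einsum('ijk,j,k->i', T, v, v)
    return w / np.linalg.norm(w)

def top_component(T, n_iters=200):
    # Recover one rank-1 component (eigenvalue, eigenvector) of a
    # symmetric 3-tensor by repeated power steps from a random start.
    v = rng.standard_normal(T.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(n_iters):
        v = tensor_power_step(T, v)
    lam = np.einsum('ijk,i,j,k->', T, v, v, v)
    return lam, v

# Synthetic stand-in for an (approximate) posterior third-moment tensor:
# a weighted sum of rank-1 cubes of two orthogonal component vectors.
d = 5
u1, u2 = np.eye(d)[0], np.eye(d)[1]
T = 2.0 * np.einsum('i,j,k->ijk', u1, u1, u1) \
  + 1.0 * np.einsum('i,j,k->ijk', u2, u2, u2)

# Peel off components one at a time by deflation.
lam1, v1 = top_component(T)
lam2, v2 = top_component(T - lam1 * np.einsum('i,j,k->ijk', v1, v1, v1))

for lam, v in [(lam1, v1), (lam2, v2)]:
    align = max(abs(v @ u1), abs(v @ u2))
    print(f"eigenvalue {lam:.3f}, alignment with a planted component: {align:.3f}")
```

With random initialization the iteration converges to one of the planted components (which one depends on the start), and deflation removes it so the second run finds the other; part of the paper's contribution is making this recovery step robust to the error between the true posterior moments and their low-degree approximations.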

Citations (61)
