
Achieving the Bayes Error Rate in Synchronization and Block Models by SDP, Robustly (1904.09635v1)

Published 21 Apr 2019 in math.ST, cs.IT, cs.LG, math.IT, math.OC, stat.ML, and stat.TH

Abstract: We study the statistical performance of semidefinite programming (SDP) relaxations for clustering under random graph models. Under the $\mathbb{Z}_{2}$ Synchronization model, the Censored Block Model, and the Stochastic Block Model, we show that SDP achieves an error rate of the form \[ \exp\Big[-\big(1-o(1)\big)\,\bar{n}\, I^* \Big]. \] Here $\bar{n}$ is an appropriate multiple of the number of nodes and $I^*$ is an information-theoretic measure of the signal-to-noise ratio. We provide matching lower bounds on the Bayes error for each model and therefore demonstrate that the SDP approach is Bayes optimal. As a corollary, our results imply that SDP achieves the optimal exact recovery threshold under each model. Furthermore, we show that SDP is robust: the above bound remains valid under semirandom versions of the models in which the observed graph is modified by a monotone adversary. Our proof is based on a novel primal-dual analysis of SDP under a unified framework for all three models, and the analysis shows that SDP tightly approximates a joint majority voting procedure.
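The abstract does not spell out the SDP relaxation itself. For readers unfamiliar with the approach, the sketch below shows the standard SDP commonly used for $\mathbb{Z}_{2}$ synchronization (maximize $\langle A, X\rangle$ over positive semidefinite matrices with unit diagonal, then round to $\pm 1$ labels). The observation model, the noise level, and the eigenvector rounding step are illustrative assumptions, not details taken from this paper.

```python
# Illustrative sketch (not the paper's exact formulation): the standard SDP
# relaxation often used for Z2 synchronization, written with cvxpy.
# The observation matrix A, noise level sigma, and rounding scheme below are
# assumptions made for this example only.
import numpy as np
import cvxpy as cp

n = 50
rng = np.random.default_rng(0)
z = rng.choice([-1.0, 1.0], size=n)            # ground-truth +/-1 labels
sigma = 2.0
W = rng.standard_normal((n, n))
W = (W + W.T) / np.sqrt(2)                     # symmetric Gaussian noise
A = np.outer(z, z) / n + (sigma / n) * W       # noisy rank-one observation

# SDP relaxation: maximize <A, X> over PSD matrices with unit diagonal.
X = cp.Variable((n, n), PSD=True)
prob = cp.Problem(cp.Maximize(cp.trace(A @ X)), [cp.diag(X) == 1])
prob.solve()

# Round the SDP solution to +/-1 labels via its top eigenvector.
eigvals, eigvecs = np.linalg.eigh(X.value)
z_hat = np.sign(eigvecs[:, -1])

# Misclassification rate up to a global sign flip.
err = min(np.mean(z_hat != z), np.mean(z_hat != -z))
print(f"misclassification rate: {err:.3f}")
```

The paper's contribution concerns the error rate such relaxations attain (matching the Bayes error) and their robustness to monotone adversaries, rather than the formulation itself.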

Citations (22)
