
On the Low-Temperature MCMC threshold: the cases of sparse tensor PCA, sparse regression, and a geometric rule (2408.00746v2)

Published 1 Aug 2024 in math.ST, cs.DS, math.PR, and stat.TH

Abstract: In recent years, there has been a significant amount of work studying the power of specific classes of computationally efficient estimators for multiple statistical parametric estimation tasks, including the classes of low-degree polynomials, spectral methods, and others. Despite this progress, the important class of MCMC methods remains poorly understood. For instance, for many models of interest, the performance of even zero-temperature (greedy-like) MCMC methods that simply maximize the posterior remains elusive. In this work, we provide an easy-to-check condition under which the low-temperature Metropolis chain maximizes the posterior in polynomial time with high probability. The result is generally applicable, and in this work we use it to derive positive MCMC results for two classical sparse estimation tasks: the sparse tensor PCA model and sparse regression. Interestingly, in both cases we also leverage the Overlap Gap Property framework for inference (Gamarnik, Zadik AoS '22) to prove that our results are tight: no low-temperature local MCMC method can achieve better performance. In particular, our work identifies the "low-temperature (local) MCMC threshold" for both sparse models. Notably, in the sparse tensor PCA model our results indicate that low-temperature local MCMC methods significantly underperform compared to other studied time-efficient methods, such as the class of low-degree polynomials.
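
To make the object of study concrete, below is a minimal sketch of a low-temperature Metropolis chain over k-sparse supports, run on an order-2 (matrix) spiked analogue of the sparse tensor PCA model. This is an illustration only: the parameters (n, k, snr, beta, n_steps), the rank-one observation model, and the alignment-based energy are assumptions chosen for readability, not the paper's exact parametrization or its proven threshold regime.

```python
import numpy as np

# Illustrative sparse PCA setup: observe Y = (snr/n) * theta theta^T + noise,
# where theta is the indicator of an unknown k-subset of [n].
rng = np.random.default_rng(0)
n, k, snr, beta, n_steps = 200, 10, 8.0, 50.0, 20_000  # assumed values

# Planted signal: indicator vector of a random k-subset.
support = rng.choice(n, size=k, replace=False)
theta = np.zeros(n)
theta[support] = 1.0

# Observation: rank-one spike plus symmetric Gaussian noise.
W = rng.standard_normal((n, n))
W = (W + W.T) / np.sqrt(2)
Y = (snr / n) * np.outer(theta, theta) + W / np.sqrt(n)

def energy(S):
    """Energy = minus the spike alignment v^T Y v; for this Gaussian model
    this is the negative log-posterior up to support-independent constants."""
    v = np.zeros(n)
    v[list(S)] = 1.0
    return -v @ Y @ v

# Local Metropolis chain on k-subsets: propose swapping one coordinate in the
# current support for one outside it; accept with prob min(1, exp(-beta * dE)).
# "Low temperature" corresponds to a large inverse temperature beta.
S = set(rng.choice(n, size=k, replace=False))
E = energy(S)
for _ in range(n_steps):
    out_i = rng.choice(list(S))
    in_j = rng.choice(list(set(range(n)) - S))
    S_new = (S - {out_i}) | {in_j}
    E_new = energy(S_new)
    if np.log(rng.random()) < -beta * (E_new - E):
        S, E = S_new, E_new

overlap = len(S & set(support)) / k
print(f"overlap with planted support: {overlap:.2f}")
```

The swap move makes the chain "local" in the sense relevant to the paper's lower bounds: each step changes the support in at most two coordinates, which is exactly the kind of dynamics the Overlap Gap Property argument rules out below the identified threshold.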
