
SQ Lower Bounds for Learning Bounded Covariance GMMs (2306.13057v1)

Published 22 Jun 2023 in cs.LG, cs.DS, math.ST, stat.ML, and stat.TH

Abstract: We study the complexity of learning mixtures of separated Gaussians with common unknown bounded covariance matrix. Specifically, we focus on learning Gaussian mixture models (GMMs) on $\mathbb{R}^d$ of the form $P = \sum_{i=1}^k w_i \mathcal{N}(\boldsymbol{\mu}_i, \mathbf{\Sigma}_i)$, where $\mathbf{\Sigma}_i = \mathbf{\Sigma} \preceq \mathbf{I}$ and $\min_{i \neq j} \|\boldsymbol{\mu}_i - \boldsymbol{\mu}_j\|_2 \geq k^\epsilon$ for some $\epsilon > 0$. Known learning algorithms for this family of GMMs have complexity $(dk)^{O(1/\epsilon)}$. In this work, we prove that any Statistical Query (SQ) algorithm for this problem requires complexity at least $d^{\Omega(1/\epsilon)}$. In the special case where the separation is on the order of $k^{1/2}$, we additionally obtain fine-grained SQ lower bounds with the correct exponent. Our SQ lower bounds imply similar lower bounds for low-degree polynomial tests. Conceptually, our results provide evidence that known algorithms for this problem are nearly best possible.
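
To make the problem setup concrete, here is a minimal sketch (not from the paper, and not its lower-bound construction) of the GMM family being studied: a mixture on $\mathbb{R}^d$ with a common covariance $\mathbf{\Sigma} \preceq \mathbf{I}$ and pairwise mean separation at least $k^\epsilon$. The mean placement, weights, and covariance choice below are illustrative assumptions.

```python
import numpy as np

def sample_separated_gmm(n, d, k, eps, seed=None):
    """Draw n samples from an illustrative GMM P = sum_i w_i N(mu_i, Sigma)
    with a shared covariance Sigma <= I and min_{i != j} ||mu_i - mu_j||_2 >= k^eps.
    Assumes k <= d so means can sit on scaled standard basis vectors."""
    rng = np.random.default_rng(seed)
    sep = k ** eps  # required pairwise separation k^eps
    # Means sep * e_1, ..., sep * e_k: every pair is sqrt(2)*sep >= sep apart.
    mus = np.zeros((k, d))
    mus[np.arange(k), np.arange(k)] = sep
    # One common covariance for all components, here 0.5 * I (so Sigma <= I).
    sigma = 0.5 * np.eye(d)
    weights = np.full(k, 1.0 / k)  # uniform mixing weights (an assumption)
    comps = rng.choice(k, size=n, p=weights)  # latent component labels
    noise = rng.multivariate_normal(np.zeros(d), sigma, size=n)
    return mus[comps] + noise

samples = sample_separated_gmm(n=1000, d=10, k=5, eps=0.5)
print(samples.shape)  # (1000, 10)
```

Under this parameterization, larger $\epsilon$ means better-separated means and hence an easier learning problem, which matches the $(dk)^{O(1/\epsilon)}$ upper bound and the $d^{\Omega(1/\epsilon)}$ SQ lower bound in the abstract.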
