Testing Closeness of Multivariate Distributions via Ramsey Theory (2311.13154v1)
Abstract: We investigate the statistical task of closeness (or equivalence) testing for multidimensional distributions. Specifically, given sample access to two unknown distributions $\mathbf p, \mathbf q$ on $\mathbb R^d$, we want to distinguish between the case that $\mathbf p=\mathbf q$ versus $\|\mathbf p-\mathbf q\|_{A_k} > \epsilon$, where $\|\mathbf p-\mathbf q\|_{A_k}$ denotes the generalized $A_k$ distance between $\mathbf p$ and $\mathbf q$ -- measuring the maximum discrepancy between the distributions over any collection of $k$ disjoint, axis-aligned rectangles. Our main result is the first closeness tester for this problem with {\em sub-learning} sample complexity in any fixed dimension and a nearly-matching sample complexity lower bound. In more detail, we provide a computationally efficient closeness tester with sample complexity $O\left((k^{6/7}/ \mathrm{poly}_d(\epsilon)) \log^d(k)\right)$. On the lower bound side, we establish a qualitatively matching sample complexity lower bound of $\Omega(k^{6/7}/\mathrm{poly}(\epsilon))$, even for $d=2$. These sample complexity bounds are surprising because the sample complexity of the problem in the univariate setting is $\Theta(k^{4/5}/\mathrm{poly}(\epsilon))$. This has the interesting consequence that the jump from one to two dimensions leads to a substantial increase in sample complexity, while increases beyond that do not. As a corollary of our general $A_k$ tester, we obtain $d_{\mathrm{TV}}$-closeness testers for pairs of $k$-histograms on $\mathbb R^d$ over a common unknown partition, and pairs of uniform distributions supported on the union of $k$ unknown disjoint axis-aligned rectangles. Both our algorithm and our lower bound make essential use of tools from Ramsey theory.
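To make the abstract's description of the $A_k$ distance concrete, one plausible formalization (a sketch following the standard univariate $\mathcal A_k$ distance; the paper's precise definition may differ, e.g., in whether the discrepancy is taken over the union or summed per rectangle) is
$$\|\mathbf p-\mathbf q\|_{A_k} \;=\; \sup_{R_1,\dots,R_k}\, \Bigl|\, \mathbf p\Bigl(\textstyle\bigcup_{i=1}^k R_i\Bigr) - \mathbf q\Bigl(\textstyle\bigcup_{i=1}^k R_i\Bigr) \Bigr|,$$
where the supremum ranges over all collections of $k$ pairwise disjoint, axis-aligned rectangles $R_1,\dots,R_k \subseteq \mathbb R^d$. Under this reading, for distributions whose difference is structured along at most $k$ rectangles (such as two $k$-histograms over a common rectangular partition), the $A_k$ distance coincides with the total variation distance up to the choice of $k$, which is why an $A_k$ tester yields the $d_{\mathrm{TV}}$-closeness corollaries stated above.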