Going for Speed: Sublinear Algorithms for Dense r-CSPs (1407.7887v1)
Abstract: We give new sublinear and parallel algorithms for the extensively studied problem of approximating $n$-variable $r$-CSPs (constraint satisfaction problems with constraints of arity $r$) up to an additive error. The running time of our algorithms is $O(n/\epsilon^2) + 2^{O(1/\epsilon^2)}$ for Boolean $r$-CSPs and $O(k^4 n/\epsilon^2) + 2^{O(\log k/\epsilon^2)}$ for $r$-CSPs with constraints on variables over an alphabet of size $k$. For any constant $k$ this gives optimal dependence on $n$ in the running time unconditionally, while the exponent in the dependence on $1/\epsilon$ is polynomially close to the lower bound under the Exponential Time Hypothesis, which is $2^{\Omega(\epsilon^{-1/2})}$. For Max-Cut this gives an exponential improvement in the dependence on $1/\epsilon$ compared to the sublinear algorithms of Goldreich, Goldwasser and Ron (JACM'98) and a linear speedup in $n$ compared to the algorithms of Mathieu and Schudy (SODA'08). For the maximization version of the $k$-Correlation Clustering problem our running time is $O(k^4 n/\epsilon^2) + k^{O(1/\epsilon^2)}$, improving the previously best $n \cdot k^{O(1/\epsilon^3 \cdot \log(k/\epsilon))}$ of Giotis and Guruswami (SODA'06).
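To make the shape of these running-time bounds concrete, here is a minimal illustrative sketch of the generic sample-and-brute-force paradigm for dense Max-Cut in the spirit of the Goldreich-Goldwasser-Ron line of work that the abstract compares against; it is not the improved algorithm of this paper. The function name `estimate_max_cut_density`, the adjacency callback `adj`, and the sample-size constant are assumptions introduced only for illustration.

```python
import itertools
import random


def estimate_max_cut_density(adj, n, eps, rng=random):
    """Estimate the Max-Cut value of a dense n-vertex graph by sampling.

    Classic sample-and-brute-force paradigm: draw a small random vertex
    sample, solve Max-Cut exactly on the induced subgraph, and rescale.
    `adj(u, v)` should return True iff {u, v} is an edge.  The sample
    size below is illustrative, not a constant taken from any paper.
    """
    s = min(n, max(2, round(1.0 / eps ** 2)))   # sample size ~ poly(1/eps)
    sample = rng.sample(range(n), s)

    best = 0
    # Exhaustive search over all bipartitions of the sample: this is the
    # 2^{poly(1/eps)} term that dominates once the sample has been read.
    for mask in range(1 << (s - 1)):            # fix sample[-1] on side 0
        side = [(mask >> i) & 1 for i in range(s - 1)] + [0]
        cut = sum(
            1
            for i, j in itertools.combinations(range(s), 2)
            if side[i] != side[j] and adj(sample[i], sample[j])
        )
        best = max(best, cut)

    # Rescale the sampled cut value from the sample to the full graph.
    return best * (n * (n - 1)) / (s * (s - 1))


# Toy usage on a random dense graph (eps kept large so 2^{1/eps^2} stays tiny).
if __name__ == "__main__":
    n = 30
    edges = {(i, j) for i in range(n) for j in range(i + 1, n)
             if random.random() < 0.6}
    is_edge = lambda u, v: (min(u, v), max(u, v)) in edges
    print(estimate_max_cut_density(is_edge, n, eps=0.5))
```

The two terms in the abstract's bounds mirror the two phases of this sketch: a near-linear pass that touches the input (here, reading the sampled induced subgraph) plus a term exponential in $\mathrm{poly}(1/\epsilon)$ for exhaustive search over the constant-size sample.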