Going for Speed: Sublinear Algorithms for Dense r-CSPs
Abstract
We give new sublinear and parallel algorithms for the extensively studied problem of approximating n-variable r-CSPs (constraint satisfaction problems with constraints of arity r) up to an additive error. The running time of our algorithms is O(n/\epsilon^2) + 2^{O(1/\epsilon^2)} for Boolean r-CSPs and O(k^4 n/\epsilon^2) + 2^{O(\log k/\epsilon^2)} for r-CSPs with constraints on variables over an alphabet of size k. For any constant k this gives optimal dependence on n in the running time unconditionally, while the exponent in the dependence on 1/\epsilon is polynomially close to the lower bound under the exponential-time hypothesis, which is 2^{\Omega(\epsilon^{-1/2})}. For MaxCut this gives an exponential improvement in the dependence on 1/\epsilon compared to the sublinear algorithms of Goldreich, Goldwasser and Ron (JACM'98) and a linear speedup in n compared to the algorithms of Mathieu and Schudy (SODA'08). For the maximization version of the k-Correlation Clustering problem our running time is O(k^4 n/\epsilon^2) + k^{O(1/\epsilon^2)}, improving the previously best n \cdot k^{O(1/\epsilon^3 \log(k/\epsilon))} by Giotis and Guruswami (SODA'06).
Publication: arXiv e-prints
Pub Date: July 2014
arXiv: arXiv:1407.7887
Bibcode: 2014arXiv1407.7887Y
Keywords: Computer Science - Data Structures and Algorithms