A Sampling Technique of Proving Lower Bounds for Noisy Computations
Abstract
We present a technique for proving lower bounds for noisy computations. This is achieved by a theorem connecting computations on a class of randomized decision trees and sampling-based algorithms. This approach is surprisingly powerful, and applicable to several previously studied models of computation. As a first illustration, we show how all the results of Evans and Pippenger (SIAM J. Computing, 1999) for noisy decision trees, some of which were derived using Fourier analysis, follow immediately if we consider the sampling-based algorithms that naturally arise from these decision trees. Next, we show a tight lower bound of $\Omega(N \log\log N)$ on the number of transmissions required to compute several functions (including the parity function and the majority function) in a network of $N$ randomly placed sensors, communicating using local transmissions, and operating with power near the connectivity threshold. This result considerably simplifies and strengthens an earlier result of Dutta, Kanoria, Manjunath and Radhakrishnan (SODA 08) that such networks cannot compute the parity function reliably with significantly fewer than $N\log \log N$ transmissions. The lower bound for parity shown earlier made use of special properties of the parity function and is inapplicable, e.g., to the majority function. In this paper, we use our approach to develop an interesting connection between computation of boolean functions on noisy networks that make few transmissions, and algorithms that work by sampling only a part of the input. It is straightforward to verify that such sampling-based algorithms cannot compute the majority function.
 Publication:

arXiv e-prints
 Pub Date:
 March 2015
 DOI:
 10.48550/arXiv.1503.00321
 arXiv:
 arXiv:1503.00321
 Bibcode:
 2015arXiv150300321D
 Keywords:

 Computer Science - Computational Complexity;
 Computer Science - Distributed, Parallel, and Cluster Computing;
 C.2.1;
 C.2.2;
 C.2.4;
 D.4.4;
 F.1.1;
 F.2.2;
 G.2.2