Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning
Abstract
We prove that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. This proves a recent conjecture of Steinhardt, Valiant and Wager and shows that for some learning problems a large storage space is crucial. More formally, in the problem of parity learning, an unknown string $x \in \{0,1\}^n$ is chosen uniformly at random. A learner tries to learn $x$ from a stream of samples $(a_1, b_1), (a_2, b_2), \ldots$, where each~$a_t$ is uniformly distributed over $\{0,1\}^n$ and $b_t$ is the inner product of $a_t$ and $x$, modulo~2. We show that any algorithm for parity learning that uses less than $\frac{n^2}{25}$ bits of memory requires an exponential number of samples. Previously, there was no nontrivial lower bound on the number of samples needed, for any learning problem, even if the allowed memory size is $O(n)$ (where $n$ is the space needed to store one sample). We also give an application of our result in the field of bounded-storage cryptography. We show an encryption scheme that requires a private key of length $n$, as well as time complexity of $n$ per encryption/decryption of each bit, and is provably and unconditionally secure as long as the attacker uses less than $\frac{n^2}{25}$ memory bits and the scheme is used at most an exponential number of times. Previous works on bounded-storage cryptography assumed that the memory size used by the attacker is at most linear in the time needed for encryption/decryption.
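To make the problem setup concrete, here is a minimal sketch (not from the paper) of parity learning together with the classical learner that the lower bound is measured against: Gaussian elimination over GF(2), which stores up to $n$ linearly independent equations, using roughly $n^2$ bits of memory, and recovers $x$ after about $n$ samples. Names, sizes, and representation choices here are illustrative assumptions.

```python
import random

# Illustrative problem size; the paper's bound concerns general n.
n = 16

# The hidden parity string x, chosen uniformly at random.
x = [random.randrange(2) for _ in range(n)]

def sample(x):
    """One stream element: a uniform a in {0,1}^n and b = <a, x> mod 2."""
    a = [random.randrange(2) for _ in range(len(x))]
    b = sum(ai * xi for ai, xi in zip(a, x)) % 2
    return a, b

# Quadratic-memory learner: keep one row per pivot position.
# Each row is stored as an integer bitmask (the equation's coefficients)
# plus its parity bit, so n rows of n+1 bits each ~ n^2 bits of state.
rows = {}  # pivot index (highest set bit) -> (mask, parity)

while len(rows) < n:
    a, b = sample(x)
    mask = 0
    for i, ai in enumerate(a):
        if ai:
            mask |= 1 << i
    # Reduce the new equation against stored rows, highest pivot first,
    # so that every stored pivot bit is cleared exactly once.
    for p in sorted(rows, reverse=True):
        if mask >> p & 1:
            m, pb = rows[p]
            mask ^= m
            b ^= pb
    if mask:  # linearly independent of what we have so far
        rows[mask.bit_length() - 1] = (mask, b)

# Back-substitution: solve pivots from lowest to highest, since a row
# with pivot p only involves coordinates <= p after the reduction above.
recovered = [0] * n
for p in sorted(rows):
    m, b = rows[p]
    v = b
    for i in range(p):
        if m >> i & 1:
            v ^= recovered[i]
    recovered[p] = v

assert recovered == x
```

Because the $a_t$ are uniform, roughly $n + O(1)$ samples suffice to collect $n$ independent equations. The paper's result says this quadratic memory footprint is essentially unavoidable: with fewer than $\frac{n^2}{25}$ bits of memory, no algorithm can do the job without an exponential number of samples.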
 Publication:

arXiv e-prints
 Pub Date:
 February 2016
 DOI:
 10.48550/arXiv.1602.05161
 arXiv:
 arXiv:1602.05161
 Bibcode:
 2016arXiv160205161R
 Keywords:

 Computer Science - Machine Learning;
 Computer Science - Computational Complexity;
 Computer Science - Cryptography and Security