Balancing Gaussian vectors in high dimension
Abstract
Motivated by problems in controlled experiments, we study the discrepancy of random matrices with continuous entries where the number of columns $n$ is much larger than the number of rows $m$. Our first result shows that if $\omega(1) \leq m \leq o(n)$, a matrix with i.i.d. standard Gaussian entries has discrepancy $\Theta(\sqrt{n} \, 2^{-n/m})$ with high probability. This provides sharp guarantees for Gaussian discrepancy in a regime that had not been considered before in the existing literature. Our results also apply to a more general family of random matrices with continuous i.i.d. entries, assuming that $m \leq O(n/\log{n})$. The proof is nonconstructive and is an application of the second moment method. Our second result is algorithmic and applies to random matrices whose entries are i.i.d. and have a Lipschitz density. We present a randomized polynomial-time algorithm that achieves discrepancy $e^{-\Omega(\log^2(n)/m)}$ with high probability, provided that $m \leq O(\sqrt{\log{n}})$. In the one-dimensional case, this matches the best known algorithmic guarantees due to Karmarkar-Karp. For higher dimensions $2 \leq m \leq O(\sqrt{\log{n}})$, this establishes the first efficient algorithm achieving discrepancy smaller than $O(\sqrt{m})$.
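To make the quantity being bounded concrete: the discrepancy of an $m \times n$ matrix $X$ is $\min_{\varepsilon \in \{-1,+1\}^n} \|X\varepsilon\|_\infty$, i.e., the best possible sup-norm balance of the columns under signings. The following is a minimal brute-force sketch (not the paper's algorithm, which runs in polynomial time); the function name and the tiny Gaussian instance are illustrative choices, and the enumeration is exponential in $n$, so it is only feasible for very small inputs.

```python
import itertools
import numpy as np

def discrepancy(X):
    """Brute-force the discrepancy of X: minimize ||X @ eps||_inf over all
    sign vectors eps in {-1, +1}^n. Exponential in n; illustration only."""
    m, n = X.shape
    best = float("inf")
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        best = min(best, np.abs(X @ np.array(signs)).max())
    return best

# A small i.i.d. standard Gaussian instance, as in the first result.
rng = np.random.default_rng(0)
m, n = 2, 12
X = rng.standard_normal((m, n))
print(discrepancy(X))
```

Even at this scale the computed value is far below the naive $O(\sqrt{m})$ bound, consistent with the $\sqrt{n} \, 2^{-n/m}$ scaling: doubling $n$ at fixed $m$ roughly squares the attenuation factor.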
Publication: arXiv e-prints
Pub Date: October 2019
arXiv: arXiv:1910.13972
Bibcode: 2019arXiv191013972M
Keywords: Computer Science - Discrete Mathematics; Mathematics - Statistics Theory; 68R01; 62F12