Random Quadratic Forms with Dependence: Applications to Restricted Isometry and Beyond
Abstract
Several important families of computational and statistical results in machine learning and randomized algorithms rely on uniform bounds on quadratic forms of random vectors or matrices. Such results include the Johnson-Lindenstrauss (JL) Lemma, the Restricted Isometry Property (RIP), randomized sketching algorithms, and approximate linear algebra. The existing results critically depend on statistical independence, e.g., independent entries for random vectors or independent rows for random matrices, which prevents their use in dependent or adaptive modeling settings. In this paper, we show that such independence is in fact not needed for such results, which continue to hold under fairly general dependence structures. In particular, we present uniform bounds on random quadratic forms of stochastic processes that are conditionally independent and sub-Gaussian given another (latent) process. Our setup allows the stochastic process to depend in a general way on the history of the latent process, and the latent process to be influenced by realizations of the stochastic process. The results are thus applicable to adaptive modeling settings and also allow for sequential design of random vectors and matrices. We also discuss stochastic-process-based forms of JL, RIP, and sketching to illustrate the generality of the results.
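
As a toy illustration of the adaptive setting the abstract describes, the following minimal Python sketch (not from the paper; the latent-state update rule and all parameters are our own hypothetical choices) builds a sketching matrix row by row. Each row is unit-variance sub-Gaussian conditionally on a latent state, the latent state is driven by past realizations of the rows, and the quadratic form ||Ax||^2 is checked for concentration near ||x||^2.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 1000, 200           # ambient dimension, number of sketch rows
x = rng.normal(size=n)
x /= np.linalg.norm(x)     # unit test vector, ||x|| = 1

# Build the sketch row by row. Each row is conditionally sub-Gaussian
# with unit-variance entries given a latent state, and the latent state
# is updated from the realized rows -- a toy version of the adaptive /
# sequential designs covered by the result. Illustrative only.
rows, latent = [], 0.0
for _ in range(m):
    if latent > 0:                             # latent state picks the row law
        row = rng.standard_normal(n)           # Gaussian rows
    else:
        row = rng.choice([-1.0, 1.0], size=n)  # Rademacher rows
    rows.append(row)
    latent = 0.9 * latent + (row @ x)          # history feeds back into latent

A = np.stack(rows) / np.sqrt(m)

# The quadratic form ||Ax||^2 should concentrate near ||x||^2 = 1,
# even though the rows are not independent (only conditionally so).
print(f"||Ax||^2 = {np.linalg.norm(A @ x)**2:.4f}  (target: 1.0)")
```

Since both conditional row laws have zero-mean, unit-variance entries, E[||Ax||^2] = ||x||^2 by the tower property even under the adaptive choice; the paper's contribution is that the uniform concentration familiar from the independent case survives such dependence.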
 Publication:
 arXiv e-prints
 Pub Date:
 October 2019
 DOI:
 10.48550/arXiv.1910.04930
 arXiv:
 arXiv:1910.04930
 Bibcode:
 2019arXiv191004930B
 Keywords:
 Computer Science - Machine Learning;
 Mathematics - Statistics Theory;
 Statistics - Machine Learning