A Matrix Chernoff Bound for Markov Chains and Its Application to Co-occurrence Matrices
Abstract
We prove a Chernoff-type bound for sums of matrix-valued random variables sampled via a regular (aperiodic and irreducible) finite Markov chain. Specifically, consider a random walk on a regular Markov chain and a Hermitian matrix-valued function on its state space. Our result gives exponentially decreasing bounds on the tail distributions of the extreme eigenvalues of the sample mean matrix. Our proof is based on the matrix expander (regular undirected graph) Chernoff bound [Garg et al. STOC '18] and scalar Chernoff-Hoeffding bounds for Markov chains [Chung et al. STACS '12]. Our matrix Chernoff bound for Markov chains can be applied to analyze the behavior of co-occurrence statistics for sequential data, which have been common and important data signals in machine learning. We show that given a regular Markov chain with $n$ states and mixing time $\tau$, we need a trajectory of length $O(\tau (\log{(n)}+\log{(\tau)})/\epsilon^2)$ to achieve an estimator of the co-occurrence matrix with error bound $\epsilon$. We conduct several experiments, and the experimental results are consistent with the exponentially fast convergence rate from our theoretical analysis. Our result gives the first bound on the convergence rate of the co-occurrence matrix and the first sample complexity analysis in graph representation learning.
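To make the co-occurrence statistic concrete, here is a minimal sketch of estimating a co-occurrence matrix from a single random-walk trajectory: pairs of states appearing within a sliding window are counted, symmetrized, and normalized. The window size and normalization here are illustrative assumptions; the paper's exact estimator may be defined differently.

```python
import numpy as np

def estimate_cooccurrence(trajectory, n_states, window=2):
    """Estimate a co-occurrence matrix from a state trajectory.

    Counts ordered state pairs (s_t, s_{t+r}) for 1 <= r <= window,
    symmetrizes the counts, and normalizes the matrix to sum to 1.
    This is an illustrative sketch, not the paper's exact estimator.
    """
    C = np.zeros((n_states, n_states))
    L = len(trajectory)
    for t in range(L):
        for r in range(1, window + 1):
            if t + r < L:
                i, j = trajectory[t], trajectory[t + r]
                C[i, j] += 1.0
                C[j, i] += 1.0  # symmetrize
    total = C.sum()
    return C / total if total > 0 else C
```

As the trajectory length grows, this sample estimate converges to the stationary co-occurrence matrix of the chain; the paper's result bounds how fast, in spectral norm, via the matrix Chernoff bound.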
 Publication:

arXiv e-prints
 Pub Date:
 August 2020
 arXiv:
 arXiv:2008.02464
 Bibcode:
 2020arXiv200802464Q
 Keywords:

 Statistics - Machine Learning;
 Computer Science - Machine Learning;
 Mathematics - Probability
 E-Print:
 Accepted at NeurIPS'20, 25 pages