Linear-Complexity Black-Box Randomized Compression of Rank-Structured Matrices
Abstract
A randomized algorithm for computing a compressed representation of a given rank-structured matrix $A \in \mathbb{R}^{N\times N}$ is presented. The algorithm interacts with $A$ only through its action on vectors. Specifically, it draws two tall, thin matrices $\Omega,\,\Psi \in \mathbb{R}^{N\times s}$ from a suitable distribution, and then reconstructs $A$ from the information contained in the set $\{A\Omega,\,\Omega,\,A^{*}\Psi,\,\Psi\}$. For the specific case of a "Hierarchically Block Separable (HBS)" matrix (a.k.a. Hierarchically Semi-Separable matrix) of block rank $k$, the number of samples $s$ required satisfies $s = O(k)$, with $s \approx 3k$ being representative. While a number of randomized algorithms for compressing rank-structured matrices have previously been published, the current algorithm appears to be the first that is both of truly linear complexity (no $N\log(N)$ factors in the complexity bound) and fully "black box" in the sense that no matrix entry evaluation is required. Further, all samples can be extracted in parallel, enabling the algorithm to work in a "streaming" or "single view" mode.
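The sampling stage described in the abstract can be sketched as follows. This is a minimal illustration of the black-box interface only (not the paper's reconstruction algorithm): it assumes access to $A$ solely through products $x \mapsto Ax$ and $x \mapsto A^{*}x$, and uses Gaussian test matrices with the representative sample count $s \approx 3k$; the dense stand-in matrix and the choice of $N$ and $k$ are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, k = 64, 4          # illustrative problem size and assumed block rank
s = 3 * k             # representative sample count per the abstract

# Stand-in operator: in practice A would be a rank-structured matrix
# accessible only via matrix-vector (or matrix-matrix) products.
A = rng.standard_normal((N, N))
matvec = lambda X: A @ X       # x -> A x
rmatvec = lambda X: A.T @ X    # x -> A* x (A is real here, so A* = A^T)

# Draw the two tall, thin Gaussian test matrices.
Omega = rng.standard_normal((N, s))
Psi = rng.standard_normal((N, s))

# Collect the samples; these products can be formed in parallel,
# which is what enables the streaming / single-view mode.
Y = matvec(Omega)      # Y = A Omega
Z = rmatvec(Psi)       # Z = A* Psi

# The compression algorithm then reconstructs a compressed (e.g. HBS)
# representation of A from the set {Y, Omega, Z, Psi} alone.
```

Since no individual entry of $A$ is ever requested, the same code applies unchanged when `matvec`/`rmatvec` wrap a fast hierarchical apply rather than a stored dense matrix.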
 Publication:

arXiv e-prints
 Pub Date:
 May 2022
 DOI:
 10.48550/arXiv.2205.02990
 arXiv:
 arXiv:2205.02990
 Bibcode:
 2022arXiv220502990L
 Keywords:

 Mathematics - Numerical Analysis