An improved quantum-inspired algorithm for linear regression
Abstract
We give a classical algorithm for linear regression analogous to the quantum matrix inversion algorithm [Harrow, Hassidim, and Lloyd, Physical Review Letters'09] for low-rank matrices [Wossnig et al., Physical Review Letters'18], when the input matrix $A$ is stored in a data structure applicable for QRAM-based state preparation. Namely, given an $A \in \mathbb{C}^{m\times n}$ with minimum singular value $\sigma$ and which supports certain efficient $\ell_2$-norm importance sampling queries, along with a $b \in \mathbb{C}^m$, we can output a description of an $x \in \mathbb{C}^n$ such that $\|x - A^+b\| \leq \varepsilon\|A^+b\|$ in $\tilde{\mathcal{O}}\Big(\frac{\|A\|_{\mathrm{F}}^6\|A\|^2}{\sigma^8\varepsilon^4}\Big)$ time, improving on previous "quantum-inspired" algorithms in this line of research by a factor of $\frac{\|A\|^{14}}{\sigma^{14}\varepsilon^2}$ [Chia et al., STOC'20]. The algorithm is stochastic gradient descent, and the analysis bears similarities to those of optimization algorithms for regression in the usual setting [Gupta and Sidford, NeurIPS'18]. Unlike earlier works, this is a promising avenue that could lead to feasible implementations of classical regression in a quantum-inspired setting, for comparison against future quantum computers.
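The abstract's key ingredients are stochastic gradient descent and $\ell_2$-norm (squared-row-norm) importance sampling. As a rough illustration of that sampling idea only, not of the paper's actual algorithm or its data structure, here is a randomized Kaczmarz iteration for a consistent system $Ax = b$, which can be viewed as SGD with rows drawn proportionally to their squared norms; the function name and parameters are my own invention.

```python
import numpy as np

def kaczmarz_regression(A, b, iters=4000, seed=0):
    """Randomized Kaczmarz for a consistent system Ax = b.

    Rows are sampled with probability ||A_i||^2 / ||A||_F^2, the same
    l2-norm importance distribution the quantum-inspired setting assumes
    is available. Illustrative sketch only, not the paper's algorithm.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Squared row norms and the induced sampling distribution.
    row_norms_sq = np.einsum('ij,ij->i', A.conj(), A).real
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n, dtype=A.dtype)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Project x onto the hyperplane {y : A_i y = b_i}.
        x = x + (b[i] - A[i] @ x) / row_norms_sq[i] * A[i].conj()
    return x
```

For a well-conditioned consistent system this converges linearly in expectation, at a rate governed by $\sigma^2 / \|A\|_{\mathrm{F}}^2$, which is the same kind of quantity ($\|A\|_{\mathrm{F}}$ and $\sigma$) that appears in the runtime bound above.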
 Publication:

arXiv e-prints
 Pub Date:
 September 2020
 arXiv:
 arXiv:2009.07268
 Bibcode:
 2020arXiv200907268G
 Keywords:

 Computer Science - Data Structures and Algorithms;
 Quantum Physics
 E-Print:
 16 pages, bug fixed