Quantum machine learning beyond kernel methods
Abstract
Machine learning algorithms based on parametrized quantum circuits are a prime candidate for near-term applications on noisy quantum computers. Yet, our understanding of how these quantum machine learning models compare, both with each other and with classical models, remains limited. Previous works achieved important steps in this direction by showing a close connection between some of these quantum models and kernel methods, well-studied in classical machine learning. In this work, we identify the first unifying framework that captures all standard models based on parametrized quantum circuits: that of linear quantum models. In particular, we show how data re-uploading circuits, a generalization of linear models, can be efficiently mapped into equivalent linear quantum models. Going further, we also consider the experimentally relevant resource requirements of these models in terms of qubit number and data-sample efficiency, i.e., the amount of data needed to learn. We establish learning separations demonstrating that linear quantum models must utilize exponentially more qubits than data re-uploading models in order to solve certain learning tasks, while kernel methods additionally require exponentially many more data points. Our results constitute significant strides towards a more comprehensive theory of quantum machine learning models, as well as providing guidelines on which models may be better suited from experimental perspectives.
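To make the distinction concrete, here is a minimal numpy sketch of a single-qubit data re-uploading circuit, in the spirit of the models discussed above (this is an illustrative toy, not the paper's construction): trainable rotations RY(θ) are interleaved with data-encoding rotations RZ(x), and the expectation of Pauli-X is measured. With a single encoding layer the model is a linear model in a fixed feature map of x (here a single frequency, sin(θ)·cos(x)); re-uploading x across several layers gives access to higher frequencies.

```python
import numpy as np

# Pauli-X observable and single-qubit rotation gates
X = np.array([[0, 1], [1, 0]], dtype=complex)

def rz(a):
    """Rotation about the Z axis by angle a."""
    return np.array([[np.exp(-1j * a / 2), 0],
                     [0, np.exp(1j * a / 2)]])

def ry(a):
    """Rotation about the Y axis by angle a."""
    c, s = np.cos(a / 2), np.sin(a / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def reupload_model(x, thetas):
    """Data re-uploading model: alternate trainable gates RY(theta)
    with encoding gates RZ(x), then measure <X> on |0>."""
    state = np.array([1, 0], dtype=complex)  # start in |0>
    for theta in thetas:
        state = ry(theta) @ state  # trainable layer
        state = rz(x) @ state      # re-encode the data point x
    return float(np.real(state.conj() @ X @ state))

# One encoding layer (a linear model in this feature map) vs. three
# re-uploads: the latter expresses higher Fourier frequencies in x.
xs = np.linspace(0, 2 * np.pi, 50)
f1 = [reupload_model(x, [0.3]) for x in xs]
f3 = [reupload_model(x, [0.3, 1.1, -0.7]) for x in xs]
```

With one layer, `reupload_model(x, [theta])` evaluates exactly to sin(θ)·cos(x), a single-frequency (linear) model; the three-layer variant is a richer trigonometric series in x, illustrating why mapping re-uploading circuits back to linear models can require enlarging the feature space (and hence, per the separations above, the qubit count).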
 Publication:

arXiv e-prints
 Pub Date:
 October 2021
 arXiv:
 arXiv:2110.13162
 Bibcode:
 2021arXiv211013162J
 Keywords:

 Quantum Physics;
 Computer Science - Artificial Intelligence;
 Computer Science - Machine Learning;
 Statistics - Machine Learning
 E-Print:
 16 pages, 13 figures