Temporal Information Processing on Noisy Quantum Computers
Abstract
The combination of machine learning and quantum computing has emerged as a promising approach for addressing previously intractable problems. Reservoir computing is an efficient learning paradigm that utilizes nonlinear dynamical systems for temporal information processing, i.e., the processing of input sequences to produce output sequences. Here we propose quantum reservoir computing that harnesses complex dissipative quantum dynamics. Our class of quantum reservoirs is universal, in that any nonlinear fading memory map can be approximated arbitrarily closely and uniformly over all inputs by a quantum reservoir from this class. We describe a subclass of the universal class that is readily implementable using quantum gates native to current noisy gate-model quantum computers. Proof-of-principle experiments on remotely accessed cloud-based superconducting quantum computers demonstrate that small and noisy quantum reservoirs can tackle high-order nonlinear temporal tasks. Our theoretical and experimental results pave the way for attractive temporal processing applications of near-term gate-model quantum computers of increasing fidelity but without quantum error correction, signifying the potential of these devices for wider applications, including neural modeling, speech recognition, and natural language processing, going beyond static classification and regression tasks.
- Publication: Physical Review Applied
- Pub Date: August 2020
- DOI: 10.1103/PhysRevApplied.14.024065
- arXiv: arXiv:2001.09498
- Bibcode: 2020PhRvP..14b4065C
- Keywords: Quantum Physics; Electrical Engineering and Systems Science - Systems and Control; Statistics - Machine Learning
- E-Print: 9 pages main text, 14 pages appendices, 13 figures. Added implementation scheme using QND measurements and proposal of more efficient implementation schemes without and with QND measurements. To appear in Physical Review Applied.