Scalable quantum detector tomography by high-performance computing
Abstract
At large scales, quantum systems may become advantageous over their classical counterparts at performing certain tasks. Developing tools to analyse these systems at the relevant scales, in a manner consistent with quantum mechanics, is therefore critical to benchmarking performance and characterising their operation. While classical computational approaches cannot perform like-for-like computations of quantum systems beyond a certain scale, classical high-performance computing (HPC) may nevertheless be useful for precisely these characterisation and certification tasks. By developing open-source customised algorithms using high-performance computing, we perform quantum tomography on a mega-scale quantum photonic detector covering a Hilbert space of dimension $10^6$. This requires finding $10^8$ elements of the matrix corresponding to the positive operator-valued measure (POVM), the quantum description of the detector, and is achieved in minutes of computation time. Moreover, by exploiting the structure of the problem, we achieve highly efficient parallel scaling, paving the way for quantum objects up to a system size of $10^{12}$ elements to be reconstructed using this method. In general, this shows that a consistent quantum mechanical description of quantum phenomena is applicable at everyday scales. More concretely, this enables the reconstruction of large-scale quantum sources, processes and detectors used in computation and sampling tasks, which may be necessary to prove their non-classical character or quantum computational advantage.
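To give a sense of the underlying computation, detector tomography for a phase-insensitive photonic detector reduces to a linear inversion: the probability of an outcome under a coherent-state probe is a known linear function of the diagonal POVM elements, which can then be recovered by least squares. The following is a minimal illustrative sketch in NumPy, not the paper's HPC implementation; the lossy click/no-click detector model, the efficiency value, and the probe amplitudes are all assumptions chosen for illustration, and the toy problem is tiny compared with the $10^6$-dimensional reconstruction reported in the abstract.

```python
import numpy as np
from math import factorial

# Toy phase-insensitive detector tomography (illustrative assumption):
# probe a click/no-click detector with coherent states |alpha> and invert
# the linear relation  p(alpha) = sum_n P(n | alpha) * theta_n,
# where theta_n are the diagonal POVM elements of the "click" outcome.

N = 15                                         # photon-number cutoff (Hilbert-space truncation)
eta = 0.6                                      # assumed detector efficiency
theta_true = 1 - (1 - eta) ** np.arange(N)     # lossy click detector, no dark counts

alphas = np.linspace(0.1, 3.0, 40)             # probe amplitudes (assumed grid)
ns = np.arange(N)

# Poissonian photon-number distributions of the coherent probes:
# F[i, n] = exp(-|alpha_i|^2) |alpha_i|^(2n) / n!
F = np.exp(-alphas[:, None] ** 2) * alphas[:, None] ** (2 * ns) \
    / np.array([factorial(int(n)) for n in ns], dtype=float)

p = F @ theta_true                             # noiseless outcome statistics

# Linear inversion by least squares; a real reconstruction would add
# regularisation and physicality constraints, solved in parallel at scale.
theta_est, *_ = np.linalg.lstsq(F, p, rcond=None)
theta_est = np.clip(theta_est, 0.0, 1.0)       # enforce 0 <= theta_n <= 1
```

In this noiseless toy problem the recovered `theta_est` matches `theta_true` to numerical precision; the paper's contribution is making this kind of reconstruction tractable when the POVM matrix has $10^8$ (and prospectively $10^{12}$) elements.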
 Publication:

arXiv e-prints
 Pub Date:
 April 2024
 DOI:
 10.48550/arXiv.2404.02844
 arXiv:
 arXiv:2404.02844
 Bibcode:
 2024arXiv240402844S
 Keywords:

 Quantum Physics;
 Computer Science - Distributed, Parallel, and Cluster Computing
 E-Print:
 31 pages, 8 figures