Quantum Probability as an Application of Data Compression Principles
Abstract
Realist, no-collapse interpretations of quantum mechanics, such as Everett's, face the probability problem: how to justify the norm-squared (Born) rule from the wavefunction alone. While any basis-independent measure can only be norm-squared (due to the Gleason–Busch Theorem), this fact conflicts with various popular, non-wavefunction-based phenomenological measures — such as observer, outcome or world counting — that are frequently demanded of Everettians. These alternatives conflict, however, with the wavefunction realism upon which Everett's approach rests, which seems to call for an objective, basis-independent measure based only on wavefunction amplitudes. The ability of quantum probabilities to destructively interfere with each other, however, makes it difficult to see how probabilities can be derived solely from amplitudes in an intuitively appealing way. I argue that the use of algorithmic probability can solve this problem, since the objective, single-case probability measure that wavefunction realism demands is exactly what algorithmic information theory was designed to provide. The result is an intuitive account of complex-valued amplitudes, as coefficients in an optimal lossy data compression, such that changes in algorithmic information content (entropy deltas) are associated with phenomenal transitions.
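The destructive interference mentioned above can be made concrete with a toy numerical sketch (illustrative only, not from the paper): when two paths lead to the same outcome, the Born rule squares the *sum* of their complex amplitudes, so naive per-path "probabilities" need not add.

```python
# Two paths to the same outcome, carrying equal-magnitude complex
# amplitudes with opposite phase (hypothetical example values).
a = complex(1 / 2**0.5, 0)   # amplitude along path 1
b = -a                        # amplitude along path 2, phase-flipped

# Naively adding per-path norm-squared "probabilities" gives 1.0 ...
p_naive = abs(a)**2 + abs(b)**2

# ... but the Born rule applies norm-squared to the summed amplitude,
# and the opposite phases cancel: destructive interference gives 0.0.
p_born = abs(a + b)**2

print(p_naive, p_born)  # 1.0 0.0
```

This is why a measure built directly from amplitudes cannot treat them as probabilities of independent alternatives, which is the difficulty the abstract's appeal to algorithmic probability is meant to address.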
Publication: arXiv e-prints
Pub Date: June 2016
DOI: 10.48550/arXiv.1606.06802
arXiv: arXiv:1606.06802
Bibcode: 2016arXiv160606802R
Keywords: Computer Science - Information Theory; Quantum Physics
E-Print: In Proceedings PC 2016, arXiv:1606.06513