Variability as a better characterization of Shannon entropy
Abstract
The Shannon entropy, one of the cornerstones of information theory, is widely used in physics, particularly in statistical mechanics. Yet its characterization and connection to physics remain vague, leaving ample room for misconceptions and misunderstanding. We will show that the Shannon entropy can be fully understood as measuring the variability of the elements within a given distribution: it characterizes how much variation can be found within a collection of objects. We will see that it is the only indicator that is continuous and linear, that it quantifies the number of yes/no questions (i.e. bits) that are needed to identify an element within the distribution, and we will see how applying this concept to statistical mechanics in different ways leads to the Boltzmann, Gibbs and von Neumann entropies.
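As an aside not taken from the paper, the bit-counting interpretation described in the abstract can be illustrated with a minimal sketch: the Shannon entropy in base 2 of a uniform distribution over 8 elements is exactly 3, the number of yes/no questions needed to single out one element.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the expected number of yes/no
    questions needed to identify an element of the distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 8 equally likely elements:
# identifying one element takes 3 yes/no questions (2**3 = 8).
print(shannon_entropy([1/8] * 8))  # 3.0

# A fully concentrated distribution has no variability: 0 bits.
print(shannon_entropy([1.0]))  # 0.0
```

The `p > 0` guard follows the usual convention that terms with zero probability contribute nothing to the sum.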
Publication: European Journal of Physics
Pub Date: July 2021
DOI: 10.1088/1361-6404/abe361
arXiv: 1912.02012
Bibcode: 2021EJPh...42d5102C
Keywords: Shannon entropy; statistical mechanics; information theory; Condensed Matter - Statistical Mechanics; Quantum Physics
E-Print: 9 pages, no figures. Accepted for publication by European Journal of Physics