Information entropy as an anthropomorphic concept
Abstract
According to E.T. Jaynes and E.P. Wigner, entropy is an anthropomorphic concept in the sense that to a single physical system there correspond many thermodynamic systems. The physical system can be examined from many points of view, each time considering different variables and calculating the entropy differently. In this paper we discuss how this concept may be applied to information entropy, i.e., how Shannon's definition of entropy can fit Jaynes's and Wigner's statement. This is achieved by generalizing Shannon's notion of information entropy, which is the main contribution of the paper. We then discuss how entropy, under these considerations, may be used for comparing password complexity and as a measure of diversity useful in analyzing the behavior of genetic algorithms.
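As a concrete illustration of this point of view (not taken from the paper itself), the minimal sketch below computes Shannon entropy over the empirical character frequencies of a password; choosing a different set of variables for the same string (e.g., bigrams or keyboard positions) would yield a different entropy value, which mirrors the "many thermodynamic systems per physical system" idea. The function names and the character-frequency model are assumptions made here purely for illustration.

```python
import math
from collections import Counter

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def password_character_entropy(password):
    """Entropy of the empirical character-frequency distribution of a password.

    Each character is treated as an outcome of a random variable; a different
    choice of variables over the same string would give a different entropy,
    so the value depends on the chosen description of the system.
    """
    counts = Counter(password)
    total = len(password)
    return shannon_entropy(c / total for c in counts.values())

if __name__ == "__main__":
    for pw in ["aaaaaaaa", "password", "Tr0ub4dor&3"]:
        print(pw, round(password_character_entropy(pw), 3), "bits per character")
```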
- Publication: arXiv e-prints
- Pub Date: March 2015
- DOI: 10.48550/arXiv.1503.01967
- arXiv: arXiv:1503.01967
- Bibcode: 2015arXiv150301967R
- Keywords: Computer Science - Information Theory; Computer Science - Artificial Intelligence
- E-Print: Improvements in mathematical definitions