On Finite-Time Mutual Information
Abstract
The Shannon-Hartley theorem accurately characterizes the channel capacity when the signal observation time is infinite. However, the mutual information attainable within a finite observation time, which remains unknown, is essential for guiding the design of practical communication systems. In this paper, we investigate the mutual information between two correlated Gaussian processes within a finite-time observation window. We first derive the finite-time mutual information as a limit expression, and then numerically compute it within a single finite-time window. We reveal that the number of bits transmitted per second within the finite-time window can exceed the mutual information averaged over the entire time axis, a behavior we call the exceed-average phenomenon. Furthermore, we derive a finite-time mutual information formula for a typical signal autocorrelation case by utilizing the Mercer expansion of trace-class operators, and reveal the connection between the finite-time mutual information problem and operator theory. Finally, we analytically prove the existence of the exceed-average phenomenon in this typical case, and demonstrate its compatibility with the Shannon capacity.
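To make the numerical computation described in the abstract concrete, here is a minimal sketch (ours, not the paper's released code) that discretizes a window [0, T] and evaluates the Gaussian mutual information via a log-determinant. The exponential autocorrelation exp(-|tau|), the noise spectral density n0/2, and the function name finite_time_mi_bits are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def finite_time_mi_bits(T, n=400, sigma2=1.0, n0=0.2):
    """Discretized estimate of the mutual information (in bits) between
    a stationary Gaussian signal X(t) and the noisy observation
    Y(t) = X(t) + N(t) over a finite window [0, T].

    Illustrative assumptions: exponential autocorrelation
    R_X(tau) = sigma2 * exp(-|tau|) and white Gaussian noise with
    two-sided power spectral density n0 / 2.
    """
    t = np.linspace(0.0, T, n)
    dt = t[1] - t[0]
    # Signal covariance matrix sampled from the assumed autocorrelation.
    sigma_x = sigma2 * np.exp(-np.abs(t[:, None] - t[None, :]))
    # Discretized white noise: per-sample variance (n0 / 2) / dt, so the
    # estimate stabilizes as the sampling density n grows.
    noise_var = (n0 / 2.0) / dt
    # For Y = X + N with independent Gaussian noise:
    # I(X; Y) = 0.5 * log det(I + Sigma_x / noise_var).
    # In the continuum limit this tends to 0.5 * sum_k log(1 + 2*lam_k/n0),
    # where lam_k are the Mercer eigenvalues of the autocorrelation kernel.
    _, logdet = np.linalg.slogdet(np.eye(n) + sigma_x / noise_var)
    return 0.5 * logdet / np.log(2.0)

# Bits per second within the window versus window length T; the
# exceed-average phenomenon appears when short windows achieve a higher
# rate than the infinite-window average.
for T in (0.5, 1.0, 2.0, 5.0, 20.0):
    print(f"T = {T:5.1f}: {finite_time_mi_bits(T) / T:6.3f} bits/s")
```

The per-sample noise variance is scaled by 1/dt so that the discretized value approximates the continuous-time mutual information rather than growing without bound with the sampling density; the paper's limit expression handles this continuum passage rigorously.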
- Publication:
- arXiv e-prints
- Pub Date:
- April 2022
- DOI:
- arXiv:2204.11254
- Bibcode:
- 2022arXiv220411254Z
- Keywords:
- Computer Science - Information Theory
- E-Print:
- Accepted by IEEE ISIT 2022. In this paper, we establish an analysis framework for the continuous finite-time mutual information between Gaussian stochastic processes, laying the foundation for continuous analysis in electromagnetic information theory. The simulation codes are provided at: http://oa.ee.tsinghua.edu.cn/dailinglong/publications/publications.html. arXiv admin note: substantial text overlap with arXiv:2111.00444