MIND: Maximum Mutual Information Based Neural Decoder
Abstract
There is growing interest in the development of learning architectures for digital communication systems. Herein, we consider the detection/decoding problem and aim to develop an optimal neural architecture for this task. Defining the optimality criterion is a fundamental step. We propose to use the mutual information (MI) of the channel input-output signal pair, which leads to the minimization of the a-posteriori information of the transmitted codeword given the channel output observation. Computing the a-posteriori information is a formidable task, and for the majority of channels it is unknown; therefore, it has to be learned. To this end, we propose a novel neural estimator based on a discriminative formulation, which leads to the derivation of the mutual information neural decoder (MIND). The developed neural architecture is capable not only of solving the decoding problem in unknown channels, but also of returning an estimate of the average MI achieved with the coding scheme, as well as the decoding error probability. Several numerical results are reported and compared with maximum a-posteriori and maximum likelihood decoding strategies.
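The abstract describes learning the a-posteriori distribution of the transmitted codeword from channel samples and using it both for decoding and for estimating the achieved MI. The sketch below illustrates this general idea under stated assumptions; it is not the authors' exact discriminative formulation, and all names (PosteriorNet, train_step, the codebook size M, the output dimension Y_DIM) are illustrative.

```python
# Minimal sketch (not the paper's code): a neural network learns an approximate
# posterior q(x|y) from channel input-output samples; the same network yields a
# MAP-style decoder and a lower-bound estimate of I(X;Y) under a uniform prior.
import torch
import torch.nn as nn

M = 16       # assumed codebook size (number of messages)
Y_DIM = 8    # assumed channel-output dimension

class PosteriorNet(nn.Module):
    """Maps a channel output y to a log-distribution over the M codewords."""
    def __init__(self, y_dim=Y_DIM, num_codewords=M, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_codewords),
        )

    def forward(self, y):
        return torch.log_softmax(self.net(y), dim=-1)  # log q(x|y)

def train_step(model, optimizer, x_idx, y):
    """One gradient step on a batch of samples (x_idx, y).
    Minimizing the a-posteriori information -E[log q(x|y)] maximizes a
    lower bound on I(X;Y), since I(X;Y) = H(X) - H(X|Y)."""
    log_post = model(y)                              # [batch, M]
    loss = nn.functional.nll_loss(log_post, x_idx)   # -E[log q(x|y)]
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # MI estimate in nats, assuming a uniform message prior with H(X) = log M
    mi_estimate = torch.log(torch.tensor(float(M))) - loss.detach()
    return loss.item(), mi_estimate.item()

def decode(model, y):
    """Decode by taking the most probable codeword under the learned posterior."""
    return model(y).argmax(dim=-1)
```

Because the posterior is learned directly from samples, no analytical channel model is required; the same learned quantity supplies the MI estimate and, via the probability of the decoded index, an estimate of the decoding error probability.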
- Publication: arXiv e-prints
- Pub Date: May 2022
- DOI: 10.48550/arXiv.2205.07061
- arXiv: arXiv:2205.07061
- Bibcode: 2022arXiv220507061T
- Keywords: Computer Science - Information Theory; Computer Science - Machine Learning
- E-Print: 5 pages, 5 figures. This work has been submitted to the IEEE for possible publication. Revisited Lemma 1 and Sec. IV.B