Fixed Error Asymptotics For Erasure and List Decoding
Abstract
We derive the optimum second-order coding rates, known as second-order capacities, for erasure and list decoding. For erasure decoding over discrete memoryless channels, we show that the second-order capacity is $\sqrt{V}\Phi^{-1}(\epsilon_t)$, where $V$ is the channel dispersion and $\epsilon_t$ is the total error probability, i.e., the sum of the erasure and undetected error probabilities. We show numerically that the expected rate at finite blocklength for erasure decoding can exceed the finite-blocklength channel coding rate. We also show that an analogous result holds for lossless source coding with decoder side information, i.e., Slepian-Wolf coding. For list decoding, we consider list codes of deterministic size scaling as $\exp(\sqrt{n}l)$ and show that the second-order capacity is $l+\sqrt{V}\Phi^{-1}(\epsilon)$, where $\epsilon$ is the permissible error probability. We also consider lists of polynomial size $n^\alpha$ and derive bounds on the third-order coding rate in terms of the order of the polynomial $\alpha$. These bounds are tight for symmetric and singular channels. The direct parts of the coding theorems leverage the simple threshold decoder, and the converses are proved using variants of the hypothesis testing converse.
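As a numerical illustration of the second-order expansion described above, the sketch below evaluates the normal approximation $R(n, \epsilon_t) \approx C + \sqrt{V/n}\,\Phi^{-1}(\epsilon_t)$ for a binary symmetric channel. The BSC capacity and dispersion formulas used here are standard results assumed for the example, not taken from this abstract, and the crossover probability $p = 0.11$ is an arbitrary choice.

```python
from math import log2, sqrt
from statistics import NormalDist

def bsc_capacity(p: float) -> float:
    """Capacity of a BSC(p) in bits/channel use: C = 1 - h(p)."""
    h = -p * log2(p) - (1 - p) * log2(1 - p)  # binary entropy
    return 1 - h

def bsc_dispersion(p: float) -> float:
    """Channel dispersion of a BSC(p): V = p(1-p) * log2((1-p)/p)^2."""
    return p * (1 - p) * (log2((1 - p) / p)) ** 2

def second_order_rate(n: int, eps_t: float, p: float = 0.11) -> float:
    """Normal approximation R(n, eps_t) ~ C + sqrt(V/n) * Phi^{-1}(eps_t)."""
    C = bsc_capacity(p)
    V = bsc_dispersion(p)
    return C + sqrt(V / n) * NormalDist().inv_cdf(eps_t)
```

Since $\Phi^{-1}(\epsilon_t) < 0$ for $\epsilon_t < 1/2$, the approximation sits below capacity and approaches it at rate $O(1/\sqrt{n})$ as the blocklength grows.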
 Publication:

arXiv e-prints
 Pub Date:
 February 2014
 arXiv:
 arXiv:1402.4881
 Bibcode:
 2014arXiv1402.4881T
 Keywords:

 Computer Science - Information Theory
 E-Print:
 18 pages, 1 figure