Abstract
We consider a communication system where, in addition to the encoder and the decoder, there is a helper that observes the realization of the noise vector non-causally and provides a (lossy) rate-$R_{\mbox{\tiny h}}$ description of it to the encoder. While Lapidoth and Marti (2020) derived coding theorems associated with achievable channel-coding rates (of the main encoder) for this model, our focus here is on error exponents for continuous-alphabet, additive white Gaussian channels; in the full version of this paper, we also consider finite-alphabet, modulo-additive channels, with both fixed-rate and variable-rate noise descriptions by the helper. Our main finding is that, as long as the channel-coding rate, $R$, is below the helper rate, $R_{\mbox{\tiny h}}$, the achievable error exponent is unlimited (i.e., it can be made arbitrarily large), and in some of the cases, it is even strictly infinite (i.e., the error probability can be made strictly zero). However, in the range of coding rates $(R_{\mbox{\tiny h}},R_{\mbox{\tiny h}}+C_0)$, with $C_0$ being the ordinary channel capacity (without help), the best achievable error exponent is finite and strictly positive, although a certain gap remains between our upper bound (converse bound) and lower bound (achievability) on the highest achievable error exponent. This means that the model of encoder-assisted communication is essentially equivalent to a model where, in addition to the noisy channel between the encoder and the decoder, there is also a parallel noiseless bit-pipe of capacity $R_{\mbox{\tiny h}}$. In the full version of the paper, we also extend the scope to the Gaussian multiple access channel (MAC) and characterize the rate sub-region where the achievable error exponent is unlimited or even infinite.