

Capacity of Random Channels with Large Alphabets


T. Sutter, D. Sutter, J. Lygeros

Advances in Mathematics of Communications, vol. 11, no. 4, pp. 813–835, 2017 (arXiv:1503.04108)

We consider discrete memoryless channels with input alphabet size n and output alphabet size m, where m = ceil(\gamma n) for some constant \gamma > 0. The entries of the channel transition matrix are, before normalisation, independent and identically distributed nonnegative random variables V satisfying E[(V log V)^2] < \infty. We prove that, as n \to \infty, the capacity of such a channel converges to Ent(V) / E[V] almost surely and in L^2, where Ent(V) := E[V log V] - E[V] E[log V] denotes the entropy of V. We further show that the capacity of these random channels converges to this asymptotic value exponentially in n. Finally, we present an application in the context of Bayesian optimal experiment design.

Further Information

% Autogenerated BibTeX entry
@Article { SutSut:2017:IFA_5121,
    author={T. Sutter and D. Sutter and J. Lygeros},
    title={{Capacity of Random Channels with Large Alphabets}},
    journal={Advances in Mathematics of Communications},
    pages={813--835},
    volume={11},
    number={4},
    year={2017}
}