Abstract
We study an information analogue of infinitely divisible probability distributions, where the sum is replaced by the joint distribution of an i.i.d. sequence. A random variable $X$ is called informationally infinitely divisible if, for any $n\ge1$, there exists an i.i.d. sequence $Z_{1},\ldots,Z_{n}$ containing the same information as $X$, i.e., there exists an injective function $f$ such that $X=f(Z_{1},\ldots,Z_{n})$. While no discrete random variable is informationally infinitely divisible, we show that any discrete random variable $X$ can be divided into arbitrarily many identical pieces with a multiplicative penalty on the entropy: if we remove the injectivity requirement on $f$, then there exist i.i.d. $Z_{1},\ldots,Z_{n}$ and a function $f$ satisfying $X=f(Z_{1},\ldots,Z_{n})$ and $H(X)/n \le H(Z_{1}) \le 1.59\,H(X)/n + 2.43$. Applications include independent component analysis, distributed storage with a secrecy constraint, and distributed random number generation.
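As a minimal illustration of the definition (an added example for a fixed $n$, not a claim from the paper itself): let $X$ be uniform over $\{0,1\}^{n}$ and let $Z_{1},\ldots,Z_{n}$ be i.i.d. fair bits. The concatenation map $f$ is injective and
\[
X=f(Z_{1},\ldots,Z_{n})=(Z_{1},\ldots,Z_{n}),\qquad H(Z_{1})=1=\frac{H(X)}{n},
\]
so the lower bound $H(X)/n\le H(Z_{1})$ holds with equality in this special case; the result above states that a comparable division, with the stated multiplicative and additive penalty, is possible for every discrete $X$ once injectivity of $f$ is dropped.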