Paper ID | D6-S5-T2.2
Paper Title | Quantization of Random Distributions under KL Divergence
Authors | Aviv Adler, Jennifer Tang, Yury Polyanskiy, Massachusetts Institute of Technology, United States
Session | D6-S5-T2: Rate-Distortion Theory II
Chaired Session | Monday, 19 July, 23:20 - 23:40
Engagement Session | Monday, 19 July, 23:40 - 00:00
Abstract | Consider the problem of representing a distribution pi on a large alphabet of size k up to fidelity epsilon in Kullback-Leibler (KL) divergence. Heuristically, arguing as for quadratic loss in high dimension, one expects that about (k/2) log(1/epsilon) bits would be required. We show this intuition is correct by proving explicit non-asymptotic bounds for the minimal average distortion when pi is randomly sampled from a symmetric Dirichlet prior on the simplex. Our method is to reduce the single-sample problem to the traditional setting of iid samples, but for a non-standard rate-distortion question with the novel distortion measure d(x,y) = x log(x/y), which we call divergence distortion. Practically, our results advocate using an x -> x^{2/3} compander (for small x) followed by a uniform scalar quantizer for storing large-alphabet distributions.
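The practical recipe mentioned at the end of the abstract, companding followed by uniform scalar quantization, can be sketched in a few lines. The snippet below is an illustrative approximation only: the function names, the number of quantization levels, and the midpoint reconstruction rule are assumptions for the sketch, not details taken from the paper. It compands each probability with x -> x^{2/3}, applies a uniform scalar quantizer, inverts the compander, renormalizes onto the simplex, and reports the KL divergence between the original and quantized distributions.

```python
import numpy as np

def compander_quantize(pi, levels=256):
    """Hypothetical sketch: x^(2/3) compander + uniform scalar quantizer."""
    companded = pi ** (2.0 / 3.0)                 # compand; expands small probabilities
    step = 1.0 / levels
    quantized = (np.floor(companded / step) + 0.5) * step  # midpoint uniform quantizer
    recovered = quantized ** 1.5                  # invert the compander
    return recovered / recovered.sum()            # project back onto the simplex

def kl_divergence(p, q):
    """KL(p || q) = sum_x p(x) log(p(x)/q(x)); terms with p(x) = 0 contribute 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    k = 10_000                                    # large alphabet
    pi = rng.dirichlet(np.ones(k))                # sample from a symmetric Dirichlet prior
    pi_hat = compander_quantize(pi, levels=256)
    print("KL(pi || pi_hat) =", kl_divergence(pi, pi_hat))
```

With midpoint reconstruction every quantized entry is strictly positive, so the KL divergence is finite; the storage cost in this sketch is roughly k log2(levels) bits before any further entropy coding.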