Individually Conditional Individual Mutual Information Bound on Generalization Error
Ruida Zhou, Chao Tian, Tie Liu, Electrical and Computer Engineering, United States
Session D2-S3-T3: IT Bounds on Generalization Error
Tuesday, 13 July, 22:40 - 23:00
We propose a new information-theoretic bound on generalization error that combines the error decomposition technique of Bu et al. with the conditional mutual information (CMI) construction of Steinke and Zakynthinou. In earlier work, Haghifam et al. proposed a different combination of these two techniques, which we refer to as the conditional individual mutual information (CIMI) bound. In a simple Gaussian setting, however, both the CMI and the CIMI bounds are order-wise worse than the bound of Bu et al. This observation motivated the new bound, which overcomes the issue by reducing the conditioning terms in the conditional mutual information. Along the way, we establish a conditional decoupling lemma, which also leads to a meaningful dichotomy and comparison among these information-theoretic bounds.
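For context, the two ingredients being combined can be sketched in their commonly cited forms (a hedged sketch, not taken from this talk: here $W$ denotes the learned hypothesis, $S = (Z_1, \ldots, Z_n)$ the training sample, $L_\mu$ and $L_S$ the population and empirical risks, $\widetilde{Z}$ the supersample, and $U$ the selection bits of the CMI construction):

```latex
% Bu et al.: individual-sample mutual information bound,
% assuming the loss \ell(w, Z) is \sigma-subgaussian.
\mathbb{E}\!\left[ L_\mu(W) - L_S(W) \right]
  \le \frac{1}{n} \sum_{i=1}^{n} \sqrt{2\sigma^2 \, I(W; Z_i)}

% Steinke--Zakynthinou: conditional mutual information (CMI) bound,
% assuming the loss is bounded in [0, 1]; \widetilde{Z} is the supersample
% (n i.i.d. pairs) and U \in \{0, 1\}^n the selection bits.
\mathbb{E}\!\left[ L_\mu(W) - L_S(W) \right]
  \le \sqrt{\frac{2 \, I(W; U \mid \widetilde{Z})}{n}}
```

The proposed bound, per the abstract, conditions on fewer variables than the CIMI bound, which is how it avoids the order-wise gap to the Bu et al. bound in the Gaussian setting.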