Adaptive Label Smoothing for Classifier-based Mutual Information Neural Estimation
Xu Wang, City University of Hong Kong, Hong Kong SAR of China; Ali Al-Bashabsheh, Beihang University, China; Chao Zhao, Chung Chan, City University of Hong Kong, Hong Kong SAR of China
D3-S1-T3: Neural Estimation
Wednesday, 14 July, 22:00 - 22:20
Estimating mutual information (MI) with neural networks has achieved significant practical success, especially in representation learning. Recent work further reduced the variance of the neural estimate by training a probabilistic classifier. However, the trained classifier tends to be overly confident about some of its predictions, which results in an overestimated MI that fails to capture the desired representation. To soften the classifier, we propose a novel scheme that smooths the labels adaptively according to how extreme the probability estimates are. The resulting MI estimate is unbiased under only mild assumptions on the model. Experimental results on the MNIST and CIFAR10 datasets confirm that our method yields better representations and achieves higher classification test accuracy than existing approaches in self-supervised representation learning.
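To illustrate the classifier-based estimation setting the abstract describes, the following is a minimal sketch, not the paper's method: a logistic classifier is trained to distinguish joint samples (x, y) from shuffled (product-of-marginals) pairs, and the MI is read off from the mean log odds on joint samples. The `adaptive_smooth` rule here (smoothing more when a prediction is extreme) is a hypothetical stand-in for the paper's adaptive scheme, whose exact form is not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x, y):
    # Quadratic features, so a linear classifier can represent the
    # Gaussian log density ratio exactly.
    return np.stack([x, y, x * y, x**2, y**2, np.ones_like(x)], axis=1)

def adaptive_smooth(labels, probs, alpha=0.2):
    # Hypothetical adaptive rule: smooth labels more aggressively when
    # the predicted probability is extreme (near 0 or 1).
    eps = alpha * 2.0 * np.abs(probs - 0.5)
    return labels * (1.0 - eps) + 0.5 * eps

# Correlated Gaussian pair (joint) and a shuffled copy (product of marginals).
n, rho = 20000, 0.8
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
y_perm = rng.permutation(y)

X = np.concatenate([features(x, y), features(x, y_perm)])
t = np.concatenate([np.ones(n), np.zeros(n)])

# Logistic regression by plain gradient descent on the smoothed targets.
w = np.zeros(X.shape[1])
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30, 30)))
    target = adaptive_smooth(t, p)
    w -= 0.1 * X.T @ (p - target) / len(t)

# MI estimate (nats): mean log odds of the classifier on joint samples.
p_joint = 1.0 / (1.0 + np.exp(-np.clip(features(x, y) @ w, -30, 30)))
mi_hat = float(np.mean(np.log(p_joint / (1.0 - p_joint))))
true_mi = -0.5 * np.log(1.0 - rho**2)
print(f"estimated MI ~ {mi_hat:.3f} nats (ground truth {true_mi:.3f})")
```

On this toy Gaussian pair the ground-truth MI is -0.5 log(1 - rho^2) ≈ 0.49 nats; the smoothed classifier deliberately shrinks extreme log odds, trading a little bias for the reduced overconfidence the abstract motivates.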