All Dates/Times are Australian Eastern Standard Time (AEST)

Technical Program

Paper Detail

Paper ID D1-S5-T3.1
Paper Title An Information Theoretic Framework for Distributed Learning Algorithms
Authors Xiangxiang Xu, Shao-Lun Huang, Tsinghua-Berkeley Shenzhen Institute, China
Session D1-S5-T3: Distributed Learning
Chaired Session: Monday, 12 July, 23:20 - 23:40
Engagement Session: Monday, 12 July, 23:40 - 00:00
Abstract Distributed learning has recently become an important research topic, yet the information-theoretic optimality of distributed learning algorithms is often not sufficiently addressed. This paper studies distributed learning problems in which each node observes i.i.d. samples and sends a feature function of the observed samples to a central machine for decision making. Both binary hypothesis testing in information theory and classification problems in machine learning are considered, and the optimal error exponent and the set of optimal features are characterized. By exploiting an information-theoretic framework, we show that these two problems share the same set of optimal features, from which the information-theoretic optimality of some machine learning algorithms can be established. Finally, we generalize our analyses to $M$-ary distributed hypothesis testing and classification problems.
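The setting described in the abstract can be illustrated with a minimal sketch (not the paper's actual construction): under binary hypothesis testing with i.i.d. samples, a node can forward a single scalar feature, the empirical log-likelihood ratio, and the central machine decides from that feature alone; the classical Stein exponent, the KL divergence $D(P_0 \| P_1)$, governs the error decay. The distributions `P0`, `P1` below are illustrative values, not from the paper.

```python
import numpy as np

def kl(p, q):
    """KL divergence D(p || q) between finite distributions (in nats)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Two hypotheses over a 3-symbol alphabet (illustrative, not from the paper)
P0 = np.array([0.5, 0.3, 0.2])
P1 = np.array([0.2, 0.3, 0.5])

# Stein exponent: in centralized binary hypothesis testing, the type-II
# error probability decays like exp(-n * D(P0 || P1)).
exponent = kl(P0, P1)

# A node observing n i.i.d. samples (here drawn under H0) forwards one
# scalar feature: the empirical mean of the per-sample log-likelihood ratio.
rng = np.random.default_rng(0)
n = 200
samples = rng.choice(3, size=n, p=P0)
llr_feature = np.log(P0 / P1)[samples].mean()

# The central machine decides from the forwarded feature alone.
decision = 0 if llr_feature > 0 else 1
print(exponent, decision)
```

In this toy setting the log-likelihood ratio is a sufficient statistic, so nothing is lost by communicating only this one feature; the paper's contribution concerns characterizing the set of optimal features more generally, including for $M$-ary problems.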