Dynamic Sampling and Selective Masking for Communication-Efficient Federated Learning
Shaoxiong Ji*, Wenqi Jiang*, Anwar Walid, Xue Li (* co-first authors)
arXiv preprint, 2020
Federated learning (FL) is a machine learning setting that enables on-device intelligence via decentralized training and federated optimization. The rapid development of deep neural networks has made it possible to model complex problems and, under the federated setting, has given rise to federated deep learning. However, the large number of model parameters places a heavy load on the communication network. This paper introduces two approaches for improving communication efficiency: dynamic sampling and top-k selective masking. The former dynamically controls the fraction of clients selected in each round, while the latter selects the parameters with the top-k largest differences from the global model for federated updating. Experiments on convolutional image classification and recurrent language modeling, conducted on three public datasets, demonstrate the effectiveness of the proposed methods.
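As a rough illustration of the two mechanisms, the sketch below selects the top-k entries of a client's parameter difference and decays the client sampling fraction over rounds. This is a minimal sketch, not the paper's implementation: it assumes flattened NumPy parameter vectors, the names topk_selective_mask and dynamic_sample_fraction are hypothetical, and the linear decay schedule is only one plausible choice of "dynamic sampling".

    import numpy as np

    def topk_selective_mask(local_params, global_params, k):
        # Difference between the client's parameters and the global model.
        diff = local_params - global_params
        # Indices of the k entries with the largest absolute difference.
        idx = np.argpartition(np.abs(diff), -k)[-k:]
        mask = np.zeros_like(diff, dtype=bool)
        mask[idx] = True
        # Only the masked entries are communicated; the rest are zeroed.
        return np.where(mask, diff, 0.0)

    def dynamic_sample_fraction(round_t, total_rounds, c_start=1.0, c_end=0.1):
        # Linearly decay the fraction of clients sampled per round
        # (an assumed schedule, for illustration only).
        return c_start + (c_end - c_start) * round_t / max(1, total_rounds - 1)

    # Toy usage: a 10-parameter model, keeping the top-3 differences.
    rng = np.random.default_rng(0)
    global_w = rng.normal(size=10)
    local_w = global_w + rng.normal(scale=0.1, size=10)
    sparse_update = topk_selective_mask(local_w, global_w, k=3)
    frac = dynamic_sample_fraction(round_t=5, total_rounds=50)

In such a scheme, each client would transmit only the sparse masked difference, and the server would aggregate these sparse updates into the global model, reducing per-round communication roughly in proportion to k.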