Distributed Learning of Neural Networks with One Round of Communication (2019)

by Mike Izbicki and Christian R. Shelton

Abstract: The optimal weighted average (OWA) is an algorithm for distributed learning of linear models. It achieves statistically optimal theoretical guarantees with only a single round of communication. This paper introduces the non-linear OWA (NOWA) algorithm, which extends the linear OWA into the non-linear setting of neural networks. Due to the difficulty of proving theoretical results in this more complex setting, NOWA loses the theoretical guarantees of the OWA algorithm. Nevertheless, we show that NOWA works well empirically. We follow an evaluation procedure introduced by McMahan et al. for federated learning and show significantly improved results on a simple MNIST baseline task.
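To make the one-round idea concrete, below is a minimal NumPy sketch of the general approach in the linear setting: each machine fits a local model and communicates only its weight vector, and the server learns a weighted combination of the local models using a small server-side sample. This is an illustration under assumptions made for the example (synthetic data, ridge regularization, shard sizes, and the server-side sample are all hypothetical), not the paper's exact OWA or NOWA procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data split across "machines" (illustrative only).
d, n_per_machine, n_machines = 20, 500, 8
w_true = rng.normal(size=d)

def make_shard(n):
    X = rng.normal(size=(n, d))
    y = X @ w_true + 0.1 * rng.normal(size=n)
    return X, y

# Round 1: each machine fits a local ridge-regression model and sends only
# its d-dimensional weight vector to the server (one round of communication).
def local_fit(X, y, lam=1e-3):
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

shards = [make_shard(n_per_machine) for _ in range(n_machines)]
W = np.stack([local_fit(X, y) for X, y in shards])   # shape (n_machines, d)

# Server step: rather than a plain average, learn a weighting of the local
# models on a small server-side sample, i.e. solve a low-dimensional
# regression over the span of the local solutions.
X_s, y_s = make_shard(100)                # small sample held by the server
Z = X_s @ W.T                             # each local model's predictions
alpha = np.linalg.solve(Z.T @ Z + 1e-3 * np.eye(n_machines), Z.T @ y_s)
w_weighted = W.T @ alpha                  # weighted combination of local models

w_avg = W.mean(axis=0)                    # naive one-shot average, for comparison
X_test, y_test = make_shard(2000)
for name, w in [("naive average", w_avg), ("weighted combination", w_weighted)]:
    mse = np.mean((X_test @ w - y_test) ** 2)
    print(f"{name:>22s}: test MSE = {mse:.4f}")
```

Under these assumptions, the weighted combination typically matches or beats the naive average because the server step can down-weight poorly fit local models; the non-linear extension in the paper applies this kind of combination step to neural network weights.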

Download Information

Mike Izbicki and Christian R. Shelton (2019). "Distributed Learning of Neural Networks with One Round of Communication." 2nd International Workshop on Decentralized Machine Learning at the Edge (DMLE'19). pdf

BibTeX citation

@inproceedings{IzbShe19b,
   author = "Mike Izbicki and Christian R. Shelton",
   title = "Distributed Learning of Neural Networks with One Round of Communication",
   booktitle = "2nd International Workshop on Decentralized Machine Learning at the Edge (DMLE'19)",
   booktitleabbr = "DMLE",
   year = 2019,
}
