arxivst stuff from arxiv that you should probably bookmark

Fully Distributed and Asynchronized Stochastic Gradient Descent for Networked Systems

Abstract · Apr 13, 2017 04:58

workers worker server variable synchronization design convex convergence sgd cs-lg cs-ai cs-pf

Arxiv Abstract

  • Ying Zhang

This paper considers a general data-fitting problem over a networked system, in which many computing nodes are connected by an undirected graph. This kind of problem has many real-world applications and has been studied extensively in the literature. However, existing solutions either need a central controller for information sharing or require slot synchronization among different nodes, which increases the difficulty of practical implementation, especially for a very large and heterogeneous system. In contrast, in this paper we treat the data-fitting problem over the network as a stochastic programming problem with many constraints. By adapting the results in a recent paper, we design a fully distributed and asynchronized stochastic gradient descent (SGD) algorithm. We show that our algorithm can achieve global optimality and consensus asymptotically using only local computation and communication. Additionally, we provide a sharp lower bound on the convergence speed in the regular-graph case. This result matches intuition and provides guidance for designing a 'good' network topology to speed up convergence. The merit of our design is also validated by experiments on both synthetic and real-world datasets.
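To make the "local computation and communication" idea concrete, here is a minimal sketch of decentralized consensus-plus-SGD for a shared least-squares fit over a ring graph. This is an illustrative toy simulation (synchronous rounds, a hand-picked step-size schedule, and all names are invented here), not the paper's actual asynchronized algorithm: each node averages its iterate with its neighbors' and then takes a stochastic gradient step on its private data.

```python
import numpy as np

rng = np.random.default_rng(0)

n_nodes, dim = 8, 3
w_true = rng.normal(size=dim)  # ground-truth model all nodes try to fit

# Each node holds private samples drawn from the same linear model.
local_X = [rng.normal(size=(50, dim)) for _ in range(n_nodes)]
local_y = [X @ w_true + 0.01 * rng.normal(size=50) for X in local_X]

# Ring topology: node i talks only to its two neighbors.
neighbors = [((i - 1) % n_nodes, (i + 1) % n_nodes) for i in range(n_nodes)]

w = [np.zeros(dim) for _ in range(n_nodes)]  # per-node iterates

def decentralized_sgd_round(w, step):
    """One round: consensus averaging with neighbors, then a local SGD step."""
    new_w = []
    for i in range(n_nodes):
        left, right = neighbors[i]
        # Consensus step: mix own iterate with neighbors' iterates.
        avg = (w[i] + w[left] + w[right]) / 3.0
        # Local stochastic gradient on one random private sample.
        j = rng.integers(len(local_y[i]))
        x, y = local_X[i][j], local_y[i][j]
        grad = (x @ avg - y) * x
        new_w.append(avg - step * grad)
    return new_w

for t in range(2000):
    w = decentralized_sgd_round(w, step=0.05 / (1 + 0.01 * t))

# Nodes should agree (consensus) and jointly approximate w_true (optimality).
consensus_err = max(np.linalg.norm(w[i] - w[0]) for i in range(n_nodes))
fit_err = np.linalg.norm(np.mean(w, axis=0) - w_true)
```

The abstract's point about topology shows up here too: a denser graph mixes iterates faster than a ring, so the consensus error shrinks in fewer rounds.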

Read the paper (pdf) »