UBC Theses and Dissertations

Communication-efficient algorithms for decentralized multi-task learning — Kuang, Yao

Abstract

Distributed optimization requires nodes to coordinate, yet full synchronization scales poorly. When multiple nodes collaborate through pairwise regularizers, standard methods require, at every iteration, a number of messages proportional to the total number of regularizers. We propose randomized local coordination, in which each node independently samples one regularizer uniformly and coordinates only with the nodes sharing that term. This approach exploits partial separability, where each regularizer depends on only a subset of nodes. For graph-guided regularizers, where each regularizer depends on a pair of nodes, expected communication drops to exactly two messages per iteration regardless of the network topology. By replacing the proximal map of the sum with the proximal map of a single randomly selected regularizer, the method preserves convergence while eliminating global coordination. We show that this method achieves a convergence rate comparable to that of centralized stochastic gradient descent in various settings. Experiments validate both convergence rates and communication efficiency across synthetic and real-world datasets.
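
To make the sampling scheme concrete, the following minimal Python sketch simulates the idea described in the abstract: every node takes a local gradient step, samples one pairwise regularizer uniformly from all of them, and applies the proximal map of only that term together with its partner, keeping its own block of the result. The quadratic local losses, ring topology, step size, and the m-fold scaling of the prox are illustrative assumptions, not details taken from the thesis.

    import numpy as np

    # Illustrative sketch of randomized local coordination (not the thesis implementation).
    # Problem:  min_{x_1..x_n}  sum_i f_i(x_i) + sum_{(i,j) in E} lam * ||x_i - x_j||_2
    # Each iteration, every node samples ONE of the m pairwise regularizers uniformly and
    # exchanges a message only if it appears in the sampled pair, so the expected number of
    # messages per iteration is sum_i deg(i)/m = 2, for any topology.

    rng = np.random.default_rng(0)
    n, d = 6, 5                                                # nodes, local dimension
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]   # ring graph (assumed)
    m = len(edges)
    lam, step = 0.1, 0.05                                      # illustrative penalty and step size

    b = rng.normal(size=(n, d))          # assumed local losses f_i(x) = 0.5 * ||x - b_i||^2
    x = np.zeros((n, d))

    def prox_pair(xi, xj, t):
        """Proximal map of t * ||xi - xj||_2 over the pair: shrink the difference,
        keep the average of the two blocks unchanged."""
        diff, mid = xi - xj, 0.5 * (xi + xj)
        nrm = np.linalg.norm(diff)
        shrink = max(0.0, 1.0 - 2.0 * t / nrm) if nrm > 0 else 0.0
        half = 0.5 * shrink * diff
        return mid + half, mid - half

    for _ in range(3000):
        y = x - step * (x - b)               # local gradient step at every node
        x_new = y.copy()
        picks = rng.integers(m, size=n)      # each node samples one regularizer index
        for i in range(n):
            p, q = edges[picks[i]]
            if i in (p, q):                  # node i participates: one message to its partner
                out_p, out_q = prox_pair(y[p], y[q], m * step * lam)  # prox of the sampled term only
                x_new[i] = out_p if i == p else out_q                 # keep node i's own block
        x = x_new

    # After convergence the pairwise penalty should pull neighbouring iterates together.
    print(max(np.linalg.norm(x[i] - x[j]) for i, j in edges))

In this sketch only the nodes that appear in their own sampled pair exchange an iterate, which matches the expected two messages per iteration noted above; the m * step * lam scaling inside the prox is one common way to keep the sampled update unbiased for the full sum, used here purely for illustration.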

Rights

Attribution-NonCommercial-NoDerivatives 4.0 International