UBC Theses and Dissertations

Soft BIBD and product gradient codes

Sakorikar, Animesh

Abstract

Due to recent increases in the size of available training data, a variety of machine learning tasks are distributed across multiple computing nodes. However, the theoretical speedup from distributing computations may not be achieved in practice due to slow or unresponsive computing nodes, known as stragglers. Gradient coding is a coding-theoretic framework that provides robustness against stragglers in distributed machine learning applications. Recently, Kadhe et al. proposed a gradient code based on a combinatorial design called the balanced incomplete block design (BIBD), which is shown to outperform many existing gradient codes in worst-case straggling scenarios [1]. However, the parameters for which such BIBD constructions exist are very limited [2]. In this thesis, we aim to overcome these limitations and construct gradient codes that exist for a wide range of system parameters while retaining the superior performance of BIBD gradient codes. Two such constructions are proposed: one probabilistic, relaxing the stringent BIBD gradient code constraints, and the other based on taking the Kronecker product of existing gradient codes. The proposed gradient codes allow flexible choices of system parameters while retaining comparable error performance.
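To make the two ideas concrete, the following is a minimal Python sketch under the common gradient-coding model in which a code is a binary worker-to-partition assignment matrix. The function names, the base code, and the probability parameter p are illustrative assumptions, not the thesis's actual definitions.

import numpy as np

# Illustrative sketch only: the thesis's exact constructions are not
# reproduced here. A gradient code is modeled as a binary assignment
# matrix B of shape (n, k), where B[i, j] = 1 means worker i computes
# the partial gradient for data partition j.

def random_assignment(n: int, k: int, p: float, seed: int = 0) -> np.ndarray:
    """Probabilistic construction (hypothetical 'soft' variant): each
    worker is assigned each partition independently with probability p,
    relaxing the exact regularity constraints of a BIBD."""
    rng = np.random.default_rng(seed)
    return (rng.random((n, k)) < p).astype(int)

def kronecker_code(B1: np.ndarray, B2: np.ndarray) -> np.ndarray:
    """Product construction: the Kronecker product of two assignment
    matrices yields a code with n1*n2 workers and k1*k2 partitions."""
    return np.kron(B1, B2)

# A small cyclic base code: each partition is replicated on two workers.
B = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])

print(kronecker_code(B, B).shape)        # (9, 9): parameters multiply
print(kronecker_code(B, B).sum(axis=0))  # replication multiplies: 2 * 2 = 4
print(random_assignment(6, 6, p=0.5))    # irregular but flexible assignment

The sketch shows why both routes widen the parameter range: the random construction accepts any (n, k) pair directly, while the Kronecker product multiplies the worker and partition counts of two existing codes, reaching dimensions for which no single BIBD may exist.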


Rights

Attribution-NonCommercial-NoDerivatives 4.0 International