Soft BIBD and product gradient codes
Sakorikar, Animesh
Abstract
Due to recent increases in the size of available training data, a variety of machine learning tasks are distributed across multiple computing nodes. However, the theoretical speedup from distributing computations may not be achieved in practice due to slow or unresponsive computing nodes, known as stragglers. Gradient coding is a coding-theoretic framework that provides robustness against stragglers in distributed machine learning applications. Recently, Kadhe et al. proposed a gradient code based on a combinatorial design, called a balanced incomplete block design (BIBD), which is shown to outperform many existing gradient codes in worst-case straggling scenarios [1]. However, the parameters for which such BIBD constructions exist are very limited [2]. In this thesis, we aim to overcome these limitations and construct gradient codes that exist for a wide range of system parameters while retaining the superior performance of BIBD gradient codes. Two such constructions are proposed: one based on a probabilistic construction that relaxes the stringent BIBD gradient code constraints, and the other based on taking the Kronecker product of existing gradient codes. The proposed gradient codes allow flexible choices of system parameters while retaining comparable error performance.
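The Kronecker-product construction mentioned in the abstract can be sketched with small matrices. In a gradient code, a 0/1 task-assignment matrix has one row per worker and one column per data partition, with a 1 where that worker computes that partition's gradient. The specific matrices below are illustrative placeholders, not taken from the thesis; the point is only that the Kronecker product of two assignment matrices yields a code whose worker and partition counts are the products of the originals.

```python
import numpy as np

# Hypothetical task-assignment matrices for two small gradient codes.
# Rows index workers, columns index data partitions; a 1 means the
# worker is assigned the gradient computation for that partition.
A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])   # 3 workers, 3 partitions
B = np.array([[1, 0],
              [1, 1]])      # 2 workers, 2 partitions

# The product code's assignment matrix is the Kronecker product:
# each 1 in A is replaced by a copy of B, each 0 by a zero block.
C = np.kron(A, B)

print(C.shape)  # (6, 6): 3*2 workers by 3*2 partitions
```

Note that the number of assignments multiplies as well: `C` contains `A.sum() * B.sum()` ones, so the per-worker computation load of the product code is the product of the component codes' loads.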
Item Metadata

Title | Soft BIBD and product gradient codes
Creator | Sakorikar, Animesh
Publisher | University of British Columbia
Date Issued | 2022
Language | eng
Date Available | 2022-05-18
Provider | Vancouver : University of British Columbia Library
Rights | Attribution-NonCommercial-NoDerivatives 4.0 International
DOI | 10.14288/1.0413643
Degree Grantor | University of British Columbia
Graduation Date | 2022-11
Scholarly Level | Graduate
Aggregated Source Repository | DSpace