Matrix multiplication is a fundamental building block in various machine learning algorithms. When the matrices come from a large dataset, the multiplication can be split into multiple tasks that compute products of submatrices on different nodes. As some nodes may be stragglers, coding schemes have been proposed to tolerate stragglers in such distributed matrix multiplication. However, existing coding schemes typically split the matrices along only one or two dimensions, limiting their ability to handle large-scale matrix multiplication. For three-dimensional coding, moreover, no known code construction achieves the optimal number of tasks required for decoding; the best known result is achieved by entangled polynomial (EP) codes. In this paper, we propose dual entangled polynomial (DEP) codes, which require around 25% fewer tasks than EP codes by executing two matrix multiplications in each task. With experiments in a real cloud environment, we show that DEP codes also reduce the decoding overhead and the memory consumption of tasks.
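To illustrate the coded-computation setting the abstract describes, the sketch below implements a classic one-dimensional polynomial code (the simplest member of the family that EP and DEP codes generalize), not the DEP construction itself. All names, the choice of evaluation points, and the 2x2 splitting are illustrative assumptions: each matrix is split into two blocks, five simulated workers each multiply one coded block pair, and the full product is recovered from any four results, tolerating one straggler.

```python
import numpy as np

# Illustrative sketch of a polynomial code (Yu et al. style), NOT the
# DEP construction from the paper: split A by rows and B by columns,
# encode blocks as polynomial evaluations, and decode by interpolation.
rng = np.random.default_rng(0)
A = rng.integers(0, 10, (4, 4)).astype(float)
B = rng.integers(0, 10, (4, 4)).astype(float)

m = n = 2
A_blocks = np.split(A, m, axis=0)   # row blocks A0, A1
B_blocks = np.split(B, n, axis=1)   # column blocks B0, B1

def encode(x):
    """Coded input pair for the worker at evaluation point x."""
    A_tilde = sum(A_blocks[j] * x**j for j in range(m))
    B_tilde = sum(B_blocks[k] * x**(k * m) for k in range(n))
    return A_tilde, B_tilde

# Each worker computes one small product; its result is the degree-
# (m*n - 1) polynomial sum_{j,k} (A_j @ B_k) x^(j + k*m) at its point.
points = [1.0, 2.0, 3.0, 4.0, 5.0]      # 5 workers launched
results = {x: encode(x)[0] @ encode(x)[1] for x in points}

# Suppose the worker at x = 5 straggles: any m*n = 4 results suffice.
fast = [x for x in points if x != 5.0]
V = np.vander(np.array(fast), m * n, increasing=True)
evals = np.stack([results[x] for x in fast])            # (4, 2, 2)
coeffs = np.linalg.solve(V, evals.reshape(4, -1)).reshape(4, 2, 2)

# Coefficient of x^(j + k*m) is A_j @ B_k; assemble the full product.
C = np.block([[coeffs[0], coeffs[2]],
              [coeffs[1], coeffs[3]]])
assert np.allclose(C, A @ B)
```

The recovery threshold here is m*n = 4 tasks; EP codes extend this idea to a third splitting dimension (the inner dimension of the product), and DEP codes lower the resulting threshold further by packing two multiplications into each task.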
Pedro Soto (Florida International University)
Jun Li (Florida International University)
Xiaodi Fan (Florida International University)
2019 Oral: Dual Entangled Polynomial Code: Three-Dimensional Coding for Distributed Matrix Multiplication »
Tue Jun 11th 02:35 -- 02:40 PM, Room 102