Oral
Coded Sparse Matrix Multiplication
Sinong Wang · Jiashang Liu · Ness Shroff

Wed Jul 11th 11:30 -- 11:40 AM @ A9

In a large-scale, distributed matrix multiplication problem $C=A^{\intercal}B$, where $C\in\mathbb{R}^{r\times t}$, coded computation plays an important role in effectively dealing with ``stragglers'' (distributed computations that may be delayed by a few slow or faulty processors). However, existing coded schemes can destroy the significant sparsity that exists in large-scale machine learning problems, and can result in much higher computation overhead, i.e., $O(rt)$ decoding time. In this paper, we develop a new coded computation strategy, which we call the \emph{sparse code}, that achieves a near-\emph{optimal recovery threshold}, \emph{low computation overhead}, and \emph{linear decoding time} $O(nnz(C))$. We implement our scheme and demonstrate its advantage over both uncoded strategies and the fastest existing coded strategies.
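To make the coded-computation setting concrete, the following is a minimal NumPy sketch of a generic straggler-tolerant scheme using random linear-combination encoding (MDS-like with high probability). This illustrates the general idea the abstract builds on — it is *not* the paper's sparse code, and all dimensions, variable names, and the choice of encoding matrix are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
s, r, t = 8, 6, 3   # A: s x r, B: s x t, C = A^T B: r x t (illustrative sizes)
k, n = 3, 5         # k data blocks, n workers; tolerates n - k stragglers

A = rng.standard_normal((s, r))
B = rng.standard_normal((s, t))

# Partition A into k column blocks, so C stacks C_j = A_j^T B row-wise.
A_blocks = np.split(A, k, axis=1)            # each block is s x (r/k)

# Encode: worker i receives a random linear combination of the blocks.
# The encoding matrix G is full rank with high probability, so the
# results from ANY k of the n workers suffice for decoding.
G = rng.standard_normal((n, k))
worker_inputs = [sum(G[i, j] * A_blocks[j] for j in range(k))
                 for i in range(n)]

# Each worker computes its coded partial product Z_i = (coded A_i)^T B,
# so Z_i = sum_j G[i, j] * C_j.
Z = [W.T @ B for W in worker_inputs]

# Suppose workers 1 and 3 straggle; decode from any k = 3 finished workers
# by solving the k x k linear system given by the corresponding rows of G.
done = [0, 2, 4]
Z_stack = np.stack([Z[i] for i in done]).reshape(k, -1)   # k x (r/k * t)
C_blocks = np.linalg.solve(G[done], Z_stack).reshape(k, r // k, t)
C_rec = np.vstack(list(C_blocks))

assert np.allclose(C_rec, A.T @ B)   # recovered despite 2 stragglers
```

Note the trade-off the abstract targets: dense random encoding like this destroys any sparsity in $A$, and decoding touches all $rt$ entries of $C$; the paper's sparse code is designed to avoid both costs.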

Author Information

Sinong Wang (The Ohio State University)
Jiashang Liu (The Ohio State University)
Ness Shroff (The Ohio State University)
