Recently, low displacement rank (LDR) matrices, also known as structured matrices, have been proposed to compress large-scale neural networks. Empirical results have shown that neural networks whose weight matrices are LDR matrices, referred to as LDR neural networks, can achieve significant reductions in space and computational complexity while retaining high accuracy. This paper gives a theoretical study of LDR neural networks. First, we prove the universal approximation property of LDR neural networks under a mild condition on the displacement operators. We then show that LDR neural networks admit approximation error bounds as efficient as those of general unstructured neural networks, in both the single-layer and multi-layer settings. Finally, we propose a back-propagation-based training algorithm for general LDR neural networks.
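To make the low displacement rank property concrete, here is a minimal sketch (not from the paper) in NumPy. It builds a Toeplitz matrix, a classical LDR example, and checks that its Sylvester displacement L(T) = A T - T B has rank at most 2 when A and B are taken to be the f-unit-circulant shift matrices Z_1 and Z_{-1}, a standard choice of displacement operators; the helper names `shift_matrix` and `toeplitz` are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

def shift_matrix(n, f):
    """f-unit-circulant shift Z_f: ones on the subdiagonal, f in the top-right corner."""
    Z = np.zeros((n, n))
    Z[1:, :-1] = np.eye(n - 1)
    Z[0, -1] = f
    return Z

def toeplitz(c, r):
    """n x n Toeplitz matrix with first column c and first row r (r[0] is ignored)."""
    n = len(c)
    T = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            T[i, j] = c[i - j] if i >= j else r[j - i]
    return T

n = 8
rng = np.random.default_rng(0)
T = toeplitz(rng.standard_normal(n), rng.standard_normal(n))

# Sylvester displacement L(T) = Z_1 @ T - T @ Z_{-1}. Entries on matching
# diagonals cancel, leaving nonzeros only in row 0 and the last column,
# so the displacement of any Toeplitz matrix has rank <= 2. T is thus
# described by O(n) parameters instead of n^2; this is the compression
# that LDR neural networks exploit.
disp = shift_matrix(n, 1.0) @ T - T @ shift_matrix(n, -1.0)
print("displacement rank:", np.linalg.matrix_rank(disp))  # expected: 2
```

Other structured families (circulant, Hankel, Vandermonde, Cauchy) arise the same way from different choices of the operator pair (A, B), which is why results stated in terms of the displacement operators cover all of them at once.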
Liang Zhao (The City University of New York)
Zhe Li (Syracuse University)
Jian Tang (Syracuse University)
Bo Yuan (City College of New York, CUNY)
Related Events (a corresponding poster, oral, or spotlight)
2017 Talk: Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank
Mon Aug 7th 04:42 -- 05:00 AM Room C4.8