Poster in Workshop: Over-parameterization: Pitfalls and Opportunities
Understanding the effect of sparsity on neural network robustness
Lukas Timpl · Rahim Entezari · Hanie Sedghi · Behnam Neyshabur · Olga Saukh
Abstract:
This paper examines the impact of static sparsity on the robustness of a trained network to weight perturbations, data corruption, and adversarial examples. We show that, up to a certain sparsity level, networks whose width and depth are increased so that the number of nonzero parameters stays fixed consistently match, and often outperform, their initially dense counterparts. At very high sparsity, robustness and accuracy decline together because the connectivity between network layers becomes too loose. Our findings indicate that the rapid robustness drop under network compression reported in the literature stems from reduced network capacity rather than from sparsity itself.
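The abstract's central control is capacity matching: as sparsity grows, the layer is widened so the nonzero-parameter count equals the dense baseline's. Below is a minimal sketch of that bookkeeping, plus a crude weight-perturbation probe of the kind the abstract mentions. It assumes a single fully connected layer, a random static mask, and Gaussian weight noise; the function names and these choices are illustrative assumptions, not the authors' actual procedure.

# Illustrative sketch only, not the paper's code. Assumes: one dense layer,
# a random static mask, Gaussian weight noise. Names are hypothetical.
import numpy as np

def capacity_matched_width(base_width: int, sparsity: float) -> int:
    # Keep nonzero params fixed: (1 - s) * widened == dense  =>  width / (1 - s)
    return int(round(base_width / (1.0 - sparsity)))

def static_sparse_weights(in_dim: int, width: int, sparsity: float, rng):
    w = rng.standard_normal((in_dim, width)) * np.sqrt(2.0 / in_dim)  # He init
    mask = rng.random((in_dim, width)) >= sparsity  # static mask, fixed thereafter
    return w * mask, mask

def weight_perturbation_gap(w, mask, x, noise_std, rng):
    # Output change under Gaussian noise applied only to the nonzero weights.
    noise = rng.standard_normal(w.shape) * noise_std * mask
    return np.linalg.norm(x @ (w + noise) - x @ w)

rng = np.random.default_rng(0)
base_width, sparsity = 128, 0.75
width = capacity_matched_width(base_width, sparsity)  # 512: 4x wider at 75% sparsity
w, mask = static_sparse_weights(64, width, sparsity, rng)
x = rng.standard_normal((32, 64))
print(width, mask.mean(), weight_perturbation_gap(w, mask, x, 0.05, rng))

With in_dim 64, the dense baseline has 64 * 128 = 8192 weights, and the widened sparse layer keeps roughly 64 * 512 * 0.25 = 8192 nonzero weights, so comparisons across sparsity levels hold capacity fixed rather than confounding it with compression.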