Tutorial
Sparsity in Deep Learning: Pruning and growth for efficient inference and training
Torsten Hoefler · Dan Alistarh
Abstract:
This tutorial will provide a detailed overview of the work on sparsity in deep learning, covering sparsification techniques for neural networks from both the mathematical and implementation perspectives. We specifically aim to cover the significant recent advances in the area and put them in the context of the foundational work performed on this topic in the 1990s.
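As a taste of the kind of technique the tutorial surveys, below is a minimal sketch of one-shot global magnitude pruning in PyTorch. The two-layer model and the 90% sparsity target are illustrative assumptions, not material from the tutorial itself.

```python
# Minimal sketch of one-shot global magnitude pruning: zero out the
# fraction of weights with the smallest absolute value across all
# weight matrices of the model. Illustrative only.
import torch
import torch.nn as nn

def magnitude_prune(model: nn.Module, sparsity: float) -> None:
    """Zero the `sparsity` fraction of smallest-magnitude weights (global)."""
    # Collect magnitudes of all weight matrices (biases are kept dense).
    weights = torch.cat([p.detach().abs().flatten()
                         for p in model.parameters() if p.dim() > 1])
    k = int(sparsity * weights.numel())
    if k == 0:
        return
    threshold = weights.kthvalue(k).values  # global magnitude cutoff
    with torch.no_grad():
        for p in model.parameters():
            if p.dim() > 1:
                p.mul_((p.abs() > threshold).float())  # apply binary mask

# Hypothetical example model and sparsity level.
model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
magnitude_prune(model, sparsity=0.9)  # keep only the largest 10% of weights
```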
Schedule
Mon 8:00 a.m. - 8:40 a.m. | Part 1: Introduction to Sparsity in Deep Learning (Presentation) | Torsten Hoefler
Mon 8:45 a.m. - 9:30 a.m. | Part 2: Mathematical Background (Presentation) | Dan Alistarh
Mon 9:30 a.m. - 10:00 a.m. | Part 3: Case Study on CNN Sparsification on ImageNet (Presentation) | Dan Alistarh
Mon 10:00 a.m. - 10:40 a.m. | Part 4: Ephemeral Sparsity, Sparse Training, and Conclusions (Presentation) | Torsten Hoefler
Mon 10:45 a.m. - 10:59 a.m. | Extended Q&A (Discussion)