

Poster

Do Topological Characteristics Help in Knowledge Distillation?

Jungeun Kim · Junwon You · Dongjin Lee · Ha Young Kim · Jae-Hun Jung

Hall C 4-9 #806
Thu 25 Jul 4:30 a.m. PDT — 6 a.m. PDT

Abstract:

Knowledge distillation (KD) aims to transfer knowledge from larger (teacher) to smaller (student) networks. Previous studies treat point-to-point or pairwise relationships in embedding features as knowledge and struggle to efficiently transfer the relational structure of complex latent spaces. To tackle this issue, we propose a novel KD method called TopKD, which considers the global topology of the latent spaces. We define global topology knowledge using the persistence diagram (PD), which captures comprehensive geometric structures such as the shape of the distribution, multiscale structure, and connectivity, and we introduce a topology distillation loss for teaching this knowledge. To make the PD transferable within reasonable computational time, we employ approximated persistence images of PDs. Through experiments, we demonstrate the benefits of using global topology as knowledge and the potential of TopKD. Code is available at https://github.com/jekim5418/TopKD
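The abstract outlines a pipeline: compute a persistence diagram of each network's embedding, rasterize it into a persistence image, and penalize the gap between the teacher's and student's images. Below is a minimal illustrative sketch of that idea in Python, not the authors' implementation (see the linked repository for that). It assumes the ripser package for diagram computation; persistence_image and topology_distillation_loss are hypothetical helpers, and the Gaussian rasterization and MSE loss are simplified stand-ins for the paper's approximated persistence images and topology distillation loss.

import numpy as np
from ripser import ripser  # assumed dependency for persistence diagrams

def persistence_image(dgm, grid=16, sigma=0.05):
    """Rasterize a persistence diagram into a (grid x grid) image.

    Each point (birth, death) is mapped to (birth, persistence) and
    smoothed with a Gaussian kernel weighted by its persistence.
    """
    dgm = dgm[np.isfinite(dgm).all(axis=1)]        # drop infinite bars
    if len(dgm) == 0:                              # empty diagram -> blank image
        return np.zeros((grid, grid))
    birth = dgm[:, 0]
    pers = dgm[:, 1] - dgm[:, 0]                   # birth-persistence coordinates
    xs = np.linspace(0.0, birth.max() + 1e-6, grid)
    ys = np.linspace(0.0, pers.max() + 1e-6, grid)
    gx, gy = np.meshgrid(xs, ys)
    img = np.zeros((grid, grid))
    for b, p in zip(birth, pers):
        img += p * np.exp(-((gx - b) ** 2 + (gy - p) ** 2) / (2 * sigma ** 2))
    return img / (img.sum() + 1e-12)               # normalize for comparability

def topology_distillation_loss(z_teacher, z_student):
    """MSE between persistence images of teacher and student embeddings.

    z_teacher, z_student: (n, d) arrays of embeddings for the same batch.
    Note: ripser is not differentiable; the paper's approximated
    persistence images exist precisely so gradients can reach the student.
    """
    img_t = persistence_image(ripser(z_teacher, maxdim=1)['dgms'][1])  # teacher H1
    img_s = persistence_image(ripser(z_student, maxdim=1)['dgms'][1])  # student H1
    return float(np.mean((img_t - img_s) ** 2))

# Toy usage with random embeddings of batch size 128:
rng = np.random.default_rng(0)
loss = topology_distillation_loss(rng.normal(size=(128, 64)),
                                  rng.normal(size=(128, 32)))

The sketch only illustrates the quantity being matched; in training, this topology loss would be added to the usual task and KD losses with a weighting coefficient.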
