Locally supervised learning trains a neural network using a local estimate of the global loss function at each decoupled module of the network. Auxiliary networks are typically appended to the modules to produce gradient updates from greedy local losses. Although this training paradigm is advantageous in terms of parallelism and reduced memory consumption, it severely degrades the generalization performance of neural networks. In this paper, we propose Periodically Guided local Learning (PGL), which periodically reinstates the global objective into local-loss-based training, primarily to enhance the model's generalization capability. We show that a simple periodic guidance scheme yields significant performance gains while retaining a low memory footprint. We conduct extensive experiments on various datasets and networks to demonstrate the effectiveness of PGL, especially in configurations with numerous decoupled modules.
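To make the mechanism concrete, below is a minimal PyTorch sketch of what periodic guidance over decoupled modules might look like. This is an illustration, not the authors' implementation: the module split, the auxiliary heads, the optimizer settings, and the guidance period K are all assumptions introduced here.

```python
import torch
import torch.nn as nn

# Hypothetical network split into decoupled modules; each non-final module
# gets a small auxiliary head that supplies its greedy local loss.
modules = nn.ModuleList([
    nn.Sequential(nn.Linear(784, 256), nn.ReLU()),
    nn.Sequential(nn.Linear(256, 256), nn.ReLU()),
    nn.Linear(256, 10),                      # final module = global output
])
aux_heads = nn.ModuleList([nn.Linear(256, 10), nn.Linear(256, 10)])

criterion = nn.CrossEntropyLoss()
params = list(modules.parameters()) + list(aux_heads.parameters())
opt = torch.optim.SGD(params, lr=0.1)
K = 10  # guidance period (assumed hyperparameter)

def train_step(x, y, step):
    opt.zero_grad()
    if step % K == 0:
        # Periodic guidance: one ordinary end-to-end pass on the global loss.
        h = x
        for m in modules:
            h = m(h)
        criterion(h, y).backward()
    else:
        # Local phase: detach() blocks gradients between modules, so each
        # module is updated only through its own auxiliary loss.
        h = x
        for m, head in zip(modules[:-1], aux_heads):
            h = m(h)
            criterion(head(h), y).backward()
            h = h.detach()
        criterion(modules[-1](h), y).backward()
    opt.step()
```

Under these assumptions, the local phase builds and frees each module's graph independently, so activation memory stays bounded by a single module; the end-to-end pass every K steps is the only point requiring full-backpropagation memory, which is consistent with the abstract's claim of a low memory footprint.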
Author Information
Hasnain Irshad Bhatti (KAIST)
Jaekyun Moon (KAIST)
More from the Same Authors
- 2022: Style Balancing and Test-Time Style Shifting for Domain Generalization »
  Jungwuk Park · Dong-Jun Han · Soyeong Kim · Jaekyun Moon
- 2022 Poster: GenLabel: Mixup Relabeling using Generative Models »
  Jy-yong Sohn · Liang Shang · Hongxu Chen · Jaekyun Moon · Dimitris Papailiopoulos · Kangwook Lee
- 2022 Spotlight: GenLabel: Mixup Relabeling using Generative Models »
  Jy-yong Sohn · Liang Shang · Hongxu Chen · Jaekyun Moon · Dimitris Papailiopoulos · Kangwook Lee
- 2020 Poster: XtarNet: Learning to Extract Task-Adaptive Representation for Incremental Few-Shot Learning »
  Sung Whan Yoon · Do-Yeon Kim · Jun Seo · Jaekyun Moon
- 2019 Poster: TapNet: Neural Network Augmented with Task-Adaptive Projection for Few-Shot Learning »
  Sung Whan Yoon · Jun Seo · Jaekyun Moon
- 2019 Oral: TapNet: Neural Network Augmented with Task-Adaptive Projection for Few-Shot Learning »
  Sung Whan Yoon · Jun Seo · Jaekyun Moon