This work presents Dynamic Normalization (DN), which learns arbitrary normalization operations for different convolutional layers in a deep ConvNet. Unlike existing normalization approaches, which predefine the computation of the statistics (mean and variance), DN learns to estimate them. DN has several appealing benefits. First, it adapts to various networks, tasks, and batch sizes. Second, it can be easily implemented and trained end-to-end in a differentiable manner with only a small number of parameters. Third, its matrix formulation represents a wide range of normalization methods, shedding light on their theoretical analysis. Extensive studies show that DN outperforms its counterparts on CIFAR10 and ImageNet.
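Since the abstract only states that DN learns the statistics rather than fixing them, a minimal PyTorch sketch of that general idea may help. The sketch below follows the recipe of Switchable Normalization (a related method from the same group): it learns a softmax-weighted combination of batch-, instance-, and layer-wise estimators. It is not DN's exact matrix formulation, and the class name DynamicNorm2d and all details are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DynamicNorm2d(nn.Module):
    """Hypothetical sketch: a layer that learns its normalization statistics
    as a softmax-weighted mixture of batch-, instance-, and layer-wise
    estimators, instead of fixing one of them in advance."""

    def __init__(self, num_channels: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        # per-channel affine transform, as in BN/IN/LN
        self.weight = nn.Parameter(torch.ones(num_channels))
        self.bias = nn.Parameter(torch.zeros(num_channels))
        # learnable logits over the {batch, instance, layer} estimators
        self.mean_logits = nn.Parameter(torch.zeros(3))
        self.var_logits = nn.Parameter(torch.zeros(3))

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (N, C, H, W)
        # candidate statistics; each broadcasts against x
        mu_bn = x.mean(dim=(0, 2, 3), keepdim=True)   # batch-norm style
        var_bn = x.var(dim=(0, 2, 3), keepdim=True, unbiased=False)
        mu_in = x.mean(dim=(2, 3), keepdim=True)      # instance-norm style
        var_in = x.var(dim=(2, 3), keepdim=True, unbiased=False)
        mu_ln = x.mean(dim=(1, 2, 3), keepdim=True)   # layer-norm style
        var_ln = x.var(dim=(1, 2, 3), keepdim=True, unbiased=False)

        # the estimator itself is learned end-to-end via these soft weights
        wm = torch.softmax(self.mean_logits, dim=0)
        wv = torch.softmax(self.var_logits, dim=0)
        mu = wm[0] * mu_bn + wm[1] * mu_in + wm[2] * mu_ln
        var = wv[0] * var_bn + wv[1] * var_in + wv[2] * var_ln

        x_hat = (x - mu) / torch.sqrt(var + self.eps)
        return x_hat * self.weight.view(1, -1, 1, 1) + self.bias.view(1, -1, 1, 1)


# usage: a drop-in replacement for nn.BatchNorm2d(64)
dn = DynamicNorm2d(64)
y = dn(torch.randn(8, 64, 32, 32))
```

Because the mixture weights are ordinary parameters, each layer can settle on a different effective normalizer during training, which matches the abstract's claim of adapting to various networks, tasks, and batch sizes.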
Author Information
Ping Luo (The University of Hong Kong)
Peng Zhanglin (SenseTime)
Shao Wenqi (CUHK)
Zhang Ruimao (CUHK)
Ren Jiamin (SenseTime)
Wu Lingyun (SenseTime)
Related Events (a corresponding poster, oral, or spotlight)
-
2019 Poster: Differentiable Dynamic Normalization for Learning Deep Representation »
Thu Jun 13th 01:30 -- 04:00 AM, Room: Pacific Ballroom
More from the Same Authors
-
2020 Poster: Channel Equilibrium Networks for Learning Deep Representation »
Wenqi Shao · Shitao Tang · Xingang Pan · Ping Tan · Xiaogang Wang · Ping Luo
-
2017 Poster: Learning Deep Architectures via Generalized Whitened Neural Networks »
Ping Luo
-
2017 Talk: Learning Deep Architectures via Generalized Whitened Neural Networks »
Ping Luo