Talk
Beyond Filters: Compact Feature Map for Portable Deep Model
Yunhe Wang · Chang Xu · Chao Xu · Dacheng Tao

Tue Aug 08 06:24 PM -- 06:42 PM (PDT) @ Darling Harbour Theatre

Convolutional neural networks (CNNs) have shown extraordinary performance in a number of applications, but they are usually heavily over-parameterized in pursuit of accuracy. Beyond compressing the filters in CNNs, this paper focuses on the redundancy in the feature maps derived from the large number of filters in a layer. We propose to extract an intrinsic representation of the feature maps while preserving the discriminability of the features. A circulant matrix is employed to formulate the feature map transformation, which requires only O(d log d) computational complexity to embed a d-dimensional feature map. The filter is then re-configured to establish the mapping from the original input to the new compact feature map, and the resulting network preserves the intrinsic information of the original network with significantly fewer parameters, which not only decreases the memory required to deploy the CNN online but also accelerates computation. Experiments on benchmark image datasets demonstrate the superiority of the proposed algorithm over state-of-the-art methods.
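The O(d log d) cost comes from the standard diagonalization of a circulant matrix by the discrete Fourier transform. Below is a minimal sketch (not the authors' implementation) of how a circulant projection of a d-dimensional feature vector can be computed via the FFT; the function name `circulant_project` and the random parameters are illustrative assumptions.

```python
import numpy as np

def circulant_project(c, x):
    """Multiply the circulant matrix C = circ(c) by a feature vector x.

    A d x d circulant matrix is fully defined by its first column c,
    and C @ x equals the circular convolution of c and x, which the
    FFT evaluates in O(d log d) instead of the O(d^2) dense product.
    """
    # Circulant matrices are diagonalized by the Fourier transform:
    # C = F^{-1} diag(fft(c)) F, hence C @ x = ifft(fft(c) * fft(x)).
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# Usage: embed a d-dimensional feature map with a random circulant transform
# and check the result against the explicit dense matrix product.
d = 8
rng = np.random.default_rng(0)
c = rng.standard_normal(d)   # first column defining the circulant matrix
x = rng.standard_normal(d)   # flattened d-dimensional feature map
dense = np.array([[c[(i - j) % d] for j in range(d)] for i in range(d)])
assert np.allclose(dense @ x, circulant_project(c, x))
```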

Author Information

Yunhe Wang (Peking University)
Chang Xu (The University of Sydney)
Chao Xu (Peking University)
Dacheng Tao
