Spotlight
Learning Diverse-Structured Networks for Adversarial Robustness
Xuefeng Du · Jingfeng Zhang · Bo Han · Tongliang Liu · Yu Rong · Gang Niu · Junzhou Huang · Masashi Sugiyama

Thu Jul 22 05:25 PM -- 05:30 PM (PDT) @

In adversarial training (AT), the main focus has been on the objective and the optimizer, while the model has been less studied, so the models being used are still the classic ones from standard training (ST). Classic network architectures (NAs) are generally worse than searched NAs in ST, and the same should hold in AT. In this paper, we argue that NA and AT cannot be handled independently: given a dataset, the optimal NA in ST is no longer optimal in AT. That said, AT is itself time-consuming; if we directly search NAs in AT over large search spaces, the computation becomes practically infeasible. Thus, we propose the diverse-structured network (DS-Net) to significantly reduce the size of the search space: instead of low-level operations, we only consider predefined atomic blocks, where an atomic block is a time-tested building block such as the residual block. Since there are only a few atomic blocks, we can weight all atomic blocks rather than find the best one in a searched block of DS-Net, which is an essential tradeoff between exploring diverse structures and exploiting the best structures. Empirical results demonstrate the advantages of DS-Net, i.e., of weighting the atomic blocks.
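The core idea above, weighting a small set of predefined atomic blocks rather than selecting a single one, can be sketched as below. This is a minimal illustration assuming PyTorch; the particular atomic blocks (a residual block and a plain convolutional block) and the softmax-normalized architecture weights are assumptions made for illustration, not the authors' exact DS-Net implementation.

```python
# Minimal sketch (not the authors' code): a searched block that combines
# a few predefined atomic blocks with learnable softmax weights, instead
# of searching over low-level operations or picking a single best block.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualBlock(nn.Module):
    """Time-tested atomic block: two 3x3 convs with a skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)


class ConvBlock(nn.Module):
    """A second illustrative atomic block: a single conv-BN-ReLU."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x):
        return F.relu(self.bn(self.conv(x)))


class WeightedAtomicBlock(nn.Module):
    """Weight all atomic blocks rather than selecting the best one."""
    def __init__(self, channels, atomic_blocks=(ResidualBlock, ConvBlock)):
        super().__init__()
        self.blocks = nn.ModuleList(b(channels) for b in atomic_blocks)
        # One architecture weight per atomic block, trained jointly with
        # the network parameters (here under adversarial training).
        self.alpha = nn.Parameter(torch.zeros(len(self.blocks)))

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * blk(x) for wi, blk in zip(w, self.blocks))


if __name__ == "__main__":
    block = WeightedAtomicBlock(channels=16)
    y = block(torch.randn(2, 16, 32, 32))
    print(y.shape)  # torch.Size([2, 16, 32, 32])
```

Because only a handful of atomic blocks are considered, keeping all of them with learnable weights is cheap, which is the tradeoff the abstract describes between exploring diverse structures and exploiting the best ones.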

Author Information

Xuefeng Du (University of Wisconsin-Madison)
Jingfeng Zhang (RIKEN)
Bo Han (HKBU / RIKEN)
Tongliang Liu (The University of Sydney)
Yu Rong (Tencent AI Lab)
Gang Niu (RIKEN)

Gang Niu is currently an indefinite-term senior research scientist at the RIKEN Center for Advanced Intelligence Project.

Junzhou Huang (University of Texas at Arlington / Tencent AI Lab)
Masashi Sugiyama (RIKEN / The University of Tokyo)
