Poster
Maximum Mean Discrepancy Test is Aware of Adversarial Attacks
Ruize Gao · Feng Liu · Jingfeng Zhang · Bo Han · Tongliang Liu · Gang Niu · Masashi Sugiyama

Thu Jul 22 09:00 PM -- 11:00 PM (PDT)

The maximum mean discrepancy (MMD) test can, in principle, detect any distributional discrepancy between two datasets. However, it has been shown that the MMD test is unaware of adversarial attacks: it fails to detect the discrepancy between natural data and adversarial data. Given this phenomenon, we raise a question: are natural and adversarial data really from different distributions? The answer is affirmative; the previous use of the MMD test for this purpose missed three key factors, and accordingly we propose three components. Firstly, the Gaussian kernel has limited representation power, and we replace it with an effective deep kernel. Secondly, the test power of the MMD test was neglected, and we maximize it following asymptotic statistics. Finally, adversarial data may be non-independent, and we overcome this issue with the help of the wild bootstrap. By taking care of these three factors, we verify that the MMD test is aware of adversarial attacks, which opens a novel path for adversarial data detection based on two-sample tests.
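For readers unfamiliar with the test itself, the sketch below shows a plain MMD two-sample test with a Gaussian kernel and a permutation-based p-value. This is the baseline setup the abstract says is insufficient, not the authors' proposed method: the paper replaces the Gaussian kernel with a learned deep kernel, selects kernel parameters by maximizing test power, and uses a wild bootstrap rather than a permutation test to handle non-independent adversarial data. All function names and the bandwidth choice here are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # Pairwise squared Euclidean distances, then the RBF kernel.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2_unbiased(X, Y, sigma):
    # Unbiased estimator of the squared MMD: off-diagonal within-sample
    # kernel means minus twice the cross-sample kernel mean.
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    np.fill_diagonal(Kxx, 0.0)
    np.fill_diagonal(Kyy, 0.0)
    return (Kxx.sum() / (m * (m - 1))
            + Kyy.sum() / (n * (n - 1))
            - 2 * Kxy.mean())

def permutation_test(X, Y, sigma, n_perms=500, seed=0):
    # p-value under the null "X and Y come from the same distribution",
    # estimated by randomly re-splitting the pooled sample. Note: this
    # assumes i.i.d. data; the paper argues adversarial data may violate
    # independence, motivating the wild bootstrap instead.
    rng = np.random.default_rng(seed)
    Z = np.vstack([X, Y])
    m = len(X)
    observed = mmd2_unbiased(X, Y, sigma)
    count = 0
    for _ in range(n_perms):
        perm = rng.permutation(len(Z))
        count += mmd2_unbiased(Z[perm[:m]], Z[perm[m:]], sigma) >= observed
    return (count + 1) / (n_perms + 1)

# Toy usage with synthetic stand-ins for natural vs. adversarial data.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(100, 32))  # "natural" sample
Y = rng.normal(0.3, 1.0, size=(100, 32))  # shifted "adversarial" sample
print(f"p-value: {permutation_test(X, Y, sigma=1.0):.3f}")
```

In practice the bandwidth `sigma` is often set by the median heuristic (the median pairwise distance of the pooled sample); a fixed value is used above only to keep the sketch short.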

Author Information

Ruize Gao (Hong Kong Baptist University)
Feng Liu (University of Technology Sydney)
Jingfeng Zhang (RIKEN)
Bo Han (HKBU / RIKEN)
Tongliang Liu (The University of Sydney)
Gang Niu (RIKEN)

Gang Niu is currently an indefinite-term research scientist at the RIKEN Center for Advanced Intelligence Project.

Masashi Sugiyama (RIKEN / The University of Tokyo)
