

Poster in Workshop: New Frontiers in Learning, Control, and Dynamical Systems

Tendiffpure: Tensorizing Diffusion Models for Purification

Zhou Derun · Mingyuan Bai · Qibin Zhao


Abstract: Diffusion models are effective purification methods that remove noise or adversarial perturbations with a generative process before a pre-existing classifier performs the classification task. However, the efficiency of diffusion models remains a concern, and existing solutions rely on knowledge distillation, which can jeopardize generation quality because of the small number of generation steps. Hence, we propose Tendiffpure, a tensorized diffusion model that compresses diffusion models for purification. Unlike knowledge distillation methods, we directly compress the U-Nets serving as the backbones of diffusion models using tensor-train decomposition, which reduces the number of parameters and captures more spatial information in multi-dimensional data such as images. The space complexity is reduced from $O(N^2)$ to $O(NR^2)$ with $R \leq 4$. Experimental results show that Tendiffpure generates high-quality purified results more efficiently and outperforms the baseline purification methods on the CIFAR-10, FashionMNIST and MNIST datasets under two types of noise and one adversarial attack.
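To make the $O(N^2) \to O(NR^2)$ claim concrete, the sketch below shows a standard TT-SVD decomposition of a dense weight matrix reshaped into a higher-order tensor. This is a generic illustration of tensor-train compression, not the paper's actual U-Net implementation; the function name `tt_svd`, the $16^5$ reshaping of a $1024 \times 1024$ weight, and the use of NumPy are illustrative assumptions.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Sequential-SVD tensor-train (TT) decomposition.

    Returns cores G_k of shape (r_{k-1}, n_k, r_k); the total parameter count
    is sum_k r_{k-1} * n_k * r_k, i.e. O(N R^2) with TT-rank R, instead of the
    O(N^2) parameters of the original dense matrix.
    """
    shape = tensor.shape
    d = len(shape)
    cores, rank = [], 1
    C = tensor.reshape(shape[0], -1)
    for k in range(d - 1):
        # Unfold the remaining tensor so rows combine the previous rank and the current mode.
        C = C.reshape(rank * shape[k], -1)
        U, S, Vt = np.linalg.svd(C, full_matrices=False)
        r = min(max_rank, S.size)
        cores.append(U[:, :r].reshape(rank, shape[k], r))  # k-th TT core
        C = np.diag(S[:r]) @ Vt[:r]                        # carry the residual factor forward
        rank = r
    cores.append(C.reshape(rank, shape[-1], 1))            # last core
    return cores

# Toy example: a 1024 x 1024 dense weight viewed as a 5-way 16^5 tensor
# and compressed with TT-rank R <= 4, mirroring the O(N^2) -> O(N R^2) reduction.
W = np.random.randn(1024, 1024)
cores = tt_svd(W.reshape(16, 16, 16, 16, 16), max_rank=4)
dense_params = W.size                    # 1,048,576
tt_params = sum(c.size for c in cores)   # 896 with R = 4
print(dense_params, tt_params)
```

With $R = 4$ the five cores hold only 896 parameters versus roughly $10^6$ for the dense matrix; in practice the quality of such low-rank truncation depends on how compressible the learned weights are.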
