Poster
Are Diffusion Models Vulnerable to Membership Inference Attacks?
Jinhao Duan · Fei Kong · Shiqi Wang · Xiaoshuang Shi · Kaidi Xu

Tue Jul 25 05:00 PM -- 06:30 PM (PDT) @ Exhibit Hall 1 #519

Diffusion-based generative models have shown great potential for image synthesis, but there is a lack of research on the security and privacy risks they may pose. In this paper, we investigate the vulnerability of diffusion models to Membership Inference Attacks (MIAs), a common privacy concern. Our results indicate that existing MIAs designed for GANs or VAEs are largely ineffective on diffusion models, either due to inapplicable scenarios (e.g., requiring the discriminator of GANs) or inappropriate assumptions (e.g., closer distances between synthetic samples and member samples). To address this gap, we propose Step-wise Error Comparing Membership Inference (SecMI), a query-based MIA that infers membership by assessing how well the forward-process posterior is estimated at each timestep. SecMI follows the common overfitting assumption in MIA, namely that member samples normally have smaller estimation errors than hold-out samples. We consider both standard diffusion models, e.g., DDPM, and text-to-image diffusion models, e.g., Latent Diffusion Models and Stable Diffusion. Experimental results demonstrate that our method precisely infers membership with high confidence in both scenarios across multiple datasets. Code is available at https://github.com/jinhaoduan/SecMI.
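To make the step-wise idea concrete, the sketch below scores a candidate sample by the diffusion model's per-timestep estimation error, under the overfitting assumption that member samples tend to score lower. This is a minimal, hypothetical illustration rather than the authors' released implementation (see the linked repository for that); the noise predictor `eps_model`, the schedule tensor `alphas_cumprod`, and the decision threshold are assumed inputs.

```python
import torch


@torch.no_grad()
def stepwise_error(eps_model, x0, t, alphas_cumprod, noise=None):
    """Per-sample estimation error at a single diffusion timestep t.

    eps_model:      assumed DDPM noise predictor with interface eps_model(x_t, t)
    x0:             batch of candidate samples, shape (B, C, H, W)
    t:              integer timestep at which to query the model
    alphas_cumprod: 1-D tensor of cumulative alpha-bar values from the noise schedule
    """
    if noise is None:
        noise = torch.randn_like(x0)  # a fixed noise tensor can be passed for determinism
    a_bar = alphas_cumprod[t]
    # Forward diffusion q(x_t | x_0): noise the sample to timestep t.
    x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * noise
    t_batch = torch.full((x0.size(0),), t, dtype=torch.long, device=x0.device)
    eps_hat = eps_model(x_t, t_batch)  # model's estimate of the injected noise
    # Per-sample squared error; members are expected to score lower than hold-out samples.
    return ((eps_hat - noise) ** 2).flatten(1).mean(dim=1)


def infer_membership(errors, threshold):
    """Threshold decision: smaller error -> predicted member (True)."""
    return errors < threshold
```

In practice such a score would typically be averaged over several timesteps, and the threshold calibrated on samples with known membership, before being applied to unseen candidates.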

Author Information

Jinhao Duan (Drexel University)
Fei Kong (University of Electronic Science and Technology of China)
Shiqi Wang (Columbia University)

I am a second-year Ph.D. student in the Department of Computer Science at Columbia University, advised by Professor Suman Jana. I am currently interested in improving the reliability and robustness of machine learning systems.

Xiaoshuang Shi (University of Electronic Science and Technology of China)
Kaidi Xu (Drexel University)
