Poster
in
Workshop: Structured Probabilistic Inference and Generative Modeling

Diffusion Probabilistic Models Generalize when They Fail to Memorize

TaeHo Yoon · Joo Young Choi · Sehyun Kwon · Ernest Ryu

Keywords: [ Diffusion models, Diffusion probabilistic models, Data replication, Memorization, Generalization ]


Abstract:

In this work, we study the training of diffusion probabilistic models through a series of hypotheses and carefully designed experiments. Our key finding, which we call the memorization-generalization dichotomy, asserts that generalization and memorization are mutually exclusive phenomena. This contrasts with the modern wisdom of supervised learning, where deep neural networks exhibit "benign" overfitting and generalize well despite overfitting the data.
