Poster in Workshop: Structured Probabilistic Inference and Generative Modeling
Non-Normal Diffusion Models
Henry Li
Keywords: Diffusion Models, Generative Modeling
Abstract:
Diffusion models generate samples by incrementally reversing a process that turns data into noise. We show that when the step size goes to zero, the reversed process is invariant to the distribution of these increments. This reveals a previously unconsidered parameter in the design of diffusion models: the distribution of the diffusion step $\boldsymbol \Delta \mathbf{x}_k = \mathbf{x}_k - \mathbf{x}_{k + 1}$. In most diffusion models, this parameter is implicitly set to the normal distribution. By lifting this assumption, we generalize the framework for designing diffusion models and establish an expanded class of diffusion processes with greater flexibility in the choice of loss function used during training. We demonstrate the effectiveness of these models on density estimation and generative modeling tasks on standard image datasets, and show that different choices of the distribution of $\boldsymbol\Delta \mathbf{x}_k$ result in qualitatively different generated samples.
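To make the role of the increment distribution concrete, here is a minimal sketch of a single forward noising step in which the distribution of $\boldsymbol\Delta \mathbf{x}_k$ is an explicit design choice rather than an implicit Gaussian default. The function name, the particular alternative distributions (Laplace, uniform), and the unit-variance scaling are illustrative assumptions for this sketch, not the construction used in the paper.

```python
import torch


def forward_diffusion_step(x, step_size, increment="normal"):
    """One forward noising step x_{k+1} = x_k + delta, where the distribution
    of the increment delta is a design parameter.

    'normal' reproduces the usual Gaussian diffusion; 'laplace' and 'uniform'
    are hypothetical non-normal alternatives, each scaled to unit variance so
    that only the shape of the increment distribution changes.
    """
    scale = step_size ** 0.5  # variance proportional to step size, as for Brownian increments

    if increment == "normal":
        delta = torch.randn_like(x) * scale
    elif increment == "laplace":
        # Laplace(0, b) has variance 2*b^2, so b = 1/sqrt(2) gives unit variance.
        laplace = torch.distributions.Laplace(0.0, 1.0 / 2 ** 0.5)
        delta = laplace.sample(x.shape).to(x) * scale
    elif increment == "uniform":
        # Uniform on [-sqrt(3), sqrt(3)] has unit variance.
        delta = (torch.rand_like(x) * 2.0 - 1.0) * 3 ** 0.5 * scale
    else:
        raise ValueError(f"unknown increment distribution: {increment}")

    return x + delta
```

As the step size shrinks, the invariance result described in the abstract suggests that the reversed process is insensitive to which of these increment distributions drives the forward chain, while finite-step choices can still yield qualitatively different samples.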