

Poster

Nearest Neighbour Score Estimators for Diffusion Generative Models

Matthew Niedoba · Dylan Green · Saeid Naderiparizi · Vasileios Lioutas · Jonathan Lavington · Xiaoxuan Liang · Yunpeng Liu · Ke Zhang · Setareh Dabiri · Adam Scibior · Berend Zwartsenberg · Frank Wood

Hall C 4-9 #417
[ Paper PDF ]
Poster session: Wed 24 Jul 4:30 a.m. — 6:00 a.m. PDT

Abstract:

Score function estimation is the cornerstone of both training and sampling from diffusion generative models. Despite this fact, the most commonly used estimators are either biased neural network approximations or high-variance Monte Carlo estimators based on the conditional score. We introduce a novel nearest neighbour score function estimator that uses multiple samples from the training set to dramatically decrease estimator variance. We leverage our low-variance estimator in two compelling applications. Training consistency models with our estimator, we report a significant increase in both convergence speed and sample quality. In diffusion models, we show that our estimator can replace a learned network for probability-flow ODE integration, opening promising new avenues of future research. Code will be released upon paper acceptance.
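To make the idea concrete, here is a minimal sketch of a nearest-neighbour score estimate, under the standard assumption of a variance-preserving forward process x_t = α_t·x_0 + σ_t·ε. The marginal score ∇log p_t(x_t) is a posterior-weighted average of the conditional scores (α_t·x_0 − x_t)/σ_t², and the sketch approximates that posterior with a softmax over the k nearest training points. The function name, signature, and the simple top-k truncation are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def nn_score_estimate(x_t, train_data, alpha_t, sigma_t, k=64):
    """Illustrative nearest-neighbour score estimate (not the paper's
    exact estimator) for x_t = alpha_t * x_0 + sigma_t * eps.

    Approximates E[(alpha_t * x_0 - x_t) / sigma_t**2 | x_t] by a
    softmax-weighted average over the k nearest training points.
    """
    # Residuals between x_t and each scaled training point.
    diffs = alpha_t * train_data - x_t            # shape (N, D)
    sq_dists = np.sum(diffs**2, axis=1)           # shape (N,)
    # Keep only the k nearest neighbours (cheap posterior truncation).
    idx = np.argpartition(sq_dists, k)[:k]
    # Posterior weights: softmax of Gaussian log-likelihoods,
    # stabilised by subtracting the max before exponentiating.
    logw = -sq_dists[idx] / (2.0 * sigma_t**2)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Weighted average of conditional scores.
    return (w[:, None] * diffs[idx]).sum(axis=0) / sigma_t**2
```

Averaging over many training points is what drives the variance down relative to a single-sample conditional-score estimate; restricting to the nearest neighbours keeps the cost manageable, since distant points receive negligible posterior weight.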
