Bayesian Tensor Decomposition with Diffusion Model Prior
Abstract
Low-rank tensor decomposition (TD) is usually effective on clean, fully observed data, but it often degrades under severe missingness or noise. The low-rank constraint alone provides a weak inductive bias, while common handcrafted priors (e.g., sparsity or smoothness) fail to capture rich real-world structure. To compensate for this weak inductive bias under heavy corruption, one would like to inject a learned, data-driven prior; however, state-of-the-art diffusion models are not readily compatible with existing TD formulations or with tractable posterior inference. To address these challenges, we introduce DiffBCP, a Bayesian CP decomposition framework that combines a cumulative shrinkage process prior for automatic rank selection with an off-the-shelf pre-trained diffusion model serving as an implicit prior on the reconstructed tensor. To make posterior inference tractable despite the coupling among the likelihood, the low-rank constraint, and the diffusion prior, we develop a split Gibbs sampler: the CP factors admit conjugate updates, while the diffusion block is sampled via low-rank-guided denoising. A noise-adaptive coupling schedule further reduces sensitivity to hand-tuned annealing. Experiments on image inpainting and denoising, including high-resolution out-of-distribution images, show consistent gains over Bayesian, nonlinear, and plug-and-play TD baselines.
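To make the alternation described above concrete, the following is a minimal, hypothetical Python sketch of one split-Gibbs iteration for a 3-way tensor. It is not the paper's implementation: the auxiliary variable `z`, the coupling strength `rho`, the `denoiser` interface, and the ridge-style stand-in for the conjugate factor update are all illustrative assumptions.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product: (K, R) and (J, R) -> (K*J, R)."""
    K, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(K * J, R)

def cp_reconstruct(A, B, C):
    """Rebuild an (I, J, K) tensor from rank-R CP factors."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def split_gibbs_step(Y, mask, A, B, C, z, rho, denoiser, rng):
    """One alternation of the two sampling blocks (illustrative sketch).

    Y, mask : observed tensor and its observation mask (True = observed)
    A, B, C : CP factor matrices of shapes (I, R), (J, R), (K, R)
    z       : auxiliary variable carrying the diffusion prior, same shape as Y
    rho     : coupling strength between the CP reconstruction and z
    denoiser: pre-trained denoiser, assumed callable as denoiser(noisy, sigma)
    """
    I, J, K = Y.shape
    R = A.shape[1]

    # Block 1 (CP factors given z): with a Gaussian likelihood and Gaussian
    # coupling, each factor's conditional is Gaussian.  Only the mode-1
    # posterior-mean update is sketched; a full Gibbs step would also add
    # noise drawn from the conditional covariance, and modes 2 and 3 are
    # updated analogously.
    Yfill = np.where(mask, Y, z)                      # missing entries come from z
    KR = khatri_rao(C, B)                             # (K*J, R)
    Y1 = Yfill.transpose(0, 2, 1).reshape(I, K * J)   # mode-1 unfolding
    G = KR.T @ KR + (1.0 / rho**2) * np.eye(R)        # coupling acts like a ridge
    A = np.linalg.solve(G, KR.T @ Y1.T).T

    # Block 2 (diffusion block given the factors): low-rank-guided denoising.
    # The denoiser pulls z toward the learned image manifold, while rho keeps
    # it close to the current low-rank reconstruction.
    x = cp_reconstruct(A, B, C)
    z = denoiser(x + rho * rng.standard_normal(x.shape), sigma=rho)
    return A, B, C, z
```

In the full method, the noise-adaptive coupling schedule described in the abstract would adjust `rho` across iterations rather than keeping it fixed, trading off fidelity to the low-rank reconstruction against the strength of the diffusion prior.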