One-step Optimal Transport via Regularized Distribution Matching Distillation
Abstract
Unpaired domain translation remains a challenging task due to the need to balance faithfulness and realism. In this paper, we propose a method called Regularized Distribution Matching Distillation (RDMD) that combines the best properties of Optimal Transport (OT) and diffusion-based domain translation methods. Instead of conventional adversarial training, RDMD employs diffusion-based distribution matching, which addresses common shortcomings of OT methods and provides a strong initialization for the trained models. RDMD offers efficient one-step inference, explicitly controls the input-output alignment by regularizing the transport cost, and maintains high faithfulness comparable to OT methods. We prove that, in theory, RDMD approximates the OT map, and we demonstrate its empirical performance on several tasks, including unpaired image-to-image translation in pixel and latent space and unpaired text detoxification. Empirical results show that RDMD achieves a comparable or better faithfulness-realism trade-off than the diffusion and OT baselines.
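The trade-off described above can be sketched as a regularized objective (a schematic only; the symbols $G_\theta$, $\lambda$, $c$, and $\mathcal{L}_{\mathrm{DMD}}$ are illustrative notation, not taken from this abstract):

$$
\min_{\theta} \;\; \mathcal{L}_{\mathrm{DMD}}(G_\theta) \;+\; \lambda \, \mathbb{E}_{x \sim p_{\mathrm{src}}}\!\big[\, c\big(x, G_\theta(x)\big) \,\big],
$$

where the first term is a diffusion-based distribution-matching loss driving the one-step generator's outputs toward the target domain (realism), and the second term penalizes the transport cost between each input and its output, with $\lambda$ controlling the faithfulness-realism balance.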