PCGS: Deblurring 3D Gaussian Splatting with Patch Comparison
Abstract
Recent neural rendering methods, such as 3D Gaussian Splatting (3DGS), have achieved state-of-the-art rendering quality and speed. However, these methods often struggle in regions covered by overlapping Gaussians, producing blur and artifacts in the rendered images. We observe that the widely used view-space positional gradients are insufficient to handle such cases. To address this, we introduce PCGS, a Patch Comparison Gaussian Splatting method that adaptively controls the densification of the corresponding Gaussians. Specifically, PCGS divides the rendered image into patches and identifies those with significant errors by computing a patch-wise loss between the rendered and ground-truth images; additional densification is then applied to the Gaussians in these error-prone regions. Furthermore, to prevent over-densification and redundant Gaussians, we design a Gaussian control strategy that regulates the densification process: we set a Gaussian count budget that changes dynamically with the progress of densification, and at each densification step we sample the required Gaussians according to their importance scores. Our method yields significantly fewer artifacts and less blur while keeping the number of Gaussians approximately equal to that of 3DGS. Extensive experiments on multiple standard benchmarks demonstrate the superiority of our approach.
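As a rough illustration of the patch-comparison step described above, the following minimal sketch splits the rendered and ground-truth images into non-overlapping patches, computes a per-patch L1 error, and flags high-error patches whose Gaussians would receive extra densification. This is not the authors' implementation; the patch size, threshold, and function name are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def flag_high_error_patches(rendered, gt, patch_size=32, threshold=0.05):
    """Return a boolean mask over the patch grid marking error-prone patches.

    rendered, gt: (C, H, W) tensors in [0, 1]. For simplicity this sketch
    assumes H and W are divisible by patch_size. Both patch_size and
    threshold are hypothetical values, not taken from the paper.
    """
    # Per-pixel L1 error, averaged over color channels: (1, H, W).
    error = (rendered - gt).abs().mean(dim=0, keepdim=True)
    # Average the error within each non-overlapping patch_size x patch_size
    # patch (avg_pool2d with stride == kernel size): (1, 1, H/p, W/p).
    patch_error = F.avg_pool2d(error.unsqueeze(0), patch_size)
    # Patches whose mean error exceeds the threshold are flagged: (H/p, W/p).
    return patch_error.squeeze() > threshold

# Usage (conceptual): Gaussians whose projected centers fall inside flagged
# patches would be scheduled for additional clone/split densification,
# subject to the budget and importance-score sampling described above.
```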