

Poster

PruNeRF: Segment-Centric Dataset Pruning via 3D Spatial Consistency

Yeonsung Jung · Heecheol Yun · Joonhyung Park · Jin-Hwa Kim · Eunho Yang


Abstract:

Neural Radiance Fields (NeRF) is a method for 3D scene modeling that employs fully-connected networks to learn 3D geometric information and synthesize high-quality novel views. However, NeRF struggles to preserve 3D consistency when the training images contain distractors: unexpected objects that appear only in specific views, such as moving entities like pedestrians or birds. A straightforward solution is to exclude distractors during the dataset construction phase. Nevertheless, without prior knowledge of the types and quantities of distractors, excluding them from multi-view images is extremely costly. In this paper, we propose segment-wise dataset pruning via 3D spatial consistency, which can be easily integrated with other NeRF models in a plug-and-play manner. Our motivation stems from the fact that humans identify distractors across multiple images by assessing their consistency in 3D space. First, to accurately quantify anomalies, we introduce influence functions to NeRF for the first time. Then, we evaluate 3D spatial consistency using a geometry-based reprojection technique. Furthermore, to identify distractors more precisely at the segment level rather than the pixel level, we integrate segmentation. Our empirical results on benchmark datasets demonstrate superior robustness against the presence of distractors compared to state-of-the-art methods.
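The geometry-based reprojection idea described above can be illustrated with a minimal sketch. This is not the authors' implementation; it only shows the standard pinhole-camera reprojection check that underlies such consistency tests: a pixel is lifted to 3D using its rendered depth, projected into another view, and a large color discrepancy between the two views marks the pixel as spatially inconsistent (a distractor candidate). All function names and the thresholding scheme here are illustrative assumptions.

```python
import numpy as np

def reproject(depth, K_src, K_dst, T_src2dst, pixel):
    """Lift a source-view pixel to 3D using its depth, then project it
    into the destination view (standard pinhole-camera reprojection).

    depth:     HxW depth map of the source view
    K_src/K_dst: 3x3 camera intrinsics
    T_src2dst: 4x4 rigid transform from source- to destination-camera frame
    pixel:     (u, v) integer pixel coordinates in the source view
    """
    u, v = pixel
    z = depth[v, u]
    # Back-project to a 3D point in source-camera coordinates.
    p_cam = z * np.linalg.inv(K_src) @ np.array([u, v, 1.0])
    # Move the point into destination-camera coordinates.
    p_dst = (T_src2dst @ np.append(p_cam, 1.0))[:3]
    # Project with the destination intrinsics; divide out depth.
    uvw = K_dst @ p_dst
    return uvw[:2] / uvw[2], p_dst[2]

def inconsistency_score(color_src, color_dst):
    """Color disagreement between a pixel and its reprojection in another
    view; a large score suggests the pixel belongs to a distractor."""
    return float(np.linalg.norm(np.asarray(color_src) - np.asarray(color_dst)))
```

In a segment-centric variant, such per-pixel scores would be aggregated over segmentation masks so that an entire segment (e.g. a pedestrian) is pruned at once rather than individual pixels.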
