Spotlight
Active Nearest Neighbor Regression Through Delaunay Refinement
Alexander Kravberg · Giovanni Luca Marchetti · Vladislav Polianskii · Anastasiia Varava · Florian T. Pokorny · Danica Kragic

Thu Jul 21 01:45 PM -- 01:50 PM (PDT) @ Room 318 - 320

We introduce an algorithm for active function approximation based on nearest neighbor regression. Our Active Nearest Neighbor Regressor (ANNR) relies on the Voronoi-Delaunay framework from computational geometry to subdivide the space into cells with constant estimated function value and select novel query points in a way that takes the geometry of the function graph into account. We consider the recent state-of-the-art active function approximator called DEFER, which is based on incremental rectangular partitioning of the space, as the main baseline. The ANNR addresses a number of limitations that arise from the space subdivision strategy used in DEFER. We provide a computationally efficient implementation of our method, as well as theoretical halting guarantees. Empirical results show that ANNR outperforms the baseline for both closed-form functions and real-world examples, such as gravitational wave parameter inference and exploration of the latent space of a generative model.
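
As a rough illustration of the idea (not the authors' implementation), the sketch below triangulates the queried points with SciPy's Delaunay routine, lifts each simplex onto the function graph (x, f(x)), and queries the barycenter of the simplex whose lifted volume is largest, so that both large empty regions and regions of rapid function variation attract new queries. ANNR itself uses the Voronoi-Delaunay framework in a more refined way; the target function, bounds, and parameter names here are assumptions made purely for illustration.

```python
# Minimal sketch of graph-aware active sampling in the spirit of ANNR.
# NOT the authors' algorithm: simplices are scored by the volume of their
# lift onto the function graph and refined at their barycenters.

from math import factorial

import numpy as np
from scipy.spatial import Delaunay


def lifted_simplex_volume(xs, ys):
    """Volume of the simplex spanned by the graph points (x_i, f(x_i))."""
    verts = np.hstack([xs, ys[:, None]])   # lift to graph space R^{d+1}
    edges = verts[1:] - verts[0]           # edge vectors from the first vertex
    gram = edges @ edges.T                 # Gram matrix of the edge vectors
    d = edges.shape[0]
    return np.sqrt(max(np.linalg.det(gram), 0.0)) / factorial(d)


def active_nn_regression(target, bounds, n_init=10, n_queries=50, seed=0):
    """Actively query `target` on a box domain (dimension >= 2 assumed)."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    # Initial design: the box corners (so the triangulation covers the
    # whole domain) plus a few uniform random points.
    corners = np.array(np.meshgrid(*bounds)).T.reshape(-1, dim)
    X = np.vstack([corners, rng.uniform(*zip(*bounds), size=(n_init, dim))])
    y = np.array([target(x) for x in X])

    for _ in range(n_queries):
        tri = Delaunay(X)
        # Large lifted volume = large region and/or rapidly varying function.
        scores = [lifted_simplex_volume(X[s], y[s]) for s in tri.simplices]
        worst = tri.simplices[int(np.argmax(scores))]
        x_new = X[worst].mean(axis=0)      # query the barycenter of that simplex
        X = np.vstack([X, x_new])
        y = np.append(y, target(x_new))
    return X, y


if __name__ == "__main__":
    # Illustrative closed-form target and domain (assumed for this sketch).
    f = lambda x: np.sin(5 * x[0]) * np.cos(3 * x[1])
    X, y = active_nn_regression(f, bounds=[(-1.0, 1.0), (-1.0, 1.0)])
    print(X.shape, y.shape)                # queried points and their values
```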

Author Information

Alexander Kravberg (KTH Royal Institute of Technology)

Postdoctoral researcher bridging chemistry, algorithms, and AI

Giovanni Luca Marchetti (KTH Royal Institute of Technology)
Vladislav Polianskii (KTH Royal Institute of Technology)
Anastasiia Varava (-)
Florian T. Pokorny (KTH Royal Institute of Technology)
Danica Kragic (KTH)
