Poster
Fri Jul 13 09:15 AM -- 12:00 PM (PDT) @ Hall B #94
Multi-Fidelity Black-Box Optimization with Hierarchical Partitions
Rajat Sen · Kirthevasan Kandasamy · Sanjay Shakkottai
Motivated by settings such as hyper-parameter tuning and physical simulations, we consider the problem of black-box optimization of a function. Multi-fidelity techniques have become popular in applications where exact function evaluations are expensive but coarse (biased) approximations are available at much lower cost. A canonical example is hyper-parameter selection for a learning algorithm: training for fewer iterations costs less, but the resulting validation error is only a coarse indication of what it would be if the algorithm were trained to completion. We incorporate this multi-fidelity setup into the powerful framework of black-box optimization through hierarchical partitioning. We develop tree-search based multi-fidelity algorithms with theoretical guarantees on simple regret, and demonstrate their performance gains on both real and synthetic datasets.
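
To make the setup concrete, the following is a minimal Python sketch of tree search over a hierarchical partition of [0, 1] that mixes a cheap, biased evaluation with a small budget of expensive exact evaluations. It is an illustration of the general idea only, not the authors' algorithms or guarantees: the toy objective, the bias bound `zeta`, the smoothness term `delta`, and the budget split are all assumptions made for this example.

```python
# A minimal sketch (not the paper's algorithms) of tree search over a
# hierarchical binary partition of [0, 1], using a cheap low-fidelity
# surrogate and only a few expensive high-fidelity evaluations.
# The objective, bias bound `zeta`, smoothness term `delta`, and budgets
# are illustrative assumptions.
import heapq


def f_high(x):
    """Expensive, exact objective (toy example)."""
    return -(x - 0.62) ** 2


def f_low(x):
    """Cheap, biased approximation of f_high (toy example)."""
    return -(x - 0.62) ** 2 + 0.05 * (1 - x)


def mf_tree_search(f_low, f_high, zeta=0.1, delta=0.2,
                   low_budget=200, high_budget=20):
    """Greedily expand cells of a binary partition of [0, 1].

    Each cell is scored optimistically: the low-fidelity value at its
    centre, plus the assumed bias bound `zeta`, plus a smoothness term
    `delta * width`. Expensive evaluations are spent only on cells whose
    optimistic cheap score could beat the incumbent.
    """
    root_x = 0.5
    # Min-heap on negated scores, so the best cell is popped first.
    heap = [(-(f_low(root_x) + zeta + delta), 0.0, 1.0)]
    low_used = 1
    best_x, best_val = root_x, f_high(root_x)
    high_used = 1

    while heap and low_used < low_budget:
        _, lo, hi = heapq.heappop(heap)
        mid = (lo + hi) / 2
        for child_lo, child_hi in ((lo, mid), (mid, hi)):
            x = (child_lo + child_hi) / 2
            width = child_hi - child_lo
            cheap = f_low(x)
            low_used += 1
            # Query the expensive function only when the optimistic
            # cheap estimate could improve on the best value so far.
            if high_used < high_budget and cheap + zeta > best_val:
                val = f_high(x)
                high_used += 1
                if val > best_val:
                    best_x, best_val = x, val
            heapq.heappush(heap, (-(cheap + zeta + delta * width),
                                  child_lo, child_hi))
    return best_x, best_val


if __name__ == "__main__":
    x_star, val = mf_tree_search(f_low, f_high)
    print(f"best x ~ {x_star:.3f}, f_high(x) ~ {val:.4f}")
```

In this sketch the hierarchical partition is explored mostly with the cheap approximation, and the handful of exact evaluations concentrate near the cells that look best after accounting for the assumed bias; this mirrors, in a simplified form, how multi-fidelity information can guide a tree-search optimizer.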