Poster
in
Workshop: Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators

BPNAS: Bayesian Progressive Neural Architecture Search

Hyunwoong Chang · Anirban Samaddar · Sandeep Madireddy

Keywords: [ network ensemble search ] [ differentiable NAS ] [ neural architecture search ]


Abstract:

Across the performance landscapes of multiple NAS benchmarks, only a few operations contribute to higher performance, while others are detrimental. This motivates tailoring a posterior distribution that places higher prior mass on sparser supernetworks, so that unimportant operations are progressively pruned. Moreover, the Bayesian scheme makes it straightforward to generate architecture samples given an estimated architecture from any NAS method. To that end, we propose BPNAS, a Bayesian progressive neural architecture search (NAS) method that combines recent advances in the differentiable NAS framework with Bayesian inference under a sparse prior on the network architecture, yielding faster convergence and uncertainty quantification in architecture search. Through numerical experiments on popular NAS search spaces, we show that BPNAS improves accuracy and convergence speed compared to state-of-the-art NAS approaches on benchmark datasets.
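
To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of how a sparsity-inducing prior can be attached to DARTS-style architecture weights in a differentiable NAS setting: a Dirichlet-type log prior with concentration below one is added to the training loss so that unimportant operations on a mixed edge are pushed toward zero weight and can be pruned progressively. The class names, the specific prior, and all hyperparameters below are illustrative assumptions.

# Hypothetical sketch: sparse prior on differentiable-NAS architecture weights.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedOp(nn.Module):
    """Weighted sum of candidate operations (differentiable NAS relaxation)."""

    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        # Unconstrained architecture parameters; softmax gives mixing weights.
        self.alpha = nn.Parameter(torch.zeros(len(ops)))

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))


def sparse_log_prior(alpha, concentration=0.5):
    """Log density (up to a constant) of a Dirichlet prior with
    concentration < 1 on the softmax weights, which favors sparse mixtures."""
    w = F.softmax(alpha, dim=0)
    return ((concentration - 1.0) * torch.log(w + 1e-12)).sum()


# Toy usage: one cell edge with three candidate operations on 16-channel maps.
ops = [nn.Conv2d(16, 16, 3, padding=1), nn.Conv2d(16, 16, 1), nn.Identity()]
edge = MixedOp(ops)
x, target = torch.randn(2, 16, 8, 8), torch.randn(2, 16, 8, 8)

opt = torch.optim.Adam(edge.parameters(), lr=1e-3)
for _ in range(5):
    opt.zero_grad()
    # Negative log posterior = task loss minus (scaled) log prior on weights.
    loss = F.mse_loss(edge(x), target) - 1e-3 * sparse_log_prior(edge.alpha)
    loss.backward()
    opt.step()

# Operations whose mixing weight falls below a threshold could be pruned
# progressively; architecture samples could then be drawn from an
# approximate posterior over the surviving operations.
print(F.softmax(edge.alpha, dim=0))

In this toy setup, the sparsity term acts only as a regularizer on the architecture parameters; how BPNAS performs the actual posterior inference, pruning schedule, and architecture sampling is described in the paper.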