Poster
Efficient Neural Architecture Search via Parameters Sharing
Hieu Pham · Melody Guan · Barret Zoph · Quoc Le · Jeff Dean

Wed Jul 11 09:15 AM -- 12:00 PM (PDT) @ Hall B #185

We propose Efficient Neural Architecture Search (ENAS), a fast and inexpensive approach for automatic model design. ENAS constructs a large computational graph in which each subgraph represents a neural network architecture, thereby forcing all architectures to share their parameters. A controller is trained with policy gradient to search for a subgraph that maximizes the expected reward on a validation set. Meanwhile, the model corresponding to the selected subgraph is trained to minimize a canonical cross-entropy loss. Sharing parameters among child models allows ENAS to deliver strong empirical performance while using far fewer GPU-hours than existing automatic model design approaches; notably, it is 1000x less expensive than standard Neural Architecture Search. On Penn Treebank, ENAS discovers a novel architecture that achieves a test perplexity of 56.3, on par with the existing state-of-the-art among all methods without post-training processing. On CIFAR-10, ENAS finds a novel architecture that achieves 2.89% test error, which is on par with the 2.65% test error of NASNet (Zoph et al., 2018).
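To make the alternating optimization concrete, the sketch below illustrates one ENAS-style training step in PyTorch. It is not the authors' released code: the `SharedModel` forward signature, the per-node logits controller (the paper uses an LSTM controller), and the moving-average reward baseline are simplifying assumptions for illustration.

```python
# Minimal sketch of ENAS-style alternating training (illustrative, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Controller(nn.Module):
    """Samples a subgraph (architecture) and returns its log-probability.
    Simplified: independent op choices per node instead of an LSTM policy."""
    def __init__(self, num_nodes=4, num_ops=5):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_nodes, num_ops))

    def sample(self):
        dist = torch.distributions.Categorical(logits=self.logits)
        arch = dist.sample()                  # one op index per node
        log_prob = dist.log_prob(arch).sum()  # log pi(arch), differentiable
        return arch, log_prob

def enas_step(shared_model, controller, w_opt, c_opt,
              train_batch, valid_batch, baseline, bl_decay=0.95):
    """One alternating update: shared weights w on training data,
    controller on validation reward via REINFORCE with a moving baseline.
    `shared_model(x, arch)` is a hypothetical module that runs the subgraph
    selected by `arch` using the shared parameters and returns class logits."""
    x_tr, y_tr = train_batch
    x_va, y_va = valid_batch

    # (1) Train the shared parameters on a sampled architecture (cross-entropy).
    arch, _ = controller.sample()
    w_opt.zero_grad()
    F.cross_entropy(shared_model(x_tr, arch), y_tr).backward()
    w_opt.step()

    # (2) Train the controller with policy gradient on the validation reward.
    arch, log_prob = controller.sample()
    with torch.no_grad():
        preds = shared_model(x_va, arch).argmax(dim=-1)
        reward = (preds == y_va).float().mean().item()  # validation accuracy
    baseline = bl_decay * baseline + (1 - bl_decay) * reward
    c_opt.zero_grad()
    (-(reward - baseline) * log_prob).backward()
    c_opt.step()
    return baseline
```

In this sketch the child model is never trained from scratch per architecture; every sampled subgraph reuses the same shared weights, which is what yields the reported reduction in GPU-hours.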

Author Information

Hieu Pham (Carnegie Mellon University)
Melody Guan (Stanford University)
Barret Zoph (Google)
Quoc Le (Google Brain)
Jeff Dean (Google Brain)
