A key challenge of leveraging data augmentation for neural network training is choosing an effective augmentation policy from a large search space of candidate operations. Properly chosen augmentation policies can lead to significant generalization improvements; however, state-of-the-art approaches such as AutoAugment are computationally infeasible to run for an ordinary user. In this paper, we introduce a new data augmentation algorithm, Population Based Augmentation (PBA), which generates augmentation policy schedules orders of magnitude faster than previous approaches. We show that PBA can match the performance of AutoAugment with orders of magnitude less overall compute. On CIFAR-10 we achieve a mean test error of 1.46%, which is slightly better than the current state of the art. The code for PBA is fully open source and will be made available.
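The abstract describes searching for augmentation policy schedules with a population-based approach. As a rough illustration of the underlying idea (a PBT-style exploit/explore loop over augmentation hyperparameters), here is a minimal, self-contained sketch; the function names, the integer magnitude range, and the toy fitness used in the usage example are hypothetical and not taken from the paper:

```python
import random

def perturb(policy, rng):
    """Explore step: randomly nudge each augmentation magnitude, clipped to [0, 10]."""
    return [min(10, max(0, m + rng.choice([-1, 0, 1]))) for m in policy]

def pbt_augmentation_search(fitness, num_ops=4, pop_size=8, steps=20, seed=0):
    """Sketch of a PBT-style search over augmentation magnitudes.

    `fitness` stands in for validation performance of a child model trained
    with the given policy. Returns the schedule: the best policy found at
    each step, mirroring how PBA produces a schedule rather than one policy.
    """
    rng = random.Random(seed)
    population = [[rng.randint(0, 10) for _ in range(num_ops)]
                  for _ in range(pop_size)]
    schedule = []
    for _ in range(steps):
        scored = sorted(population, key=fitness, reverse=True)
        schedule.append(list(scored[0]))
        # Exploit: the bottom quartile is replaced by perturbed clones of
        # the top quartile; the rest of the population carries over.
        k = max(1, pop_size // 4)
        population = scored[:-k] + [perturb(list(p), rng) for p in scored[:k]]
    return schedule
```

For example, with a toy fitness such as `lambda p: -abs(sum(p) - 20)`, the best policy in the schedule improves (or at worst stays equal) over steps, since elites always survive. In the actual method, fitness would come from training child models, which is where the real compute cost lies.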
Author Information
Daniel Ho (UC Berkeley)
Eric Liang (UC Berkeley)
Peter Chen (Covariant.ai)
Ion Stoica (UC Berkeley)
Pieter Abbeel (UC Berkeley)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Population Based Augmentation: Efficient Learning of Augmentation Policy Schedules »
  Thu. Jun 13th 01:30 -- 04:00 AM, Room Pacific Ballroom #134
More from the Same Authors
- 2021 Poster: Resource Allocation in Multi-armed Bandit Exploration: Overcoming Sublinear Scaling with Adaptive Parallelism »
  Brijen Thananjeyan · Kirthevasan Kandasamy · Ion Stoica · Michael Jordan · Ken Goldberg · Joseph E Gonzalez
- 2021 Oral: Resource Allocation in Multi-armed Bandit Exploration: Overcoming Sublinear Scaling with Adaptive Parallelism »
  Brijen Thananjeyan · Kirthevasan Kandasamy · Ion Stoica · Michael Jordan · Ken Goldberg · Joseph E Gonzalez
- 2021 Poster: TeraPipe: Token-Level Pipeline Parallelism for Training Large-Scale Language Models »
  Zhuohan Li · Siyuan Zhuang · Shiyuan Guo · Danyang Zhuo · Hao Zhang · Dawn Song · Ion Stoica
- 2021 Spotlight: TeraPipe: Token-Level Pipeline Parallelism for Training Large-Scale Language Models »
  Zhuohan Li · Siyuan Zhuang · Shiyuan Guo · Danyang Zhuo · Hao Zhang · Dawn Song · Ion Stoica
- 2020 Poster: Hallucinative Topological Memory for Zero-Shot Visual Planning »
  Kara Liu · Thanard Kurutach · Christine Tung · Pieter Abbeel · Aviv Tamar
- 2019 Poster: Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design »
  Jonathan Ho · Peter Chen · Aravind Srinivas · Rocky Duan · Pieter Abbeel
- 2019 Oral: Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design »
  Jonathan Ho · Peter Chen · Aravind Srinivas · Rocky Duan · Pieter Abbeel