Contributed Talk: Discovering Weight Initializers with Meta Learning
Dmitry Baranchuk

Fri Jul 23 09:00 AM -- 09:10 AM (PDT) @ None

Deep neural network training largely depends on the choice of initial weight distribution. However, this choice is often nontrivial. Existing theoretical results for this problem mostly cover simple architectures, e.g., feedforward networks with ReLU activations. The architectures used for practical problems are more complex and often incorporate many overlapping modules, making them challenging to analyze theoretically. Practitioners therefore resort to heuristic initializers with questionable optimality and stability. In this study, we propose a task-agnostic approach that discovers initializers for specific network architectures and optimizers by learning the initial weight distributions directly via meta-learning. In several supervised and unsupervised learning scenarios, we show the advantage of our initializers in terms of both faster convergence and higher model performance.
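To make the idea concrete, here is a minimal toy sketch (not the authors' actual method) of meta-learning an initializer: the initial weights of a linear model are treated as meta-parameters, and the meta-gradient is obtained by differentiating analytically through one unrolled inner SGD step. All data, learning rates, and model choices below are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: meta-learn the initial weights w0 of a linear model
# so that a single inner SGD step on the task loss yields a low
# post-update loss. For a quadratic loss the meta-gradient through the
# unrolled inner update can be written in closed form.

rng = np.random.default_rng(0)
d = 5
X = rng.normal(size=(20, d))      # toy task data (assumed for illustration)
w_true = rng.normal(size=d)
y = X @ w_true

inner_lr, meta_lr = 0.01, 0.01
w0 = rng.normal(size=d)           # the initializer being meta-learned

def loss_and_grad(w):
    r = X @ w - y
    return 0.5 * r @ r, X.T @ r

losses = []
for _ in range(200):
    # Inner step: one SGD update starting from the current initializer.
    _, g0 = loss_and_grad(w0)
    w1 = w0 - inner_lr * g0
    post_loss, g1 = loss_and_grad(w1)
    losses.append(post_loss)
    # Meta-gradient: d(post_loss)/d(w0) = (I - inner_lr * X^T X) @ g1,
    # i.e., backpropagation through the unrolled inner update.
    meta_grad = g1 - inner_lr * (X.T @ (X @ g1))
    w0 -= meta_lr * meta_grad

print(losses[0], losses[-1])      # post-update loss shrinks over meta-steps
```

In practice one would instead learn the parameters of a weight *distribution* (e.g., a per-module scale) and differentiate through several inner steps with an autodiff framework; the closed-form single-step version above only illustrates the meta-gradient mechanics.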


Author Information

Dmitry Baranchuk (Yandex)
