Unbiased Gradient Estimation in Unrolled Computation Graphs with Persistent Evolution Strategies
Paul Vicol · Luke Metz · Jascha Sohl-Dickstein

Tue Jul 20 09:00 PM -- 11:00 PM (PDT) @ Virtual

Unrolled computation graphs arise in many scenarios, including training RNNs, tuning hyperparameters through unrolled optimization, and training learned optimizers. Current approaches to optimizing parameters in such computation graphs suffer from high variance gradients, bias, slow updates, or large memory usage. We introduce a method called Persistent Evolution Strategies (PES), which divides the computation graph into a series of truncated unrolls, and performs an evolution strategies-based update step after each unroll. PES eliminates bias from these truncations by accumulating correction terms over the entire sequence of unrolls. PES allows for rapid parameter updates, has low memory usage, is unbiased, and has reasonable variance characteristics. We experimentally demonstrate the advantages of PES compared to several other methods for gradient estimation on synthetic tasks, and show its applicability to training learned optimizers and tuning hyperparameters.
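The estimator described above — truncated unrolls, an evolution-strategies step after each unroll, and correction terms accumulated across the whole sequence — can be sketched on a toy problem. The dynamics below (a state that simply accumulates the parameter each step, with a squared-state loss per unroll) are an illustrative assumption, not from the paper; only the estimator structure follows the abstract's description, using antithetic perturbation pairs and the cumulative perturbation as the correction term.

```python
import numpy as np

def pes_gradient(theta, n_pairs=20000, sigma=0.1, n_unrolls=2, k=2, seed=0):
    """Sketch of a PES-style gradient estimate on a toy unrolled system.

    Toy dynamics (an illustrative assumption): s_{t+1} = s_t + theta,
    and each truncated unroll contributes the squared state at its end
    as a loss. The total loss here is 20*theta**2, so the true gradient
    is 40*theta.
    """
    rng = np.random.default_rng(seed)
    s_pos = np.zeros(n_pairs)   # particle states under +eps perturbations
    s_neg = np.zeros(n_pairs)   # antithetic particle states under -eps
    xi = np.zeros(n_pairs)      # cumulative perturbation per particle pair
    grad = 0.0
    for _ in range(n_unrolls):
        eps = rng.normal(0.0, sigma, size=n_pairs)
        xi += eps               # accumulate the correction term over unrolls
        # Unroll k steps under the perturbed parameters.
        for _ in range(k):
            s_pos += theta + eps
            s_neg += theta - eps
        loss_pos = s_pos ** 2
        loss_neg = s_neg ** 2
        # ES-style step per truncation: correlate the *cumulative*
        # perturbation with the antithetic loss difference. Summing these
        # per-unroll estimates removes the truncation bias in expectation.
        grad += np.mean(xi * (loss_pos - loss_neg)) / (2 * sigma ** 2)
    return grad
```

For example, at theta = 0.5 the true gradient of the toy objective is 40 * 0.5 = 20, and the estimate above concentrates around that value as the number of antithetic pairs grows; a naive truncated estimator that resets the perturbation each unroll would instead be biased toward the within-unroll gradient only.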

Author Information

Paul Vicol (University of Toronto)
Luke Metz (Google Brain)
Jascha Sohl-Dickstein (Google Brain)
