Optimal Kronecker-Sum Approximation of Real Time Recurrent Learning
Frederik Benzing · Marcelo Matheus Gauy · Asier Mujika · Anders Martinsson · Angelika Steger

Thu Jun 13th 12:05–12:10 PM @ Room 102

One of the central goals of Recurrent Neural Networks (RNNs) is to learn long-term dependencies in sequential data. Nevertheless, the most popular training method, Truncated Backpropagation Through Time (TBPTT), categorically forbids learning dependencies beyond the truncation horizon. In contrast, the online training algorithm Real Time Recurrent Learning (RTRL) provides untruncated gradients, but at impractically large computational cost. Recently published approaches reduce this cost by computing noisy approximations of RTRL. We present a new approximation algorithm for RTRL, the Optimal Kronecker-Sum Approximation (OK). We prove that OK is optimal within a class of approximations of RTRL that includes all approaches published so far. Additionally, we show that the noise of OK is empirically negligible: unlike previous algorithms, it matches TBPTT on a real-world task (character-level Penn TreeBank) and can exploit online parameter updates to outperform TBPTT on a synthetic string memorization task.
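For readers wondering why exact RTRL is impractical, the standard recursion makes the cost explicit. The sketch below is illustrative only: it states the textbook RTRL update and the generic Kronecker-sum ansatz that KF-RTRL-style methods (including OK) build on; the rank r, the factor shapes, and the complexity figures are assumptions for a fully connected RNN, not details taken from the paper.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% RTRL maintains the influence matrix G_t = dh_t/dtheta for a recurrent
% state h_t = f(h_{t-1}, x_t; theta) with n hidden units. The chain rule
% gives the online recursion:
\[
  G_t = \underbrace{\frac{\partial h_t}{\partial h_{t-1}}}_{n \times n}\, G_{t-1}
        + \left.\frac{\partial f}{\partial \theta}\right|_{t},
  \qquad G_t \in \mathbb{R}^{\,n \times |\theta|}.
\]
% For a fully connected RNN, |theta| = Theta(n^2), so storing G_t takes
% O(n^3) memory and the matrix product above takes O(n^4) time per step;
% this is the cost the abstract calls impractically large.
% Kronecker-sum methods instead maintain a compressed (typically unbiased,
% randomized) estimate of the influence matrix of the form
\[
  G_t \approx \sum_{i=1}^{r} a_i \otimes B_i,
  \qquad a_i \in \mathbb{R}^{1 \times n},\; B_i \in \mathbb{R}^{n \times n},
\]
% cutting memory to O(r n^2) and the per-step update to roughly O(r n^3).
% OK's contribution, as the abstract states it, is a choice of approximation
% that is provably optimal within this class.
\end{document}

The loss gradient then follows by contracting G_t with the instantaneous error signal dL_t/dh_t at every step, which is what lets parameters be updated online without any truncation.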

Author Information

Frederik Benzing (ETH Zurich)
Marcelo Matheus Gauy (ETH Zurich)
Asier Mujika (ETH Zurich)
Anders Martinsson (ETH Zurich)
Angelika Steger (ETH Zurich)
