
Efficient Orthogonal Parametrisation of Recurrent Neural Networks Using Householder Reflections
Zakaria Mhammedi · Andrew Hellicar · James Bailey · Ashfaqur Rahman

Mon Aug 07 05:30 PM -- 05:48 PM (PDT) @ Parkside 1

The problem of learning long-term dependencies in sequences using Recurrent Neural Networks (RNNs) is still a major challenge. Recent methods address this problem by constraining the transition matrix to be unitary during training, which ensures that its norm is equal to one and prevents exploding gradients. These methods either have limited expressiveness or scale poorly with the size of the network when compared with the simple RNN case, especially when using stochastic gradient descent with a small mini-batch size. Our contributions are as follows: we first show that constraining the transition matrix to be unitary is a special case of an orthogonal constraint. Then we present a new parametrisation of the transition matrix which allows efficient training of an RNN while ensuring that the matrix is always orthogonal. Our results show that the orthogonal constraint on the transition matrix applied through our parametrisation gives similar benefits to the unitary constraint, without the time complexity limitations.
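The core idea of parametrising an orthogonal matrix as a product of Householder reflections can be illustrated with a short NumPy sketch (this is an illustrative example, not the authors' implementation; the function name and the choice of 4 reflections in R^8 are arbitrary). Each reflection H(u) = I - 2uuᵀ/(uᵀu) is orthogonal, so any product of reflections is orthogonal by construction, and gradients can flow through the unconstrained reflection vectors u:

```python
import numpy as np

def householder_product(vs):
    """Build W = H(u_k) ... H(u_1) from reflection vectors u_i (rows of vs),
    where H(u) = I - 2 u u^T / (u^T u). Each H(u) is orthogonal, so W is too."""
    n = vs.shape[1]
    W = np.eye(n)
    for u in vs:
        # Left-multiply W by H(u) without forming H(u) explicitly: O(n^2) per reflection.
        W = W - 2.0 * np.outer(u, u @ W) / (u @ u)
    return W

rng = np.random.default_rng(0)
vs = rng.standard_normal((4, 8))   # 4 unconstrained reflection vectors in R^8
W = householder_product(vs)
# Orthogonality holds up to floating-point error: W @ W.T ≈ I
print(np.allclose(W @ W.T, np.eye(8)))
```

Because each reflection is applied as a rank-one update, a forward pass through such a parametrised transition matrix avoids both an explicit matrix exponential and any re-orthogonalisation step during training.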

Author Information

Zakaria Mhammedi (The University of Melbourne)
Andrew Hellicar (CSIRO)
James Bailey (The University of Melbourne)
Ashfaqur Rahman (CSIRO)

