

Poster in Workshop: Localized Learning: Decentralized Model Updates via Non-Global Objectives

Learning Recurrent Models with Temporally Local Rules

Azwar Abdulsalam · Joseph Makin

Keywords: [ Local Learning ] [ Backprop-Through-Time ] [ Sufficient Statistics ] [ VAE ] [ RBM ] [ Generative Models ]


Abstract:

Fitting generative models to sequential data typically involves two recursive computations through time, one forward and one backward. The latter could be a computation of the loss gradient (as in backpropagation through time) or an inference algorithm (as in the RTS/Kalman smoother). The backward pass in particular is computationally expensive (since it is inherently serial and cannot exploit GPUs) and difficult to map onto biological processes. Work-arounds have been proposed; here we explore a very different one: requiring the generative model to learn the joint distribution over current and previous states, rather than merely the transition probabilities. We show on toy datasets that different architectures employing this principle can learn aspects of the data typically requiring the backward pass.
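To illustrate the principle (this is a minimal sketch, not the authors' implementation or architecture), one can fit a generative model to the joint distribution over consecutive states by treating each adjacent pair (x_{t-1}, x_t) as an independent training example. Every gradient is then local to a single time step, so no backward recursion through the sequence is required. The pair-VAE model, names, and hyperparameters below are illustrative assumptions.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class PairVAE(nn.Module):
    """VAE over concatenated consecutive states [x_{t-1}, x_t] (illustrative)."""
    def __init__(self, state_dim, latent_dim=8, hidden=64):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(2 * state_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)
        self.logvar = nn.Linear(hidden, latent_dim)
        self.dec = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * state_dim),
        )

    def forward(self, pair):
        h = self.enc(pair)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        return self.dec(z), mu, logvar

def local_training_step(model, opt, sequence):
    """One update from a (T, state_dim) sequence; gradients never cross time steps."""
    # Form all adjacent pairs [x_{t-1}, x_t]; each row is an independent sample,
    # so the loss decomposes across time and needs no backward pass through it.
    pairs = torch.cat([sequence[:-1], sequence[1:]], dim=-1)  # shape (T-1, 2*D)
    recon, mu, logvar = model(pairs)
    recon_loss = F.mse_loss(recon, pairs)                      # Gaussian decoder surrogate
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    loss = recon_loss + kl
    opt.zero_grad()
    loss.backward()  # backprop through the network only, not through the sequence
    opt.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    model = PairVAE(state_dim=2)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Toy sequence: a noisy 2-D rotation, standing in for sequential data.
    T, theta = 200, 0.1
    R = torch.tensor([[math.cos(theta), -math.sin(theta)],
                      [math.sin(theta),  math.cos(theta)]])
    x = [torch.tensor([1.0, 0.0])]
    for _ in range(T - 1):
        x.append(R @ x[-1] + 0.01 * torch.randn(2))
    seq = torch.stack(x)
    for epoch in range(5):
        print(f"epoch {epoch}: loss = {local_training_step(model, opt, seq):.4f}")
```

Because the loss decomposes over pairs, each update is temporally local in the sense the abstract describes; the trade-off is that the model must capture structure spanning more than one step through the learned joint over adjacent states rather than through credit assignment across the whole sequence.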
