Poster
Kernelized Synaptic Weight Matrices
Lorenz Müller · Julien Martel · Giacomo Indiveri
Hall B #218
Abstract:
In this paper we introduce a novel neural network architecture in which weight matrices are re-parametrized in terms of low-dimensional vectors interacting through kernel functions. A layer of our network can be interpreted as introducing a (potentially infinitely wide) linear layer between input and output. We describe the theory underpinning this model and validate it with concrete examples, exploring how it can be used to impose structure on neural networks in diverse applications ranging from data visualization to recommender systems. We achieve state-of-the-art performance in a collaborative filtering task (MovieLens).
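Below is a minimal, self-contained sketch of the core idea as stated in the abstract: rather than storing a full weight matrix, each entry W[i, j] is computed as a kernel evaluation k(u_i, v_j) between low-dimensional vectors attached to input unit i and output unit j. The Gaussian (RBF) kernel, the embedding dimension d, and all names below are illustrative assumptions, not the authors' exact formulation.

import numpy as np

def rbf_kernel(U, V, gamma=1.0):
    # Pairwise Gaussian kernel between rows of U (n_in x d) and V (n_out x d).
    sq_dists = (
        np.sum(U**2, axis=1)[:, None]
        + np.sum(V**2, axis=1)[None, :]
        - 2.0 * U @ V.T
    )
    return np.exp(-gamma * sq_dists)

class KernelizedLinear:
    """Linear layer whose weight matrix W (n_in x n_out) is not stored
    directly; instead W[i, j] = k(u_i, v_j), where u_i and v_j are
    low-dimensional vectors attached to input unit i and output unit j."""

    def __init__(self, n_in, n_out, d=2, gamma=1.0, rng=None):
        rng = np.random.default_rng(rng)
        self.U = rng.normal(size=(n_in, d))   # one low-dim vector per input unit
        self.V = rng.normal(size=(n_out, d))  # one low-dim vector per output unit
        self.gamma = gamma

    def weight(self):
        # Materialize the full weight matrix from the kernel.
        return rbf_kernel(self.U, self.V, self.gamma)

    def forward(self, x):
        # x: (batch, n_in) -> (batch, n_out)
        return x @ self.weight()

# Usage: a 784 -> 128 layer parametrized by only (784 + 128) * 2 numbers.
layer = KernelizedLinear(n_in=784, n_out=128, d=2, rng=0)
y = layer.forward(np.random.default_rng(1).normal(size=(4, 784)))
print(y.shape)  # (4, 128)

With d = 2, the layer above is described by (n_in + n_out) * 2 parameters instead of n_in * n_out, and the learned low-dimensional vectors can themselves be plotted, which is consistent with the applications to data visualization and recommender systems mentioned in the abstract.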