Oral in Workshop: Sampling and Optimization in Discrete Space
Topological Neural Discrete Representation Learning à la Kohonen
Kazuki Irie · Róbert Csordás · Jürgen Schmidhuber
Unsupervised learning of discrete representations in neural networks (NNs) from continuous ones is essential for many modern applications. Vector Quantisation (VQ) has become popular for this, in particular in the context of generative models, such as Variational Auto-Encoders (VAEs), where the exponential moving average-based VQ (EMA-VQ) algorithm is often used. Here we study an alternative VQ algorithm based on Kohonen's learning rule for the Self-Organising Map (KSOM; 1982), a classic VQ algorithm of which EMA-VQ is a special case. KSOM is known to offer two potential benefits: empirically, it converges faster than EMA-VQ, and the discrete representations it generates form a topological structure on the grid whose nodes are the discrete symbols, yielding an artificial analogue of the brain's topographic map. We revisit these properties by using KSOM in VQ-VAEs for image processing. In our experiments, the speed-up over well-configured EMA-VQ is only observable at the beginning of training, but KSOM is generally far more robust, e.g., w.r.t. the choice of initialisation scheme.
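To make the relationship between the two algorithms concrete, below is a minimal Python/NumPy sketch of one step of Kohonen's learning rule, not the authors' implementation: the function name ksom_update and the parameters lr and sigma are illustrative assumptions. Each code vector is assigned a position on a 2D grid; every code is pulled towards the input, weighted by a Gaussian neighbourhood around the winning code. Shrinking the neighbourhood width towards zero so that only the winner moves recovers the winner-only update underlying EMA-VQ.

import numpy as np

def ksom_update(codebook, grid_pos, x, lr=0.1, sigma=1.0):
    """One KSOM step for a single input vector x (illustrative sketch).

    codebook : (K, D) array of code vectors (updated in place).
    grid_pos : (K, 2) array of each code's coordinates on the SOM grid.
    lr       : learning rate.
    sigma    : neighbourhood width; sigma -> 0 recovers winner-only VQ.
    """
    # 1. Find the best-matching unit (nearest code to x).
    dists = np.linalg.norm(codebook - x, axis=1)
    winner = np.argmin(dists)
    # 2. Gaussian neighbourhood over grid distance to the winner.
    grid_d2 = np.sum((grid_pos - grid_pos[winner]) ** 2, axis=1)
    h = np.exp(-grid_d2 / (2.0 * sigma ** 2))
    # 3. Pull every code towards x, weighted by its neighbourhood value.
    #    Neighbouring grid nodes move together, which induces the
    #    topological structure on the grid described in the abstract.
    codebook += lr * h[:, None] * (x - codebook)
    return winner

# Usage: a 4x4 grid of 8-dimensional codes, one update on a random input.
K, D = 16, 8
rng = np.random.default_rng(0)
codebook = rng.normal(size=(K, D))
grid_pos = np.array([[i, j] for i in range(4) for j in range(4)], dtype=float)
winner = ksom_update(codebook, grid_pos, rng.normal(size=D))

In a VQ-VAE, x would be an encoder output and the returned winner index its discrete symbol; the neighbourhood term is the only addition relative to a plain winner-take-all VQ update.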