Poster in Workshop: Localized Learning: Decentralized Model Updates via Non-Global Objectives
Energy-Based Learning Algorithms: A Comparative Study
Benjamin Scellier · Maxence Ernoult · Jack Kendall · Suhas Kumar
Keywords: [ equilibrium propagation ] [ energy-based learning algorithm ] [ energy-based model ] [ coupled learning ] [ contrastive learning ] [ deep Hopfield network ]
This work compares seven energy-based learning algorithms: contrastive learning (CL), equilibrium propagation (EP), and coupled learning (CpL), together with variants of these algorithms that differ in the type of perturbation used. The algorithms are compared on deep convolutional Hopfield networks (DCHNs) and evaluated on five vision tasks (MNIST, Fashion-MNIST, SVHN, CIFAR-10, and CIFAR-100). The results show that while all algorithms perform similarly on the simplest task (MNIST), differences in performance become evident as task complexity increases. Perhaps surprisingly, we find that negative perturbations yield significantly better results than positive ones, and the centered variant of EP emerges as the top-performing algorithm. Lastly, we report new state-of-the-art DCHN simulations on all five datasets, both in terms of speed and accuracy, achieving a 13.5x speedup compared to Laborieux et al. (2021).
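The "centered" variant of EP mentioned above estimates the loss gradient by contrasting two nudged equilibria, one obtained with a positive nudging strength +beta and one with -beta, rather than contrasting a single nudged equilibrium with the free one. Below is a minimal sketch of that estimator, assuming a PyTorch-style differentiable energy function and, for brevity, a network state collapsed to the output units; all function and argument names (energy_fn, cost_fn, etc.) are illustrative assumptions, not the authors' implementation.

```python
import torch

def centered_ep_grad(energy_fn, cost_fn, params, x, y, beta=0.5, steps=60, lr=0.05):
    """Sketch of the centered EP gradient estimator.

    energy_fn(params, s, x) -> scalar energy E of the Hopfield network
    cost_fn(s, y)           -> scalar cost C measured on the output units
    params                  -> list of tensors with requires_grad=True
    Returns an estimate of dC/dparams via the symmetric difference
    (1 / (2*beta)) * [dE/dtheta(s_{+beta}) - dE/dtheta(s_{-beta})].
    """
    def equilibrium(nudge):
        # Relax the state s to a minimum of the total energy F = E + nudge * C.
        # (A real DCHN would have layered hidden states; we use one tensor here.)
        s = torch.zeros_like(y, requires_grad=True)
        for _ in range(steps):
            F = energy_fn(params, s, x) + nudge * cost_fn(s, y)
            (g_s,) = torch.autograd.grad(F, s)
            with torch.no_grad():
                s -= lr * g_s  # gradient descent on the nudged energy
        return s.detach()

    s_pos = equilibrium(+beta)  # equilibrium under positive nudging
    s_neg = equilibrium(-beta)  # equilibrium under negative nudging

    # Contrast dE/dtheta at the two nudged equilibria (centered estimator).
    g_pos = torch.autograd.grad(energy_fn(params, s_pos, x), params)
    g_neg = torch.autograd.grad(energy_fn(params, s_neg, x), params)
    return [(gp - gn) / (2 * beta) for gp, gn in zip(g_pos, g_neg)]
```

Setting one of the two nudging strengths to zero in this sketch recovers the one-sided (positive or negative) perturbation schemes that the study compares against.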