Workshop on Weight-Space Symmetries: from Foundations to Practical Applications
Abstract
Neural networks are highly over-parameterized models whose weight spaces exhibit rich symmetries, most notably neuron permutations. These symmetries create large equivalence classes of functionally identical solutions and have profound implications for the structure of the loss landscape, for optimization, and for the design of practical algorithms. While significant progress has been made in characterizing these symmetries and their effects, a unified understanding remains elusive. At the same time, there is growing interest in practical applications of weight-space symmetries, such as training acceleration, model merging, weight-space learning, and more. The goal of this workshop is to bring together researchers from academia and industry to translate theoretical advances in weight-space symmetries into practical, scalable methods, fostering a coherent framework and highlighting approaches that are computationally feasible at scale.
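To make the permutation symmetry concrete, here is a minimal sketch (plain NumPy, with arbitrarily chosen layer sizes) showing that permuting the hidden neurons of a two-layer ReLU network, while consistently permuting the rows of the first layer and the columns of the second, yields a functionally identical model:

```python
import numpy as np

rng = np.random.default_rng(0)

# A two-layer MLP: f(x) = W2 @ relu(W1 @ x + b1) + b2
d_in, d_hidden, d_out = 4, 8, 3
W1, b1 = rng.normal(size=(d_hidden, d_in)), rng.normal(size=d_hidden)
W2, b2 = rng.normal(size=(d_out, d_hidden)), rng.normal(size=d_out)

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Permute the hidden neurons: shuffle the rows of (W1, b1)
# and the columns of W2 with the same permutation.
perm = rng.permutation(d_hidden)
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

# The permuted network computes exactly the same function.
x = rng.normal(size=d_in)
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2))
```

With `d_hidden` hidden neurons there are `d_hidden!` such permutations per layer, which is one source of the large equivalence classes of functionally identical solutions mentioned above.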