

Poster in Workshop: High-dimensional Learning Dynamics Workshop: The Emergence of Structure and Reasoning

The Optimization Landscape of Spectral Neural Networks

Chenghui Li · Rishi Sonthalia · Nicolas Garcia Trillos


Abstract:

A wide variety of machine learning methods rely on extracting spectral geometric information from data. However, many implementations of these methods depend on traditional eigensolvers, which have limitations in practical online, big-data settings. To address some of these challenges, researchers have proposed training neural networks as alternatives to traditional eigensolvers, one such approach being the Spectral Neural Network (SNN). In this paper, we initiate a theoretical exploration of the optimization landscape of SNN's objective to shed light on its training dynamics. Unlike typical studies of convergence to global minima in neural network training, SNN presents an additional complication: its ambient loss function is non-convex, a feature common in unsupervised learning settings. We show that the ambient optimization landscape is benign in a quotient geometry. Furthermore, we present experimental evidence that the parameterized optimization landscape inherits the benignness of the ambient landscape when the neural network is sufficiently overparameterized.
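For intuition, below is a minimal sketch of the kind of neural eigensolver training loop the abstract describes, assuming a SpectralNet-style objective: a Rayleigh-quotient trace loss plus an orthogonality penalty on the network outputs. The loss form, network architecture, and all hyperparameters here are illustrative assumptions, not the paper's exact SNN objective.

```python
# Sketch: training a neural network as a substitute for an eigensolver.
# Assumptions (not from the paper): Gaussian-kernel graph Laplacian,
# trace(Y^T L Y) loss with an orthogonality penalty, illustrative hyperparameters.
import torch

torch.manual_seed(0)
n, d, k = 200, 2, 3                      # points, input dim, eigenvectors sought
X = torch.randn(n, d)

# Gaussian-kernel affinity and unnormalized graph Laplacian L = D - W.
W = torch.exp(-torch.cdist(X, X) ** 2)
L = torch.diag(W.sum(dim=1)) - W

net = torch.nn.Sequential(               # small MLP mapping points to R^k
    torch.nn.Linear(d, 64), torch.nn.ReLU(), torch.nn.Linear(64, k)
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    Y = net(X)                           # candidate spectral embedding, n x k
    rayleigh = torch.trace(Y.T @ L @ Y)  # small when columns of Y are low-frequency
    gram = Y.T @ Y / n
    ortho = ((gram - torch.eye(k)) ** 2).sum()  # push Y toward orthonormal columns
    loss = rayleigh + 100.0 * ortho
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Note that the trace term alone is convex in Y; it is the orthogonality constraint (here, a penalty) that makes the ambient loss non-convex, illustrating the complication the abstract highlights.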
