

Workshop

HiLD: High-dimensional Learning Dynamics Workshop

Courtney Paquette · Zhenyu Liao · Mihai Nica · Elliot Paquette · Andrew Saxe · Rene Vidal

Meeting Room 315

Modern applications of machine learning seek to extract insights from high-dimensional datasets. The goal of the High-dimensional Learning Dynamics (HiLD) Workshop is to predict and analyze the dynamics of learning algorithms when both the number of samples and the number of parameters are large. The workshop seeks to spur research and collaboration around:

1. Developing analyzable models and dynamics to explain observed deep neural network phenomena;
2. Creating mathematical frameworks for scaling limits of neural network dynamics as width and depth grow, which often defy low-dimensional geometric intuitions;
3. The role of overparameterization and how it leads to conserved quantities in the dynamics and the emergence of geometric invariants, with links to Noether's theorem (see the sketch after this list);
4. Provable impacts of the choice of optimization algorithm, hyperparameters, and neural network architecture on training/test dynamics.
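
As one concrete instance of the conserved quantities in item 3: for a two-layer scalar linear network f(x) = a·b·x trained by gradient flow on a squared loss, the balancedness a² − b² is conserved, since d/dt(a² − b²) = −2r·ab + 2r·ab = 0 for residual r = ab − c. The Python sketch below is our illustration (the initial weights a, b, target c, and step size are hypothetical, not from the workshop materials); it checks the invariant numerically with small-step gradient descent.

```python
# Minimal sketch: two-layer scalar linear network f(x) = a * b * x fit to a
# target slope c with loss L = 0.5 * (a*b - c)**2. Under gradient flow the
# "balancedness" a**2 - b**2 is conserved; with a small learning rate,
# gradient descent tracks the flow and the invariant barely drifts.

a, b, c = 0.3, 1.5, 2.0   # hypothetical initial weights and target
lr = 1e-3                 # small step, so discrete GD approximates gradient flow

invariant0 = a**2 - b**2
for _ in range(20_000):
    r = a * b - c                            # residual dL/d(ab)
    a, b = a - lr * r * b, b - lr * r * a    # simultaneous gradient step

print(f"fit error |a*b - c|: {abs(a * b - c):.2e}")
print(f"invariant drift:     {abs(a**2 - b**2 - invariant0):.2e}")
```

Any drift reported comes only from time discretization; deep linear networks obey the analogous matrix invariant, with W_{l+1}ᵀ W_{l+1} − W_l W_lᵀ conserved layerwise under gradient flow.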

The HiLD Workshop aims to bring together experts from classical random matrix theory, optimization, high-dimensional statistics/probability, and statistical physics, along with crossover experts in machine learning, to share their perspectives. It seeks to create synergies between these communities, which often do not interact. Through a series of talks, poster sessions, and panel discussions, the workshop will tackle questions on the dynamics of learning algorithms at the interface of random matrix theory, high-dimensional statistics, SDEs, and ML.

Timezone: America/Los_Angeles

Schedule