Workshop
HiLD: High-dimensional Learning Dynamics Workshop
Courtney Paquette · Zhenyu Liao · Mihai Nica · Elliot Paquette · Andrew Saxe · Rene Vidal

Fri Jul 28 12:00 PM -- 08:00 PM (PDT) @ Meeting Room 315
Event URL: https://sites.google.com/view/hidimlearning/home

Modern applications of machine learning seek to extract insights from high-dimensional datasets. The goal of the High-dimensional Learning Dynamics (HiLD) Workshop is to predict and analyze the dynamics of learning algorithms when the number of samples and parameters is large. This workshop seeks to spur research and collaboration around:

1. Developing analyzable models and dynamics to explain observed deep neural network phenomena;
2. Creating mathematical frameworks for scaling limits of neural network dynamics as width and depth grow, which often defy low-dimensional geometric intuitions;
3. The role of overparameterization, including how it leads to conserved quantities in the dynamics and to the emergence of geometric invariants, with links to Noether's theorem, etc.;
4. Provable impacts of the choice of optimization algorithm, hyper-parameters, and neural network architectures on training/test dynamics.

The HiLD Workshop aims to bring together experts from classical random matrix theory, optimization, high-dimensional statistics/probability, and statistical physics to share their perspectives, while leveraging crossover experts in ML. It seeks to create synergies between these two groups, which often do not interact. Through a series of talks, poster sessions, and panel discussions, the workshop will tackle questions on the dynamics of learning algorithms at the interface of random matrix theory, high-dimensional statistics, SDEs, and ML.

Author Information

Courtney Paquette (McGill University/Google DeepMind)
Zhenyu Liao (Huazhong University of Science and Technology)

Zhenyu Liao received his Ph.D. in applied math and informatics in 2019 from [University of Paris-Saclay](https://www.universite-paris-saclay.fr/en), France. In 2020 he was a postdoctoral researcher with the Department of Statistics, University of California, Berkeley. He is currently an assistant professor at [Huazhong University of Science and Technology](http://english.hust.edu.cn/), China. His research interests are broadly in machine learning, signal processing, random matrix theory, and high-dimensional statistics. He co-authored the book "[Random Matrix Methods for Machine Learning](https://www.cambridge.org/cn/academic/subjects/computer-science/pattern-recognition-and-machine-learning/random-matrix-methods-machine-learning?format=HB&isbn=9781009123235)."

Mihai Nica (University of Guelph)
Elliot Paquette (McGill University)
Andrew Saxe (University College London)
Rene Vidal (University of Pennsylvania)
