

Invited Talk

Towards a Mathematical Theory of Machine Learning

Weinan E

Hall F

Abstract:

Given a machine learning model, what is the class of functions that this particular model can approximate efficiently, in the sense that the convergence rates for the approximation, estimation, and optimization errors do not deteriorate as the dimensionality goes up? We address this question for three classes of machine learning models: the random feature model, two-layer neural networks, and residual neural networks. Along the way, we also summarize the current status of the theoretical foundations of deep learning and discuss some of the key open questions.
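For orientation, the three model classes named above are commonly written as follows; this is a standard formulation (with the mean-field-style 1/m and 1/L scaling often used in this line of work), not necessarily the speaker's exact notation. Here $\sigma$ denotes a nonlinear activation and $x \in \mathbb{R}^d$ the input.

Random feature model (inner weights $w_j$ sampled once and frozen; only the $a_j$ are trained):
\[
  f_m(x; a) \;=\; \frac{1}{m} \sum_{j=1}^{m} a_j \, \sigma(\langle w_j, x \rangle), \qquad w_j \sim \pi \ \text{fixed}.
\]

Two-layer neural network (same form, but both $a_j$ and $w_j$ are trained):
\[
  f_m(x; \theta) \;=\; \frac{1}{m} \sum_{j=1}^{m} a_j \, \sigma(\langle w_j, x \rangle), \qquad \theta = \{(a_j, w_j)\}_{j=1}^{m}.
\]

Residual neural network (a composition of $L$ residual blocks):
\[
  z_0 = V x, \qquad z_{l+1} = z_l + \frac{1}{L}\, U_l\, \sigma(W_l z_l), \qquad f(x) = \alpha^{\top} z_L .
\]

The question in the abstract is, for each of these models, to identify the natural function space for which the approximation, estimation, and optimization error rates are free of the curse of dimensionality.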
