On the Spectral Bias of Neural Networks
Nasim Rahaman · Aristide Baratin · Devansh Arpit · Felix Draxler · Min Lin · Fred Hamprecht · Yoshua Bengio · Aaron Courville

Thu Jun 13 06:30 PM -- 09:00 PM (PDT) @ Pacific Ballroom #72

Neural networks are known to be a class of highly expressive functions able to fit even random input-output mappings with 100% accuracy. In this work we present properties of neural networks that complement this aspect of expressivity. By using tools from Fourier analysis, we highlight a learning bias of deep networks towards low frequency functions -- i.e. functions that vary globally without local fluctuations -- which manifests itself as a frequency-dependent learning speed. Intuitively, this property is in line with the observation that over-parameterized networks prioritize learning simple patterns that generalize across data samples. We also investigate the role of the shape of the data manifold by presenting empirical and theoretical evidence that, somewhat counter-intuitively, learning higher frequencies gets easier with increasing manifold complexity.
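The frequency-dependent learning speed described above can be illustrated with a minimal sketch (not the paper's experimental setup; the network width, initialization scales, learning rate, and step count below are illustrative assumptions): a tiny one-hidden-layer ReLU network in NumPy is trained by full-batch gradient descent on a target mixing a low (k = 1) and a high (k = 10) frequency sine, and the DFT of its prediction shows how much of each frequency has been learned so far.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: superposition of a low (k = 1) and a high (k = 10) frequency sine.
N = 256
x = np.linspace(0.0, 1.0, N, endpoint=False).reshape(-1, 1)
y = np.sin(2 * np.pi * x) + np.sin(2 * np.pi * 10 * x)

# Tiny one-hidden-layer ReLU network; all hyperparameters are illustrative.
H = 256
W1 = rng.normal(0.0, 1.0, (1, H))
b1 = rng.uniform(-1.0, 1.0, H)
W2 = rng.normal(0.0, 0.01, (H, 1))  # small output init: prediction starts near 0
b2 = np.zeros(1)

lr = 0.01
for _ in range(2000):
    h = np.maximum(x @ W1 + b1, 0.0)   # hidden ReLU activations
    pred = h @ W2 + b2
    err = pred - y                     # gradient of 0.5 * MSE w.r.t. pred
    dh = (err @ W2.T) * (h > 0.0)      # backprop through the ReLU
    W2 -= lr * (h.T @ err) / N
    b2 -= lr * err.mean(axis=0)
    W1 -= lr * (x.T @ dh) / N
    b1 -= lr * dh.mean(axis=0)

# A unit-amplitude sine at integer frequency k contributes |rfft| = N / 2 at
# bin k on this grid, so the normalized spectrum reads as "fraction recovered".
pred = np.maximum(x @ W1 + b1, 0.0) @ W2 + b2
spec = np.abs(np.fft.rfft(pred[:, 0])) / (N / 2)
recovered_low, recovered_high = spec[1], spec[10]
print(f"k=1 amplitude recovered:  {recovered_low:.2f}")
print(f"k=10 amplitude recovered: {recovered_high:.2f}")
```

Under these settings, the k = 1 component is fit long before the k = 10 component, mirroring the frequency-dependent learning speed the abstract describes.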

Author Information

Nasim Rahaman (University of Heidelberg)
Aristide Baratin (MILA)
Devansh Arpit (Montréal Institute for Learning Algorithms, Canada)
Felix Draxler (Heidelberg University)
Min Lin (University of Montreal)
Fred Hamprecht (Heidelberg Collaboratory for Image Processing)
Yoshua Bengio (Mila / U. Montreal)

Yoshua Bengio is recognized as one of the world’s leading experts in artificial intelligence and a pioneer in deep learning. Since 1993, he has been a professor in the Department of Computer Science and Operational Research at the Université de Montréal. He is the founder and scientific director of Mila, the Quebec Institute of Artificial Intelligence, the world’s largest university-based research group in deep learning. He is a member of the NeurIPS board, co-founder and general chair of the ICLR conference, program director of the CIFAR program on Learning in Machines and Brains, and a Fellow of CIFAR. In 2018, he ranked as the computer scientist with the most new citations worldwide. In 2019, he received the ACM A.M. Turing Award, “the Nobel Prize of Computing”, jointly with Geoffrey Hinton and Yann LeCun for conceptual and engineering breakthroughs that have made deep neural networks a critical component of computing. In 2020, he was named a Fellow of the Royal Society of London.

Aaron Courville (University of Montreal)
