

Poster

Minimum Width of Leaky-ReLU Neural Networks for Uniform Universal Approximation

Li'ang Li · Yifei duan · Guanghua Ji · Yongqiang Cai

Exhibit Hall 1 #436
[ Slides ] [ PDF ] [ Poster ]

Abstract: The study of universal approximation properties (UAP) for neural networks (NN) has a long history. When the network width is unlimited, a single hidden layer is sufficient for UAP. In contrast, when the depth is unlimited, the width for UAP needs to be no less than the critical width $w_{\min}=\max(d_x,d_y)$, where $d_x$ and $d_y$ are the dimensions of the input and output, respectively. Recently, (Cai, 2022) showed that a leaky-ReLU NN with this critical width can achieve UAP for $L^p$ functions on a compact domain $K$, *i.e.,* the UAP for $L^p(K,\mathbb{R}^{d_y})$. This paper examines the uniform UAP for the function class $C(K,\mathbb{R}^{d_y})$ and gives the exact minimum width of the leaky-ReLU NN as $w_{\min}=\max(d_x+1,d_y)+\mathbf{1}_{d_y=d_x+1}$, where $\mathbf{1}_{d_y=d_x+1}$ is the indicator (equal to $1$ when $d_y=d_x+1$ and $0$ otherwise); this exposes the effect of the output dimension. To obtain this result, we propose a novel lift-flow-discretization approach, which shows that the uniform UAP has a deep connection with topological theory.
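The two width thresholds in the abstract can be compared with a minimal sketch; the function names below are illustrative, not from the paper:

```python
def min_width_uniform_uap(d_x: int, d_y: int) -> int:
    """Minimum leaky-ReLU width for uniform UAP on C(K, R^{d_y}),
    per the abstract: w_min = max(d_x + 1, d_y) + 1_{d_y = d_x + 1}."""
    return max(d_x + 1, d_y) + (1 if d_y == d_x + 1 else 0)

def min_width_lp_uap(d_x: int, d_y: int) -> int:
    """Critical width for L^p UAP (Cai, 2022): w_min = max(d_x, d_y)."""
    return max(d_x, d_y)

# Scalar-to-scalar functions (d_x = d_y = 1): uniform UAP needs width 2,
# while L^p UAP already holds at width 1.
print(min_width_uniform_uap(1, 1))  # 2
print(min_width_lp_uap(1, 1))       # 1

# The case d_y = d_x + 1 incurs the extra +1 from the indicator term.
print(min_width_uniform_uap(2, 3))  # 4
```

Note how the uniform (sup-norm) requirement is strictly stronger than the $L^p$ one: the gap between the two formulas is exactly the output-dimension effect the paper pins down.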
