Talk
Neural networks and rational functions
Matus Telgarsky

Mon Aug 07 06:24 PM -- 06:42 PM (PDT) @ C4.8
Neural networks and rational functions efficiently approximate each other. In more detail, it is shown here that for any ReLU network, there exists a rational function of degree $O(\mathrm{polylog}(1/\epsilon))$ which is $\epsilon$-close, and similarly for any rational function there exists a ReLU network of size $O(\mathrm{polylog}(1/\epsilon))$ which is $\epsilon$-close. By contrast, polynomials need degree $\Omega(\mathrm{poly}(1/\epsilon))$ to approximate even a single ReLU. When converting a ReLU network to a rational function as above, the hidden constants depend exponentially on the number of layers, which is shown to be tight; in other words, a compositional representation can be beneficial even for rational functions.
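As a concrete illustration of the rates above, the following sketch (a minimal NumPy experiment, not code from the paper) approximates a single ReLU by a rational function via Newman's classical construction: with $\xi = e^{-1/\sqrt{n}}$ and $p(x) = \prod_{k=0}^{n-1}(x + \xi^k)$, the degree-$n$ rational function $r(x) = x\,(p(x) - p(-x))/(p(x) + p(-x))$ approximates $|x|$ on $[-1,1]$ with uniform error at most $3e^{-\sqrt{n}}$, and the identity $\mathrm{ReLU}(x) = (x + |x|)/2$ converts this into a ReLU approximation. Setting $3e^{-\sqrt{n}} \le \epsilon$ gives degree $O(\log^2(1/\epsilon))$, consistent with the $O(\mathrm{polylog}(1/\epsilon))$ rate stated above, whereas the best degree-$n$ polynomial error for $|x|$ decays only like $1/n$.

import numpy as np

def newman_abs(x, n):
    # Newman's rational approximation of |x| on [-1, 1]:
    # xi = exp(-1/sqrt(n)), p(x) = prod_{k=0}^{n-1} (x + xi^k),
    # r(x) = x * (p(x) - p(-x)) / (p(x) + p(-x)).
    # Uniform error is at most 3 * exp(-sqrt(n)).
    xi = np.exp(-1.0 / np.sqrt(n))
    powers = xi ** np.arange(n)
    p_pos = np.prod(x[:, None] + powers, axis=1)
    p_neg = np.prod(-x[:, None] + powers, axis=1)
    return x * (p_pos - p_neg) / (p_pos + p_neg)

def rational_relu(x, n):
    # ReLU(x) = (x + |x|) / 2, so an eps-approximation of |x|
    # yields an (eps/2)-approximation of ReLU of the same degree.
    return 0.5 * (x + newman_abs(x, n))

x = np.linspace(-1.0, 1.0, 2001)
relu = np.maximum(x, 0.0)
for n in (4, 16, 36, 64):
    rat_err = np.max(np.abs(rational_relu(x, n) - relu))
    # Degree-n Chebyshev least-squares fit: a convenient stand-in
    # for the best (minimax) polynomial of the same degree.
    c = np.polynomial.chebyshev.chebfit(x, relu, n)
    poly_err = np.max(np.abs(np.polynomial.chebyshev.chebval(x, c) - relu))
    print(f"degree {n:2d}: rational {rat_err:.1e}  polynomial {poly_err:.1e}")

On a run of this sketch, the rational error should shrink roughly like $e^{-\sqrt{n}}$ as the degree grows, while the polynomial error shrinks only like $1/n$, matching the separation in the abstract; the exact crossover degree depends on the constants in Newman's construction.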

Author Information

Matus Telgarsky (UIUC)
