Talk
Neural networks and rational functions
Matus Telgarsky
Neural networks and rational functions efficiently approximate each other. In more detail, it is shown here that for any ReLU network, there exists a rational function of degree $O(\mathrm{polylog}(1/\epsilon))$ which is $\epsilon$-close, and similarly for any rational function there exists a ReLU network of size $O(\mathrm{polylog}(1/\epsilon))$ which is $\epsilon$-close. By contrast, polynomials need degree $\Omega(\mathrm{poly}(1/\epsilon))$ to approximate even a single ReLU. When converting a ReLU network to a rational function as above, the hidden constants depend exponentially on the number of layers, which is shown to be tight; in other words, a compositional representation can be beneficial even for rational functions.
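As a concrete illustration of one direction (a minimal numerical sketch, not from the talk; `newman_abs` and `relu_rational` are illustrative names), Newman's classical rational approximation of $|x|$ on $[-1,1]$ achieves error on the order of $e^{-\sqrt{n}}$ at degree $n$, so degree $O(\log^2(1/\epsilon))$ suffices for error $\epsilon$; since $\mathrm{ReLU}(x) = (x + |x|)/2$, the same construction approximates a single ReLU:

```python
import numpy as np

def newman_abs(x, n):
    # Newman's rational approximation of |x| on [-1, 1]:
    #   r(x) = x * (p(x) - p(-x)) / (p(x) + p(-x)),
    # where p(x) = prod_{k=0}^{n-1} (x + xi^k) and xi = exp(-1/sqrt(n)).
    # The sup-norm error decays like exp(-sqrt(n)).
    xi = np.exp(-1.0 / np.sqrt(n))
    nodes = xi ** np.arange(n)
    p = lambda t: np.prod(t[..., None] + nodes, axis=-1)
    return x * (p(x) - p(-x)) / (p(x) + p(-x))

def relu_rational(x, n):
    # ReLU(x) = (x + |x|) / 2, so a degree-n rational approximation of |x|
    # gives one of ReLU with the same error up to a factor of 1/2.
    return 0.5 * (x + newman_abs(x, n))

xs = np.linspace(-1.0, 1.0, 10001)
for n in (4, 16, 64):
    err = np.max(np.abs(relu_rational(xs, n) - np.maximum(xs, 0.0)))
    print(f"degree {n:3d}: sup error ~ {err:.2e}")  # shrinks like exp(-sqrt(n))
```

Running this shows the error dropping rapidly as the degree grows, consistent with the polylogarithmic degree bound; a degree-$n$ polynomial fit, by contrast, only reduces the error polynomially in $n$, matching the stated $\Omega(\mathrm{poly}(1/\epsilon))$ separation.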
Author Information
Matus Telgarsky (UIUC)
Related Events (a corresponding poster, oral, or spotlight)
2017 Poster: Neural networks and rational functions
Wed. Aug 9th 08:30 AM -- 12:00 PM Room Gallery #4