Poster
in
Workshop: Duality Principles for Modern Machine Learning

A max-affine spline approximation of neural networks using the Legendre transform of a convex-concave representation

Adam Perrett · Danny Wood · Gavin Brown

Keywords: [ inspection ] [ convex ] [ concave ] [ features ] [ legendre ] [ mnist ] [ spline ] [ duality ] [ Neural Networks ]


Abstract:

This work presents a novel algorithm for transforming a neural network into a spline representation. Unlike previous work, which required convex and piecewise-affine network operators to construct a max-affine spline alternative form, this approach relaxes that constraint: the function need only be bounded and possess a well-defined second derivative, and experiments suggest even this is not strictly necessary. The transformation can also be applied to the whole network rather than to each layer independently. As in previous work, this bridges the gap between neural networks and approximation theory, and it additionally enables the visualisation of network feature maps. A mathematical proof and an experimental investigation of the technique are presented, with approximation errors and feature maps extracted from a range of architectures, including convolutional neural networks.
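To make the convex-concave idea concrete, the following is a minimal 1D sketch (not the authors' algorithm): a smooth function is split into a convex part and a concave part, and each part is replaced by the spline formed from its tangent lines, which are exactly the supporting hyperplanes arising in the Legendre (biconjugate) representation. The max over the convex part's tangents plus the min over the concave part's tangents gives a max-affine-style approximation of the original function. The split `sin(x) = (sin(x) + c*x**2) + (-c*x**2)` and the knot count are illustrative choices, not taken from the paper.

```python
import numpy as np

def tangent(f, df, x0):
    """Return (a, b) so that a*x + b is tangent to f at x0."""
    a = df(x0)
    b = f(x0) - a * x0
    return a, b

# Target: f(x) = sin(x) on [0, pi]. With c = 0.5, g'' = 1 - sin(x) >= 0,
# so g is convex and h = -c*x^2 is concave (an assumed, illustrative split).
c = 0.5
f  = np.sin
g  = lambda x: np.sin(x) + c * x**2
dg = lambda x: np.cos(x) + 2 * c * x
h  = lambda x: -c * x**2
dh = lambda x: -2 * c * x

knots = np.linspace(0.0, np.pi, 8)           # tangency points
G = [tangent(g, dg, x0) for x0 in knots]     # supporting lines of the convex part
H = [tangent(h, dh, x0) for x0 in knots]     # supporting lines of the concave part

def spline(x):
    # Convex part: tangents lie below g, so their max recovers g from below.
    gmax = np.max([a * x + b for a, b in G], axis=0)
    # Concave part: tangents lie above h, so their min recovers h from above.
    hmin = np.min([a * x + b for a, b in H], axis=0)
    return gmax + hmin

xs = np.linspace(0.0, np.pi, 200)
err = np.max(np.abs(spline(xs) - f(xs)))
print(f"max abs error with 8 knots: {err:.4f}")
```

The approximation is exact at every knot (both tangent splines touch their parts there) and its error between knots shrinks quadratically with the knot spacing, which is the sense in which denser sampling of the dual (slope) space tightens the spline approximation.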