

Poster in Workshop: Dynamic Neural Networks

Connectivity Properties of Neural Networks Under Performance-Resources Trade-off

Aleksandra I. Nowak · Romuald A. Janik


Abstract:

We analyze the structure of network architectures obtained when trained under a performance-resources trade-off for various datasets. To this end, we use a flexible setup allowing a neural network to learn both its size and topology during the course of standard gradient-based training. The resulting network has the structure of a graph tailored to the particular learning task and dataset. We explore the properties of the resulting network architectures for a number of datasets of varying difficulty, observing systematic regularities. The obtained graphs can therefore be understood as encoding nontrivial characteristics of the particular classification tasks.
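The abstract does not include code, but a minimal sketch of the kind of setup it describes, jointly learning weights and topology under a performance-resources trade-off, might look like the following. This is an illustrative assumption, not the authors' actual method; all names here (GatedLinear, lambda_resource) are hypothetical. Each connection carries a learnable gate, and the loss adds a differentiable penalty on the total gate mass, so the optimizer can prune connections during ordinary gradient-based training.

```python
# Illustrative sketch only (not the paper's implementation): a linear layer with
# per-weight learnable gates, trained under loss = task_loss + lambda * resource_cost.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedLinear(nn.Module):
    """Linear layer whose per-connection gates are learned jointly with the weights."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # One gate logit per weight; sigmoid(gate) acts as a soft keep/drop decision.
        self.gate_logits = nn.Parameter(torch.zeros(out_features, in_features))

    def forward(self, x):
        gates = torch.sigmoid(self.gate_logits)
        return F.linear(x, self.linear.weight * gates, self.linear.bias)

    def resource_cost(self):
        # Differentiable proxy for the number of active connections.
        return torch.sigmoid(self.gate_logits).sum()

model = nn.Sequential(GatedLinear(784, 256), nn.ReLU(), GatedLinear(256, 10))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
lambda_resource = 1e-4  # strength of the performance-resources trade-off (assumed value)

def training_step(x, y):
    logits = model(x)
    task_loss = F.cross_entropy(logits, y)
    cost = sum(m.resource_cost() for m in model if isinstance(m, GatedLinear))
    loss = task_loss + lambda_resource * cost
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In such a setup, the surviving gated connections after training define a graph whose structure depends on the dataset, which is the kind of object whose connectivity properties the poster analyzes.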
