Poster in Workshop: Dynamic Neural Networks

Parameter efficient dendritic-tree neurons outperform perceptrons

Ziwen Han · Evgeniya Gorobets · Pan Chen


Abstract:

Biological neurons are more powerful than artificial perceptrons, in part due to complex dendritic input computations. Motivated to empower the perceptron with biologically inspired features, we explore the effect of adding and tuning input branching factors along with input dropout. This allows parameter-efficient non-linear input architectures to be discovered and benchmarked. Furthermore, we developed an encapsulated PyTorch module to tune and replace multi-layer perceptron layers in existing architectures. Our initial experiments on MNIST classification demonstrate the accuracy and generalization improvements of artificial neurons with dendritic features compared to existing perceptron architectures.
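A minimal sketch of what such a dendritic-tree layer could look like as a drop-in PyTorch module, assuming inputs are grouped into branches of a fixed branching factor, each branch applies a learned nonlinear summary, and a soma unit sums the branch outputs. The class name, parameter names, and branch nonlinearity below are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class DendriticLayer(nn.Module):
    """Hypothetical dendritic-tree layer: inputs split into branches of size
    `branching_factor`; each branch computes a nonlinear activation, and the
    soma sums the branches. Uses out_features * in_features weights like a
    dense layer, but computation is structured per branch."""

    def __init__(self, in_features, out_features, branching_factor=4, input_dropout=0.1):
        super().__init__()
        assert in_features % branching_factor == 0, "inputs must split evenly into branches"
        self.branching_factor = branching_factor
        self.num_branches = in_features // branching_factor
        self.dropout = nn.Dropout(p=input_dropout)  # input dropout, as in the abstract
        # One weight per (output neuron, branch, input within branch).
        self.branch_weight = nn.Parameter(
            torch.randn(out_features, self.num_branches, branching_factor) * 0.1)
        self.branch_bias = nn.Parameter(torch.zeros(out_features, self.num_branches))
        self.soma_bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # x: (batch, in_features) -> (batch, num_branches, branching_factor)
        x = self.dropout(x)
        x = x.view(x.size(0), self.num_branches, self.branching_factor)
        # Per-branch linear map, then a nonlinearity modelling dendritic computation.
        branch = torch.einsum("bnk,onk->bon", x, self.branch_weight) + self.branch_bias
        branch = torch.tanh(branch)
        # Soma: sum branch outputs per output neuron.
        return branch.sum(dim=-1) + self.soma_bias


# Example: replacing a 784 -> 10 perceptron layer for MNIST classification.
layer = DendriticLayer(784, 10, branching_factor=4, input_dropout=0.2)
logits = layer(torch.randn(32, 784))
print(logits.shape)  # torch.Size([32, 10])
```

In this sketch, the branching factor and input dropout rate are the tunable hyperparameters the abstract refers to; varying the branching factor trades off per-branch expressiveness against the number of branches.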
