
Improved Expressivity Through Dendritic Neural Networks

2020-02-18

Abstract 

A typical biological neuron, such as a pyramidal neuron of the neocortex, receives thousands of afferent synaptic inputs on its dendrite tree and sends the efferent axonal output downstream. In typical artificial neural networks, dendrite trees are modeled as linear structures that funnel weighted synaptic inputs to the cell bodies. However, numerous experimental and theoretical studies have shown that dendritic arbors are far more than simple linear accumulators. That is, synaptic inputs can actively modulate their neighboring synaptic activities; therefore, the dendritic structures are highly nonlinear. In this study, we model such local nonlinearity of dendritic trees with our dendritic neural network (DENN) structure and apply this structure to typical machine learning tasks. Equipped with localized nonlinearities, DENNs can attain greater model expressivity than regular neural networks while maintaining efficient network inference. Such strength is evidenced by the increased fitting power when we train DENNs with supervised machine learning tasks. We also empirically show that the locality structure of DENNs can improve the generalization performance, as exemplified by DENNs outranking naive deep neural network architectures when tested on classification tasks from the UCI machine learning repository.
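The abstract contrasts dendrites as local nonlinear units with the single weighted sum used in standard layers, but it does not spell out the DENN equations. The PyTorch sketch below is a rough illustration of that idea only: each unit's inputs are split into dendritic branches, every branch computes its own weighted sum and passes it through a local nonlinearity, and the branch outputs are summed at the soma. The `DendriticLayer` name, the even split into `n_branches`, and the `tanh` branch nonlinearity are illustrative assumptions, not the paper's formulation.

```python
# Hypothetical sketch of a branch-local nonlinear layer (not the paper's exact DENN).
import torch
import torch.nn as nn


class DendriticLayer(nn.Module):
    """Splits each unit's inputs into dendritic branches.

    Each branch takes its own weighted sum and applies a local nonlinearity;
    the branch outputs are then summed at the cell body, unlike a standard
    linear layer that sums all inputs directly.
    """

    def __init__(self, in_features: int, out_features: int, n_branches: int = 4):
        super().__init__()
        assert in_features % n_branches == 0, "inputs must split evenly across branches"
        self.n_branches = n_branches
        self.branch_size = in_features // n_branches
        # One weight block per (output unit, branch).
        self.weight = nn.Parameter(
            torch.randn(out_features, n_branches, self.branch_size) * 0.01
        )
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features) -> (batch, n_branches, branch_size)
        x = x.view(x.shape[0], self.n_branches, self.branch_size)
        # Per-branch weighted sums: (batch, out_features, n_branches).
        branch_sums = torch.einsum("bns,ons->bon", x, self.weight)
        # Local nonlinearity on each branch, then summation at the soma.
        return torch.tanh(branch_sums).sum(dim=-1) + self.bias


if __name__ == "__main__":
    layer = DendriticLayer(in_features=16, out_features=8, n_branches=4)
    out = layer(torch.randn(32, 16))
    print(out.shape)  # torch.Size([32, 8])
```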

