Ojha, V. (ORCID: https://orcid.org/0000-0002-9256-1192) and Nicosia, G. (2022) Backpropagation neural tree. Neural Networks, 149. ISSN 0893-6080. doi: 10.1016/j.neunet.2022.02.003
Abstract/Summary
We propose a novel algorithm called Backpropagation Neural Tree (BNeuralT), which is a stochastic computational dendritic tree. BNeuralT takes random repeated inputs through its leaves and imposes dendritic nonlinearities through its internal connections, as a biological dendritic tree would. Given these biologically plausible dendritic-tree properties, BNeuralT is a single-neuron neural tree model whose internal sub-trees resemble dendritic nonlinearities. The BNeuralT algorithm produces an ad hoc neural tree that is trained with a stochastic gradient descent optimizer such as gradient descent (GD), momentum GD, Nesterov accelerated GD, Adagrad, RMSprop, or Adam. BNeuralT training has two phases, each computed in a depth-first manner: the forward pass computes the neural tree's output in a post-order traversal, while the backward pass performs error backpropagation recursively in a pre-order traversal. A BNeuralT model can be regarded as a minimal subset of a neural network (NN): a "thinned" NN whose complexity is lower than that of an ordinary NN. Our algorithm produces high-performing, parsimonious models that balance complexity with descriptive ability on a wide variety of machine learning problems: classification, regression, and pattern recognition.
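The traversal scheme described in the abstract lends itself to a short illustration. The sketch below, in Python, assumes a simple node class with a tanh nonlinearity and a plain GD update; the class, its names, and the toy regression target are illustrative assumptions drawn only from the abstract, not the authors' implementation.

```python
# Minimal sketch of a neural-tree forward/backward pass, based only on the
# abstract's description (post-order forward, pre-order backward, GD update).
# Node structure, tanh activation, and all names here are assumptions.
import math
import random

class TreeNode:
    def __init__(self, children=None, leaf_index=None):
        self.children = children or []      # internal node: sub-trees
        self.leaf_index = leaf_index        # leaf: index into the input vector
        self.weights = [random.uniform(-1, 1) for _ in self.children]
        self.output = 0.0                   # activation cached by the forward pass

    def forward(self, x):
        """Post-order traversal: evaluate children first, then this node."""
        if self.leaf_index is not None:     # leaf reads a (possibly repeated) input
            self.output = x[self.leaf_index]
        else:
            s = sum(w * c.forward(x) for w, c in zip(self.weights, self.children))
            self.output = math.tanh(s)      # dendritic nonlinearity (assumed tanh)
        return self.output

    def backward(self, grad, lr):
        """Pre-order traversal: update this node's weights, then recurse."""
        if self.leaf_index is not None:
            return
        local = grad * (1.0 - self.output ** 2)        # d tanh(s)/ds
        for i, c in enumerate(self.children):
            child_grad = local * self.weights[i]       # gradient sent to the sub-tree
            self.weights[i] -= lr * local * c.output   # plain GD step on this edge
            c.backward(child_grad, lr)

# Toy usage: fit y = 0.5*(x0 + x1) with a one-level tree.
root = TreeNode(children=[TreeNode(leaf_index=0), TreeNode(leaf_index=1)])
for _ in range(2000):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    y = 0.5 * (x[0] + x[1])
    pred = root.forward(x)
    root.backward(pred - y, lr=0.05)  # gradient of 0.5*(pred - y)**2 w.r.t. pred
```

The pre-order backward pass mirrors the abstract's description: each node consumes the gradient arriving from its parent before passing gradients down to its sub-trees, so a single recursive visit per node suffices.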
Item Type | Article |
URI | https://reading-clone.eprints-hosting.org/id/eprint/102926 |
Refereed | Yes |
Divisions | Interdisciplinary Research Centres (IDRCs) > Centre for the Mathematics of Planet Earth (CMPE); Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science |
Publisher | Elsevier |