Abstract
---
We propose a novel algorithm called Backpropagation Neural Tree (BNeuralT), which is a stochastic computational dendritic tree. BNeuralT takes random repeated inputs through its leaves and imposes dendritic nonlinearities through its internal connections, much as a biological dendritic tree does. Given these biologically plausible dendritic-tree properties, BNeuralT is a single-neuron neural tree model whose internal sub-trees resemble dendritic nonlinearities. The BNeuralT algorithm produces an ad hoc neural tree that is trained with a stochastic gradient descent optimizer such as gradient descent (GD), momentum GD, Nesterov accelerated GD, Adagrad, RMSprop, or Adam. BNeuralT training has two phases, each computed in a depth-first manner: the forward pass computes the neural tree's output in a post-order traversal, while the backward pass backpropagates the error recursively in a pre-order traversal. A BNeuralT model can be considered a minimal subset of a neural network (NN): a "thinned" NN whose complexity is lower than that of an ordinary NN. Our algorithm produces high-performing and parsimonious models, balancing complexity with descriptive ability, on a wide variety of machine learning problems: classification, regression, and pattern recognition.
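The abstract's description of the two training phases can be made concrete with a short sketch. The following is a minimal, illustrative Python example, not the authors' reference implementation: it assumes sigmoid activations at internal nodes, a squared-error loss, and plain gradient descent, with leaves that read (possibly repeated) input features. All class and variable names are hypothetical.

```python
# Minimal sketch of a backpropagation neural tree: leaves read input features,
# internal nodes apply a weighted sum and a sigmoid nonlinearity.
# Forward pass = post-order traversal; backward pass = pre-order traversal.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class Leaf:
    def __init__(self, feature_index):
        self.feature_index = feature_index   # random, possibly repeated input

    def forward(self, x):
        self.output = x[self.feature_index]
        return self.output

    def backward(self, delta, lr):
        pass                                 # leaves hold no trainable weights

class Node:
    def __init__(self, children):
        self.children = children
        self.weights = [random.uniform(-1, 1) for _ in children]
        self.bias = random.uniform(-1, 1)

    def forward(self, x):
        # Post-order: evaluate children first, then this node.
        z = self.bias + sum(w * c.forward(x)
                            for w, c in zip(self.weights, self.children))
        self.output = sigmoid(z)
        return self.output

    def backward(self, delta, lr):
        # Pre-order: update this node first, then recurse into its children.
        local = delta * self.output * (1.0 - self.output)  # sigmoid derivative
        for i, c in enumerate(self.children):
            child_delta = local * self.weights[i]          # uses pre-update weight
            self.weights[i] -= lr * local * c.output       # plain GD update
            c.backward(child_delta, lr)
        self.bias -= lr * local

# Toy usage: a small random tree over 3 input features, fit to one target.
random.seed(0)
root = Node([Leaf(0), Node([Leaf(1), Leaf(2)]), Leaf(1)])
x, y = [0.2, 0.7, 0.5], 1.0
for _ in range(100):
    y_hat = root.forward(x)
    root.backward(y_hat - y, lr=0.5)   # gradient of 0.5*(y_hat - y)^2
print(round(root.forward(x), 3))
```

Swapping the plain GD update for momentum, Nesterov, Adagrad, RMSprop, or Adam would only change the weight-update lines; the post-order/pre-order traversal structure stays the same.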
Field | Value
---|---
Year | 2022
DOI | 10.1016/j.neunet.2022.02.003
Venue | Neural Networks
Keywords | Stochastic gradient descent, RMSprop, Backpropagation, Minimal architecture, Neural networks, Neural trees
DocType | Journal
Volume | 149
Issue | 1
ISSN | 0893-6080
Citations | 0
PageRank | 0.34
References | 0
Authors | 2
Name | Order | Citations | PageRank |
---|---|---|---
Varun Kumar Ojha | 1 | 32 | 9.25 |
Giuseppe Nicosia | 2 | 0 | 1.69 |