Title
Direct parallel perceptrons (DPPs): fast analytical calculation of the parallel perceptrons weights with margin control for classification tasks.
Abstract
Parallel perceptrons (PPs) are very simple and efficient committee machines (a single layer of perceptrons with threshold activation functions and binary outputs, and a majority voting decision scheme), which nevertheless behave as universal approximators. The parallel delta (P-Delta) rule is an effective training algorithm which, following the ideas of statistical learning theory used by the support vector machine (SVM), improves generalization ability by maximizing the difference between the perceptron activations for the training patterns and the activation threshold (which corresponds to the separating hyperplane). In this paper, we propose an analytical closed-form expression to calculate the PPs' weights for classification tasks. Our method, called Direct Parallel Perceptrons (DPPs), directly calculates (without iterations) the weights using the training patterns and their desired outputs, without any search or numeric function optimization. The calculated weights globally minimize an error function that simultaneously takes into account the training error and the classification margin. Given their analytical and noniterative nature, DPPs are computationally much more efficient than related approaches (P-Delta and SVM), and their computational complexity is linear in the input dimensionality. Therefore, DPPs are very appealing in terms of time complexity and memory consumption, and are very easy to use for high-dimensional classification tasks. On real benchmark datasets with two and multiple classes, DPPs are competitive with SVM and other approaches, while also allowing online learning and, unlike most of them, requiring no tunable parameters.
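The abstract describes the PP architecture as a single layer of threshold perceptrons combined by majority voting. A minimal sketch of that voting scheme is shown below; the weight values are purely illustrative placeholders, not weights produced by the paper's DPP closed-form expression (which is not reproduced in this record).

```python
def perceptron_output(w, x):
    """Threshold activation: +1 if w . x >= 0, else -1."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s >= 0 else -1

def parallel_perceptron_predict(weights, x):
    """Majority vote over the committee of threshold perceptrons."""
    votes = sum(perceptron_output(w, x) for w in weights)
    return 1 if votes >= 0 else -1

# Illustrative committee of 3 perceptrons in 2-D; the last input
# component is fixed at 1.0 so the last weight acts as a bias.
weights = [[1.0, -1.0, 0.0], [0.5, 0.5, -0.2], [-1.0, 2.0, 0.1]]
x = [1.0, 2.0, 1.0]
print(parallel_perceptron_predict(weights, x))  # votes: -1, +1, +1 -> +1
```

In the paper's setting, training (whether by P-Delta or by the DPP analytical formula) only determines the `weights`; the prediction rule above stays fixed.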
Year: 2011
DOI: 10.1109/TNN.2011.2169086
Venue: IEEE Transactions on Neural Networks
Keywords: classification tasks, training error, effective training algorithm, efficient committee machine, training pattern, computational complexity, fast analytical calculation, direct parallel perceptrons, parallel perceptrons weights, activation threshold, classification margin, margin control, high-dimensional classification task, analytical closed-form expression, closed form solution, activation function, statistical analysis, linear approximation, support vector machine, perceptrons, majority voting, learning artificial intelligence, support vector machines, kernel, accuracy, time complexity, approximation theory
Field: Statistical learning theory, Kernel (linear algebra), Error function, Pattern recognition, Computer science, Support vector machine, Artificial intelligence, Hyperplane, Time complexity, Perceptron, Machine learning, Computational complexity theory
DocType: Journal
Volume: 22
Issue: 11
ISSN: 1941-0093
Citations: 8
PageRank: 0.81
References: 24
Authors: 4

Name                      Order  Citations  PageRank
Manuel Fernandez-Delgado  1      19         1.47
Jorge Ribeiro             2      48         12.98
Eva Cernadas              3      397        20.30
Senén Barro               4      620        48.65