Title
Feature Combiners With Gate-Generated Weights for Classification
Abstract
Using functional weights in a conventional linear combination architecture is a way of obtaining expressive power and represents an alternative to classical trainable and implicit nonlinear transformations. In this brief, we explore this way of constructing binary classifiers, taking advantage of the possibility of generating the functional weights by means of a gate with fixed radial basis functions. This particular form of the gate permits training the machine directly with maximal-margin algorithms. We call the resulting scheme "feature combiners with gate-generated weights for classification." Experimental results show that these architectures outperform support vector machines (SVMs) and Real AdaBoost ensembles on most of the benchmark problems considered. The price of this advantage is an increase in design effort due to the computational demands of cross-validation. Nevertheless, the operational effort is usually lower than that needed by SVMs.
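The key idea in the abstract can be sketched as follows. The combiner output is f(x) = sum_k w_k(x) x_k + b, where each functional weight w_k(x) is produced by a gate of fixed radial basis functions: w_k(x) = sum_j beta_jk phi_j(x). Because the gate is fixed, f is linear in the trainable coefficients beta_jk, so maximal-margin training reduces to fitting a linear large-margin model on the products phi_j(x) x_k. The sketch below is an illustrative reconstruction under these assumptions, not the paper's exact formulation; the center placement, gamma value, and the plain hinge-loss subgradient solver are all choices made here for the demo.

```python
import numpy as np

def rbf_gate(X, centers, gamma):
    """Fixed RBF gate: phi_j(x) = exp(-gamma * ||x - c_j||^2)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def gate_expand(X, centers, gamma):
    """Products phi_j(x) * x_k.  The combiner output
    f(x) = sum_k w_k(x) x_k, with w_k(x) = sum_j beta_jk phi_j(x),
    is linear in the beta_jk, so a maximal-margin linear model on
    these products trains the machine directly."""
    Phi = rbf_gate(X, centers, gamma)                         # (n, J)
    return (Phi[:, :, None] * X[:, None, :]).reshape(len(X), -1)

def train_hinge(Z, y, lam=1e-3, lr=0.1, epochs=500):
    """Subgradient descent on the L2-regularized hinge loss
    (a plain stand-in for any maximal-margin solver)."""
    w, b, n = np.zeros(Z.shape[1]), 0.0, len(y)
    for _ in range(epochs):
        active = y * (Z @ w + b) < 1                          # margin violators
        w -= lr * (lam * w - (y[active][:, None] * Z[active]).sum(0) / n)
        b -= lr * (-y[active].sum() / n)
    return w, b

# Toy demo: XOR-like labels, not linearly separable in the raw inputs,
# but separable after the gate expansion.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(400, 2))
X = X[(np.abs(X[:, 0]) > 0.2) & (np.abs(X[:, 1]) > 0.2)]     # drop ambiguous points
y = np.sign(X[:, 0] * X[:, 1])

centers = np.array([[0.5, 0.5], [0.5, -0.5], [-0.5, 0.5], [-0.5, -0.5]])
Z = gate_expand(X, centers, gamma=2.0)
w, b = train_hinge(Z, y)
acc = (np.sign(Z @ w + b) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

With four gate centers and two inputs the expanded space has only eight coefficients, yet the input-dependent weights let a linear large-margin solver separate the XOR pattern, which is the source of the extra expressive power the abstract refers to.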
Year
2013
DOI
10.1109/TNNLS.2012.2223232
Venue
IEEE Transactions on Neural Networks and Learning Systems
Keywords
learning (artificial intelligence), pattern classification, radial basis function networks, support vector machines, SVM, binary classifiers, classification, conventional linear combination architecture, feature combiners, functional weights, gate-generated weights, implicit nonlinear transformations, maximal margin algorithms, radial basis functions, Real AdaBoost ensembles, gate fusion, maximal margin, neural networks
Field
Linear combination, Logic gate, Algorithm design, Radial basis function, AdaBoost, Pattern recognition, Computer science, Support vector machine, Artificial intelligence, Artificial neural network, Machine learning, Binary number
DocType
Journal
Volume
24
Issue
1
ISSN
2162-237X
Citations
10
PageRank
0.52
References
20
Authors
2
Name | Order | Citations | PageRank
Adil Omari | 1 | 10 | 0.52
Figueiras-Vidal, A.R. | 2 | 2954 | 0.59