Title
Learning with Mean-Variance Filtering, SVM and Gradient-Based Optimization
Abstract
We consider several models that employ a gradient-based method as the core optimization tool. Experimental results were obtained in a real-time environment during the WCCI-2006 Performance Prediction Challenge. None of the models proved to be absolutely best across all five datasets. Furthermore, we can exploit the actual differences between the models and create an ensemble system as a complex of the base models, where the balance may be regulated using special parameters or confidence levels. Overfitting is a common problem when the dimension is comparable to the sample size or even higher. Using mean-variance filtering, we can significantly reduce the difference between training and test results by treating some features as noise.
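The abstract describes mean-variance filtering as a way to discard noise-like features and narrow the gap between training and test results. The sketch below is a minimal, hypothetical illustration of that idea, assuming a binary classification setting: it scores each feature by the gap between class-conditional means relative to a pooled standard deviation and keeps only the highest-scoring features. The function name, scoring rule, and top_k threshold are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def mean_variance_filter(X, y, top_k=100):
    """Score each feature by |difference of class means| / pooled std and
    return the indices of the top_k features; low-scoring features are
    treated as noise. Illustrative sketch only, not the paper's procedure."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    pos, neg = X[y == 1], X[y != 1]
    mean_gap = np.abs(pos.mean(axis=0) - neg.mean(axis=0))
    pooled_std = np.sqrt(0.5 * (pos.var(axis=0) + neg.var(axis=0))) + 1e-12
    score = mean_gap / pooled_std
    return np.argsort(score)[::-1][:top_k]

# Select features on the training split only, then reuse the same indices
# on the test split so no test information leaks into the filter.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 500))   # 200 samples, 500 mostly-noise features
y_train = rng.integers(0, 2, size=200)
keep = mean_variance_filter(X_train, y_train, top_k=50)
X_train_filtered = X_train[:, keep]
```

The retained features could then be passed to an SVM or another gradient-trained base model, in the spirit of the pipeline named in the title.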
Year
2006
DOI
10.1109/IJCNN.2006.247013
Venue
2006 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORK PROCEEDINGS, VOLS 1-10
Keywords
confidence level, learning (artificial intelligence), svm, support vector machines, classification, factor analysis, sample size, projection pursuit, gradient, reduction, real time, linear transformation, principal component
Field
Pattern recognition, Computer science, Support vector machine, Filter (signal processing), Exploit, Artificial intelligence, Overfitting, Filtering theory, Performance prediction, Sample size determination, Machine learning
DocType
Conference
ISSN
2161-4393
Citations
4
PageRank
0.60
References
9
Authors
1
Name
Vladimir Nikulin
Order
1
Citations
99
PageRank
17.28