Title
An approximate approach for training polynomial kernel SVMs in linear time
Abstract
Kernel methods such as support vector machines (SVMs) have attracted a great deal of attention in the machine learning and natural language processing (NLP) communities. Polynomial kernel SVMs have shown very competitive accuracy on many NLP problems, such as part-of-speech tagging and chunking. However, these methods are usually too inefficient for large datasets and real-time applications. In this paper, we propose a method that approximates the polynomial kernel using efficient data mining techniques. To avoid a testing-time complexity that grows exponentially with the polynomial degree d, we also present a new method for speeding up SVM classification that is independent of d. Experimental results show that our method is 16.94 times faster in training and 450 times faster in testing than the traditional polynomial kernel.
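The key observation behind approaches like this is that a low-degree polynomial kernel has an explicit, finite feature map, so a linear SVM trained on the mapped features reproduces the kernel's decision function while keeping linear-time training. Below is a minimal sketch of this idea for degree 2 (an illustration of the general technique, not the authors' implementation; the function name `poly2_features` is chosen here for clarity):

```python
import numpy as np

def poly2_features(x):
    """Explicit feature map phi such that phi(x) . phi(z) == (x . z + 1)**2."""
    n = len(x)
    feats = [1.0]                         # constant term
    feats += list(np.sqrt(2.0) * x)       # linear terms, weighted sqrt(2)
    feats += list(x * x)                  # squared terms
    for i in range(n):                    # pairwise products, weighted sqrt(2)
        for j in range(i + 1, n):
            feats.append(np.sqrt(2.0) * x[i] * x[j])
    return np.array(feats)

x = np.array([1.0, 2.0, 3.0])
z = np.array([0.5, -1.0, 2.0])

k_kernel = (x @ z + 1.0) ** 2             # degree-2 polynomial kernel
k_explicit = poly2_features(x) @ poly2_features(z)
# The two values agree, so a linear model over poly2_features(x)
# is equivalent to a degree-2 polynomial kernel machine.
```

The mapped dimensionality grows as O(n^d), which is why degree-independent tricks for the classification step, like the one claimed in the abstract, matter in practice.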
Year
2007
Venue
ACL
Keywords
polynomial degree, training polynomial kernel svms, approximate method, nlp problem, real time purpose, approximate approach, polynomial kernel svms, linear time, new method, kernel method, polynomial kernel, exponential-scaled testing time complexity, traditional polynomial kernel, time complexity, natural language processing, data mining, real time, machine learning, support vector
Field
Radial basis function kernel, Least squares support vector machine, Pattern recognition, Computer science, Kernel embedding of distributions, Support vector machine, Tree kernel, Polynomial kernel, Artificial intelligence, String kernel, Kernel method, Machine learning
DocType
Conference
Volume
P07-2
Citations
10
PageRank
0.80
References
8
Authors
3

Name           Order  Citations  PageRank
Yu-Chieh Wu    1      247        23.16
Jie-Chi Yang   2      350        43.91
Yue-Shi Lee    3      543        41.14