Title
Multiclass multiple kernel learning
Abstract
In many applications it is desirable to learn from several kernels. "Multiple kernel learning" (MKL) allows the practitioner to optimize over linear combinations of kernels. By enforcing sparse coefficients, it also generalizes feature selection to kernel selection. We propose MKL for joint feature maps. This provides a convenient and principled way for MKL with multiclass problems. In addition, we can exploit the joint feature map to learn kernels on output spaces. We show the equivalence of several different primal formulations including different regularizers. We present several optimization methods, and compare a convex quadratically constrained quadratic program (QCQP) and two semi-infinite linear programs (SILPs) on toy data, showing that the SILPs are faster than the QCQP. We then demonstrate the utility of our method by applying the SILP to three real world datasets.
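As a minimal illustration of the core idea in the abstract (a linear combination of base kernels, not the paper's QCQP/SILP optimization), the sketch below forms a convex combination of precomputed kernel matrices. The kernel choices and weights here are arbitrary assumptions for the example.

```python
import numpy as np

def combine_kernels(kernels, beta):
    """Convex combination K = sum_k beta_k * K_k of precomputed Gram matrices.

    In MKL the weights beta would be learned (with a sparsity-inducing
    constraint); here they are fixed by hand for illustration.
    """
    beta = np.asarray(beta, dtype=float)
    assert np.all(beta >= 0) and np.isclose(beta.sum(), 1.0)
    return sum(b * K for b, K in zip(beta, kernels))

# Toy data: two base kernels (linear and Gaussian/RBF) on the same points.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
K_lin = X @ X.T
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K_rbf = np.exp(-sq_dists)

# Combined kernel: a convex combination of PSD matrices is itself PSD,
# so K is a valid kernel matrix.
K = combine_kernels([K_lin, K_rbf], [0.3, 0.7])
```

Because each base kernel matrix is positive semidefinite and the weights are nonnegative, the combined matrix remains a valid kernel; the learning problem in the paper is to choose those weights jointly with the classifier.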
Year
2007
DOI
10.1145/1273496.1273646
Venue
ICML
Keywords
linear combination,different regularizers,joint feature map,multiple kernel learning,generalizes feature selection,multiclass problem,different primal formulation,semi-infinite linear program,kernel selection,multiclass multiple kernel learning,convex quadratically,feature selection,quadratically constrained quadratic program,linear program
Field
Kernel (linear algebra),Linear combination,Feature selection,Quadratically constrained quadratic program,Pattern recognition,Computer science,Multiple kernel learning,Exploit,Regular polygon,Equivalence (measure theory),Artificial intelligence,Machine learning
DocType
Conference
Citations
113
PageRank
5.57
References
17
Authors
2
Name            Order  Citations  PageRank
Alexander Zien  1      1255       146.93
Cheng Soon Ong  2      1232       86.27