Title: Multi-label Multiple Kernel Learning
Abstract: We present a multi-label multiple kernel learning (MKL) formulation in which the data are embedded into a low-dimensional space directed by the instance-label correlations encoded in a hypergraph. We formulate the problem in the kernel-induced feature space and propose to learn the kernel matrix as a linear combination of a given collection of kernel matrices in the MKL framework. The proposed learning formulation leads to a non-smooth min-max problem, which can be cast into a semi-infinite linear program (SILP). We further propose an approximate formulation with a guaranteed error bound that involves an unconstrained convex optimization problem. In addition, we show that the objective function of the approximate formulation is differentiable with a Lipschitz continuous gradient, so existing methods can be employed to compute the optimal solution efficiently. We apply the proposed formulation to the automated annotation of Drosophila gene expression pattern images, and promising results are reported in comparison with representative algorithms.
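The abstract's central ingredient is the standard MKL parameterization: the learned kernel is a convex combination of given base kernel matrices. The sketch below illustrates only that combination step on toy data (the kernel choices, `gamma`, and the weight vector are illustrative assumptions, not the paper's algorithm, which learns the weights via a SILP).

```python
import numpy as np

def linear_kernel(X):
    # Gram matrix of the linear kernel: K_ij = <x_i, x_j>
    return X @ X.T

def rbf_kernel(X, gamma=0.5):
    # Gram matrix of the Gaussian (RBF) kernel: K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def combine_kernels(kernels, theta):
    # MKL-style kernel: K = sum_j theta_j * K_j, with theta on the probability simplex,
    # so K stays symmetric positive semidefinite whenever each K_j is.
    theta = np.asarray(theta, dtype=float)
    assert np.all(theta >= 0) and abs(theta.sum() - 1.0) < 1e-9
    return sum(t * K for t, K in zip(theta, kernels))

# Toy data; the weights [0.3, 0.7] stand in for weights that MKL would learn.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
K = combine_kernels([linear_kernel(X), rbf_kernel(X)], [0.3, 0.7])
```

Because each base kernel is positive semidefinite and the weights are nonnegative, the combined `K` remains a valid kernel matrix usable by any kernel method.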
Year: 2008
Venue: NIPS
Keywords: convex optimization, objective function, feature space, Lipschitz continuity, linear program
Field: Kernel (linear algebra), Mathematical optimization, Radial basis function kernel, Kernel embedding of distributions, Computer science, Multiple kernel learning, Polynomial kernel, Artificial intelligence, Kernel method, String kernel, Variable kernel density estimation, Machine learning
DocType: Conference
Citations: 32
PageRank: 1.80
References: 12
Authors: 4
Name         Order  Citations  PageRank
Shuiwang Ji  1      2579       122.25
Liang Sun    2      500        24.61
Rong Jin     3      6206       334.26
Jieping Ye   4      6943       351.37