| Abstract |
| --- |
| Kernel-based feature combination techniques such as Multiple Kernel Learning use arithmetic operations to linearly combine different kernels. We have observed that the kernel distributions of different features are usually very different. We argue that the similarity distributions amongst the data points for a given dataset should not change with their representation features, and we propose the concept of relative kernel distribution invariance (RKDI). We have developed a very simple histogram-matching-based technique to achieve RKDI by transforming the kernels to a canonical distribution. We have performed extensive experiments on various computer vision and machine learning datasets and show that calibrating the kernels to an empirically chosen canonical space before they are combined can always achieve a performance gain over state-of-the-art methods. As histogram matching is a remarkably simple and robust technique, the new method is universally applicable to kernel-based feature combination. |
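The abstract describes calibrating kernels by histogram matching their value distributions to a canonical one before combination. A minimal sketch of that idea in NumPy, assuming both kernels are Gram matrices of the same size (n×n, computed on the same dataset) so their entries can be matched rank-for-rank; the function name `match_kernel_distribution` and the rank-based matching scheme are illustrative assumptions, not the authors' code:

```python
import numpy as np

def match_kernel_distribution(K, K_ref):
    """Map the entries of kernel matrix K onto the empirical value
    distribution of a reference ("canonical") kernel K_ref via
    rank-based histogram matching. Hypothetical sketch: assumes K
    and K_ref have the same number of entries.
    """
    flat = K.ravel()
    ref_sorted = np.sort(K_ref.ravel())
    # Rank each entry of K, then replace it with the reference
    # value occupying the same rank: the result keeps K's ordering
    # of similarities but adopts K_ref's value distribution.
    order = np.argsort(flat)
    matched = np.empty_like(flat)
    matched[order] = ref_sorted
    return matched.reshape(K.shape)
```

Note that matching entries independently does not by itself guarantee the result is symmetric or positive semidefinite; a practical implementation would need to enforce those kernel properties (e.g. symmetrize and project back to the PSD cone) after matching.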
| Year | DOI | Venue |
| --- | --- | --- |
| 2011 | 10.5244/C.25.58 | Proceedings of the British Machine Vision Conference 2011 |
| Field | DocType | Citations |
| --- | --- | --- |
| Radial basis function kernel, Pattern recognition, Kernel embedding of distributions, Computer science, Multiple kernel learning, Kernel principal component analysis, Tree kernel, Artificial intelligence, String kernel, Kernel method, Variable kernel density estimation | Conference | 2 |
| PageRank | References | Authors |
| --- | --- | --- |
| 0.39 | 18 | 3 |
| Name | Order | Citations | PageRank |
| --- | --- | --- | --- |
| Hao Fu | 1 | 41 | 16.96 |
| Guoping Qiu | 2 | 1306 | 117.19 |
| Hangen He | 3 | 307 | 23.86 |