Title
Extended analyses for an optimal kernel in a class of kernels with an invariant metric
Abstract
Learning based on kernel machines is widely known as a powerful tool in various fields of information science, such as pattern recognition and regression estimation. Appropriate model selection is required to obtain desirable learning results. In our previous work, we discussed a class of kernels forming a nested class of reproducing kernel Hilbert spaces with an invariant metric and proved that the kernel corresponding to the smallest reproducing kernel Hilbert space that includes the unknown true function gives the best model. In this paper, we relax the invariant metric condition and show that a similar result holds when a subspace with an invariant metric exists.
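The following minimal sketch (not taken from the paper) is meant only to illustrate the selection principle summarized in the abstract: among a family of kernels whose reproducing kernel Hilbert spaces are nested, one looks for the smallest space that still contains the true function. The sketch uses kernel ridge regression with hypothetical polynomial kernels (1 + xy)^d, whose RKHSs are nested as sets of polynomials of degree at most d, and selects the degree by validation error; the data-generating function, kernel family, and regularization parameter are illustrative assumptions, not the paper's construction.

# Hypothetical illustration (not the paper's method): pick a kernel from a
# nested family by validation error in kernel ridge regression.
import numpy as np

rng = np.random.default_rng(0)

def poly_kernel(X, Y, degree):
    # Gram matrix of the polynomial kernel k_d(x, y) = (1 + x*y)^degree;
    # its RKHS is the space of polynomials of degree at most `degree`.
    return (1.0 + np.outer(X, Y)) ** degree

def kernel_ridge_fit(K, y, lam=1e-3):
    # Solve (K + lam*I) alpha = y for the representer coefficients.
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

# The true function is a cubic polynomial, so it lies in H_3 and in every H_d, d >= 3.
f_true = lambda x: 1.0 - 2.0 * x + 0.5 * x ** 3
X_train = rng.uniform(-1.0, 1.0, 40)
y_train = f_true(X_train) + 0.05 * rng.standard_normal(40)
X_val = rng.uniform(-1.0, 1.0, 200)
y_val = f_true(X_val)

errors = {}
for d in range(1, 8):  # candidate kernels with nested spaces H_1 in H_2 in ... in H_7
    alpha = kernel_ridge_fit(poly_kernel(X_train, X_train, d), y_train)
    y_hat = poly_kernel(X_val, X_train, d) @ alpha
    errors[d] = float(np.mean((y_hat - y_val) ** 2))

best = min(errors, key=errors.get)
print("validation MSE per degree:", {d: round(e, 6) for d, e in errors.items()})
print("selected degree:", best)  # often 3, the smallest space containing f_true

In this toy setting the selected degree tends to be the smallest one whose space contains the true cubic, which loosely mirrors the paper's statement that the kernel of the smallest RKHS containing the true function gives the best model; the paper's actual analysis concerns invariant-metric conditions rather than validation-based selection.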
Year
2012
DOI
10.1007/978-3-642-34166-3_38
Venue
SSPR/SPR
Keywords
best model, desirable learning result, invariant metric condition, smallest reproducing kernel hilbert, kernel machine, invariant metric, nested class, appropriate model selection, reproducing kernel hilbert space, optimal kernel, extended analysis, information science, orthogonal projection
Field
Applied mathematics, Topology, Kernel embedding of distributions, Kernel principal component analysis, Polynomial kernel, String kernel, Kernel method, Variable kernel density estimation, Mathematics, Reproducing kernel Hilbert space, Kernel (statistics)
DocType
Conference
Volume
7626
ISSN
0302-9743
Citations
3
PageRank
0.43
References
9
Authors
4
Name               Order  Citations  PageRank
Akira Tanaka       1      38         12.20
Ichigaku Takigawa  2      209        18.15
Hideyuki Imai      3      103        25.08
Mineichi Kudo      4      927        116.09