Abstract |
---|
The Conformal Predictions (CP) framework is a recent development in machine learning for associating reliable measures of confidence with results in classification and regression. The framework is founded on the principles of algorithmic randomness (Kolmogorov complexity), transductive inference, and hypothesis testing. While its formulation guarantees validity, its efficiency depends greatly on the choice of classifier and of appropriate kernel functions or parameters; although the framework has extensive potential to be useful in several applications, this lack of efficiency can limit its usability. In this paper, we propose a novel kernel learning methodology to maximize efficiency in the CP framework. The method is validated using the k-Nearest Neighbors classifier on three different datasets, and our results show immense promise for applying this method to obtain efficient conformal predictors that are practically useful. |
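The abstract describes the CP machinery only at a high level. As an illustration (not the authors' kernel learning method), a minimal transductive conformal predictor using the standard kNN nonconformity measure — the ratio of distances to the k nearest same-class neighbors versus the k nearest other-class neighbors — might be sketched as follows; all function names here are hypothetical:

```python
import numpy as np

def knn_nonconformity(X, y, x_new, y_new, k=3):
    """Nonconformity score for every point (training points plus the new
    point tentatively labeled y_new): sum of distances to the k nearest
    same-class points divided by the sum of distances to the k nearest
    other-class points. Larger = more 'strange' under that labeling."""
    Xa = np.vstack([X, x_new])          # augmented sample (transduction)
    ya = np.append(y, y_new)
    scores = []
    for i in range(len(Xa)):
        d = np.linalg.norm(Xa - Xa[i], axis=1)
        d[i] = np.inf                   # exclude the point itself
        same = np.sort(d[ya == ya[i]])[:k]
        other = np.sort(d[ya != ya[i]])[:k]
        scores.append(same.sum() / (other.sum() + 1e-12))
    return np.array(scores)

def conformal_pvalues(X, y, x_new, labels, k=3):
    """p-value of each candidate label for x_new: the fraction of
    nonconformity scores at least as large as the new point's score."""
    pvals = {}
    for lab in labels:
        a = knn_nonconformity(X, y, x_new, lab, k)
        pvals[lab] = np.mean(a >= a[-1])   # a[-1] is the new point's score
    return pvals
```

The prediction region at significance level ε is then simply the set of labels whose p-value exceeds ε; an "efficient" conformal predictor, in the paper's sense, is one whose regions typically contain a single label.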
Year | DOI | Venue |
---|---|---|
2010 | 10.1109/ICMLA.2010.42 | ICMLA |
Keywords | Field | DocType |
regression analysis,learning (artificial intelligence),computational complexity,hypothesis testing,support vector machines,prediction algorithms,kernel methods,k-nearest neighbor,machine learning,regression,classification,transductive inference | Kernel (linear algebra),Transduction (machine learning),Kolmogorov complexity,Pattern recognition,Computer science,Support vector machine,Artificial intelligence,Kernel method,Statistical hypothesis testing,Machine learning,Maximization,Kernel (statistics) | Conference |
Citations | PageRank | References |
4 | 0.55 | 10 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Vineeth Nallure Balasubramanian | 1 | 265 | 36.44 |
Shayok Chakraborty | 2 | 137 | 17.47 |
Sethuraman Panchanathan | 3 | 1431 | 152.04 |
Jieping Ye | 4 | 6943 | 351.37 |