Abstract |
---|
Learning a classifier when only the features and the marginal distribution of class labels in each data group are known is both theoretically interesting and practically useful. Specifically, we are interested in the case where the ratio of the number of data instances to the number of classes is large. For this problem, we show that the performance of a previously proposed discriminative classifier deteriorates quickly as this ratio grows. In contrast, we formulate a density estimation framework that learns a generative classifier with a restricted Boltzmann machine (RBM) in this scenario, with guaranteed performance under mild assumptions. |
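The setting described in the abstract — groups (bags) of unlabeled instances where only each group's label proportion is observed — can be illustrated with a simple proportion-matching baseline. The sketch below is hypothetical and is not the paper's method: it fits a logistic model by gradient descent so that the mean predicted probability in each bag matches that bag's known proportion, on synthetic two-class data.

```python
import numpy as np

# Illustrative sketch of learning from label proportions (LLP):
# each bag carries only the fraction of positives, not per-instance
# labels. We fit a logistic model so that the mean predicted
# probability in each bag matches the bag's known proportion.
# (Synthetic data and proportion-matching loss are assumptions for
# illustration, not the paper's RBM-based method.)

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def make_bag(n_pos, n_neg):
    """A bag of 2-D points: positives near +1, negatives near -1,
    returned with its label proportion only."""
    X = np.vstack([rng.normal(+1.0, 1.0, (n_pos, 2)),
                   rng.normal(-1.0, 1.0, (n_neg, 2))])
    return X, n_pos / (n_pos + n_neg)

bags = [make_bag(30, 10), make_bag(10, 30), make_bag(20, 20)]

w, b, lr = np.zeros(2), 0.0, 0.5
for _ in range(500):
    gw, gb = np.zeros(2), 0.0
    for X, prop in bags:
        p = sigmoid(X @ w + b)      # per-instance probabilities
        resid = p.mean() - prop     # proportion-matching residual
        dp = p * (1 - p) / len(X)   # d mean(p) / d logits
        gw += resid * (dp @ X)      # gradient of 0.5 * resid^2
        gb += resid * dp.sum()
    w -= lr * gw
    b -= lr * gb

# After training, mean predicted probability per bag should track
# the known proportions.
errors = [abs(sigmoid(X @ w + b).mean() - prop) for X, prop in bags]
print(max(errors))
```

With per-instance labels withheld, the only supervision signal is the per-bag proportion; the abstract's point is that such discriminative proportion-matching degrades as the instance-to-class ratio grows, motivating the generative, density-estimation alternative.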
Year | DOI | Venue |
---|---|---|
2012 | 10.1007/978-3-642-36669-7_76 | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
Keywords | Field | DocType
---|---|---|
marginal distribution,label proportion,mild assumption,generative classifier,discriminative classifier,large group case,class label,density estimation framework,guaranteed performance,data instance,data group,density estimation | Density estimation,Pattern recognition,Artificial intelligence,Generative grammar,Classifier (linguistics),Discriminative model,Marginal distribution,Mathematics,Machine learning | Conference |
Volume | Issue | ISSN
---|---|---|
7751 LNCS | null | 16113349 |
Citations | PageRank | References
---|---|---|
0 | 0.34 | 9 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Kai Fan | 1 | 69 | 7.98
Hongyi Zhang | 2 | 6 | 0.83 |
Yu Zang | 3 | 74 | 9.22 |
Liwei Wang | 4 | 1272 | 88.14 |