Abstract |
---|
Latent Semantic Indexing (LSI) is a widely used feature extraction method in text classification. Because LSI determines the features that are globally important across all classes, features that matter only to small classes may be ignored, which leads to poor performance on those classes. To address this problem, a novel method based on Partial Least Squares (PLS) analysis is proposed, which integrates class information into the latent classification structure. Features are extracted according to both their power to describe document contents, as in LSI, and their capacity to discriminate between classes. The extracted features are applied to several classification algorithms: SVM, kNN, C4.5 and SMO. Experiments on Reuters show that the features extracted by our method outperform those extracted by LSI in all cases, with the largest gains on small classes. |
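The abstract contrasts LSI, which extracts directions from the term-document matrix alone, with PLS, which also uses the class labels. The paper's exact algorithm is not reproduced here; the following is a minimal numpy sketch of that contrast, using the standard formulations (LSI as the dominant singular direction of the centered term-document matrix, PLS as the dominant left singular direction of the cross-covariance between documents and one-hot class indicators). The toy data and class split are hypothetical, chosen so that a small class relies on terms the large class rarely uses.

```python
import numpy as np

# Hypothetical toy term-document matrix: 6 documents x 5 terms (tf-idf-like weights).
# Docs 0-3 form a large class using terms 0-1; docs 4-5 form a small class using terms 3-4.
X = np.array([
    [2.0, 1.0, 0.0, 0.0, 0.0],
    [1.5, 0.5, 0.0, 0.0, 0.0],
    [2.2, 0.8, 0.1, 0.0, 0.0],
    [1.8, 1.2, 0.0, 0.1, 0.0],
    [0.0, 0.0, 0.1, 2.0, 1.5],
    [0.1, 0.0, 0.0, 1.8, 1.7],
])
Y = np.array([[1, 0], [1, 0], [1, 0], [1, 0],   # one-hot class indicators
              [0, 1], [0, 1]], dtype=float)

Xc = X - X.mean(axis=0)          # center, as in standard PLS
Yc = Y - Y.mean(axis=0)

# LSI direction: dominant right singular vector of the document matrix alone;
# it captures global variance and ignores the class labels entirely.
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
lsi_dir = Vt[0]

# PLS direction: dominant left singular vector of X^T Y, i.e. the term weighting
# that maximizes covariance between projected documents and class indicators.
U, _, _ = np.linalg.svd(Xc.T @ Yc, full_matrices=False)
pls_dir = U[:, 0]

t_lsi = Xc @ lsi_dir             # one LSI feature per document
t_pls = Xc @ pls_dir             # one PLS feature per document
```

Because the PLS direction is driven by the class indicators, projecting the documents onto `pls_dir` separates the small class from the large one even with a single component, which is the behavior the abstract attributes to integrating class information into feature extraction.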
Year | DOI | Venue |
---|---|---|
2007 | 10.1145/1244002.1244187 | SAC |
Keywords | Field | DocType |
---|---|---|
important local feature, latent semantic indexing, latent classification structure, novel method, important global feature, important feature, text classification, favorite feature extraction method, square analysis, classification algorithm, small class, dimensionality reduction, feature extraction | Least squares, Data mining, Latent semantic indexing, Dimensionality reduction, Pattern recognition, Computer science, Support vector machine, Feature extraction, Artificial intelligence, Statistical classification | Conference |
ISBN | Citations | PageRank |
---|---|---|
1-59593-480-4 | 7 | 0.63 |
References | Authors |
---|---|
11 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Xue-qiang Zeng | 1 | 76 | 7.91 |
Mingwen Wang | 2 | 315 | 38.28 |
Jian-yun Nie | 3 | 3681 | 238.61 |