Abstract |
---|
In the field of image recognition, high-dimensional feature vectors are often used to construct a classifier. Using a large number of features, however, slows down training and degrades model interpretability. To alleviate this problem, sequential backward selection (SBS) has come to be used to select an effective subset of features for classification. As a wrapper method, however, SBS repeatedly constructs and evaluates classifiers while selecting features, which is computationally expensive. In this study, we define a contribution ratio for each feature using random forest and use it to build an efficient feature selection method. In an evaluation experiment comparing the proposed method with SBS, the proposed method significantly reduced feature selection time at the same dimensionality reduction rate. |
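The selection scheme the abstract describes — score each feature once by a random-forest-style importance (impurity decrease accumulated over bootstrapped trees) and keep the top-ranked features in a single pass, rather than retraining a classifier for every candidate subset as SBS does — can be sketched in pure Python. The stump-based forest, the toy data, and all names below are illustrative assumptions, not the paper's implementation:

```python
import random

random.seed(0)

def gini(labels):
    # Gini impurity of a list of 0/1 labels.
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 1.0 - p1 * p1 - (1.0 - p1) ** 2

def best_split_gain(xs, ys):
    # Largest impurity decrease achievable by thresholding one feature.
    base = gini(ys)
    best = 0.0
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        weighted = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        best = max(best, base - weighted)
    return best

def forest_contribution_ratio(X, y, n_trees=50):
    # Accumulate impurity decrease per feature over bootstrapped decision
    # stumps -- a crude stand-in for a random forest's per-feature
    # "contribution ratio"; normalized so the scores sum to 1.
    d = len(X[0])
    scores = [0.0] * d
    n = len(X)
    for _ in range(n_trees):
        idx = [random.randrange(n) for _ in range(n)]  # bootstrap sample
        Xb = [X[i] for i in idx]
        yb = [y[i] for i in idx]
        for j in range(d):
            scores[j] += best_split_gain([row[j] for row in Xb], yb)
    total = sum(scores) or 1.0
    return [s / total for s in scores]

# Toy data: feature 0 separates the classes, feature 1 is pure noise.
X = [[x0, random.random()] for x0 in [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]]
y = [0, 0, 0, 1, 1, 1]

ratios = forest_contribution_ratio(X, y)
# One ranking pass selects the top-k features; SBS would instead train
# and evaluate a classifier for every feature removed at every round.
keep = sorted(range(len(ratios)), key=lambda j: -ratios[j])[:1]
print(ratios, keep)
```

The informative feature receives the larger contribution ratio and is kept, while the noise feature is dropped without ever retraining a classifier — the source of the speedup over SBS claimed in the abstract.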
Year | DOI | Venue
---|---|---
2015 | 10.1109/FCV.2015.7103746 | FCV

Field | DocType | ISSN
---|---|---
Data mining, Decision tree, Feature vector, Dimensionality reduction, Feature selection, Pattern recognition, Feature (computer vision), Computer science, Artificial intelligence, Classifier (linguistics), Random forest | Conference | 2165-1051
Citations | PageRank | References
---|---|---
1 | 0.35 | 4
Authors |
---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---
Ryuei Murata | 1 | 1 | 0.35 |
Yohei Mishina | 2 | 1 | 0.35 |
Yuji Yamauchi | 3 | 43 | 10.45 |
Takayoshi Yamashita | 4 | 377 | 46.83 |
Hironobu Fujiyoshi | 5 | 730 | 101.43