Title
Feature Selection For Multi-Labeled Variables Via Dependency Maximization
Abstract
Feature selection and dimensionality reduction are essential steps in data analysis. In this work we propose a new criterion for feature selection, formulated as the conditional mutual information between features given the label variable. Instead of the standard mutual information measure based on Kullback-Leibler divergence, we use the proposed criterion to filter out redundant features for multiclass classification. This approach yields an efficient and fast nonparametric implementation of feature selection, since the criterion can be estimated directly by a geometric measure of dependency: the global Friedman-Rafsky (FR) multivariate run test statistic constructed from a global minimal spanning tree (MST). We demonstrate the advantages of the proposed feature selection approach through simulation. In addition, the proposed method is applied to the MNIST data set.
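The dependency measure mentioned in the abstract is built from the Friedman-Rafsky (FR) multivariate run statistic on a minimal spanning tree of the pooled sample. The sketch below illustrates only that building block (not the paper's full feature-selection procedure): it counts MST edges whose endpoints come from different samples. It assumes NumPy/SciPy, and the function name is illustrative.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import minimum_spanning_tree


def friedman_rafsky_statistic(X, Y):
    """Count MST edges joining points from the two samples X and Y.

    Few cross-sample edges (relative to chance) indicates the two
    samples are drawn from well-separated distributions, i.e. strong
    dependency between the feature and the sample label.
    """
    pooled = np.vstack([X, Y])                       # pool the two samples
    labels = np.concatenate([np.zeros(len(X)), np.ones(len(Y))])
    dists = cdist(pooled, pooled)                    # pairwise Euclidean distances
    mst = minimum_spanning_tree(dists)               # global MST of the pooled sample
    rows, cols = mst.nonzero()
    # FR run count: edges whose endpoints carry different sample labels
    return int(np.sum(labels[rows] != labels[cols]))


# Example usage on two synthetic Gaussian samples
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(100, 5))
Y = rng.normal(2.0, 1.0, size=(100, 5))
print(friedman_rafsky_statistic(X, Y))
```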
Year
2019
DOI
10.1109/ICASSP.2019.8682529
Venue
2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Keywords
Feature selection, conditional mutual information, geometric nonparametric measure, global minimal spanning tree, Friedman-Rafsky test statistic
Field
MNIST database, Test statistic, Pattern recognition, Feature selection, Computer science, Feature extraction, Curse of dimensionality, Mutual information, Artificial intelligence, Multiclass classification, Minimum spanning tree
DocType
Conference
Volume
abs/1902.03544
ISSN
1520-6149
Citations
0
PageRank
0.34
References
0
Authors
3
Name                   Order  Citations  PageRank
Salimeh Yasaei Sekeh   1      0          0.34
Alfred O. Hero III     2      2600       301.12
Alfred O. Hero III     3      1713       197.61