Title
Dissimilarity Metric Learning in the Belief Function Framework
Abstract
The evidential K-nearest-neighbor (EK-NN) method provides a global treatment of imperfect knowledge regarding the class membership of training patterns. It has outperformed traditional K-NN rules in many applications, but it still shares some of their basic limitations: 1) classification accuracy depends heavily on how the dissimilarity between patterns is quantified, and 2) satisfactory performance is not guaranteed when training patterns contain unreliable (imprecise and/or uncertain) input features. In this paper, we propose to address these issues by learning a suitable metric, based on a low-dimensional transformation of the input space, so as to maximize both the accuracy and the efficiency of EK-NN classification. To this end, a novel loss function for learning the dissimilarity metric is constructed. It consists of two terms: the first quantifies the imprecision regarding the class membership of each training pattern, while the second, by means of feature selection, controls the influence of unreliable input features on the output linear transformation. The proposed method has been compared with several other metric learning methods on synthetic and real datasets. It consistently led to comparable performance with regard to testing accuracy and class structure visualization.
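The two-term loss described in the abstract, an imprecision term on class membership plus a feature-selection term on the linear transformation, can be illustrated with a generic sketch. This is not the authors' evidential loss: the imprecision term is replaced here by a simple pairwise pull/push surrogate, and feature selection is modeled by an L2,1 penalty on the columns of the learned map, all minimized by numerical gradient descent on toy data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: feature 0 separates two classes, feature 1 is pure noise.
X = np.vstack([rng.normal([0.0, 0.0], 0.5, (20, 2)),
               rng.normal([3.0, 0.0], 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

lam = 0.05  # weight of the feature-selection (sparsity) term

def loss(A):
    """Two-term surrogate: pairwise pull/push + L2,1 column sparsity."""
    Z = X @ A.T                                     # low-dimensional transform
    D = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    same = (y[:, None] == y[None, :]) & ~np.eye(len(y), dtype=bool)
    diff = y[:, None] != y[None, :]
    pull = D[same].mean()                           # same class: small distances
    push = np.maximum(0.0, 1.0 - D[diff]).mean()    # other class: margin >= 1
    sparsity = np.sqrt((A ** 2).sum(axis=0)).sum()  # L2,1 over input features
    return pull + push + lam * sparsity

# Plain numerical-gradient descent, kept simple for illustration.
A = rng.normal(0.0, 0.1, (1, 2))  # maps the 2-D input to a 1-D space
loss0 = loss(A)
eps, lr = 1e-5, 0.05
for _ in range(200):
    g = np.zeros_like(A)
    for idx in np.ndindex(*A.shape):
        Ap, Am = A.copy(), A.copy()
        Ap[idx] += eps
        Am[idx] -= eps
        g[idx] = (loss(Ap) - loss(Am)) / (2 * eps)
    A -= lr * g
```

After training, the sparsity term drives the weight on the uninformative feature 1 toward zero, so the learned one-dimensional map relies mostly on feature 0; this mirrors how the paper's second loss term suppresses unreliable input features in the output transformation.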
Year: 2016
DOI: 10.1109/TFUZZ.2016.2540068
Venue: IEEE Trans. Fuzzy Systems
Keywords: Measurement, Training, Cost function, Cognition, Supervised learning, Probability distribution
Field: Feature transformation, Data mining, Dimensionality reduction, Imperfect, Feature selection, Visualization, Supervised learning, Probability distribution, Artificial intelligence, Linear map, Machine learning, Mathematics
DocType: Journal
Volume: 24
Issue: 6
ISSN: 1063-6706
Citations: 6
PageRank: 0.43
References: 23
Authors: 3

Name               Order   Citations   PageRank
Chunfeng Lian      1       132         22.61
Ruan Su            2       559         53.00
Thierry Denoeux    3       815         74.98