Title
Learning similarity for semantic image classification
Abstract
While people compare images using semantic concepts, computers compare images using low-level visual features that sometimes have little to do with these semantics. To reduce the gap between the high-level semantics of visual objects and the low-level features extracted from them, this paper develops a framework of learning similarity (LS) using neural networks for semantic image classification. An LS-based k-nearest-neighbor (k-NNL) classifier assigns a label to an unknown image according to the majority label among its k most similar features. Experimental results on an image database show that the k-NNL classifier outperforms both the Euclidean-distance-based k-NN (k-NNE) classifier and back-propagation network classifiers (BPNC).
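For illustration only, the following Python sketch outlines the classification rule described in the abstract: a k-NN classifier in which the Euclidean distance (k-NNE) can be swapped for a pluggable similarity function (k-NNL). The function names (euclidean_similarity, make_learned_similarity, knn_classify) and the feature-weighted cosine "learned" similarity are assumptions made for this sketch; the paper's actual similarity model is a trained neural network whose architecture and training procedure are not given in this record.

```python
# Minimal sketch of the two k-NN rules compared in the abstract: the
# Euclidean baseline (k-NNE) and a similarity-based variant (k-NNL).
# The "learned" similarity below is a hypothetical stand-in (weighted
# cosine); the paper learns similarity with a neural network.
import numpy as np


def euclidean_similarity(x, y):
    # k-NNE: negate the Euclidean distance so that larger values
    # always mean "more similar".
    return -float(np.linalg.norm(x - y))


def make_learned_similarity(weights):
    # Hypothetical stand-in for a learned similarity: a cosine score
    # over feature-weighted vectors.
    def similarity(x, y):
        xw, yw = x * weights, y * weights
        denom = np.linalg.norm(xw) * np.linalg.norm(yw) + 1e-12
        return float(xw @ yw) / denom
    return similarity


def knn_classify(query, train_features, train_labels, k, similarity):
    # Score the query against every stored feature vector, keep the k
    # most similar, and return the majority label among those neighbors.
    scores = np.array([similarity(query, f) for f in train_features])
    top_k = np.argsort(scores)[-k:]
    labels, counts = np.unique(train_labels[top_k], return_counts=True)
    return labels[np.argmax(counts)]


if __name__ == "__main__":
    # Toy data: random "low-level features" with a synthetic label.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 8))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    q = rng.normal(size=8)

    label_knne = knn_classify(q, X, y, k=5, similarity=euclidean_similarity)
    label_knnl = knn_classify(q, X, y, k=5,
                              similarity=make_learned_similarity(np.ones(8)))
    print("k-NNE label:", label_knne, "| k-NNL label:", label_knnl)
```

The two rules share the same interface and differ only in the similarity function passed in, which mirrors the k-NNE versus k-NNL comparison reported in the abstract.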
Year
2005
DOI
10.1016/j.neucom.2004.10.114
Venue
Neurocomputing
Keywords
Neural networks, Learning similarity, Image classification, k-NN rule, Robustness
Field
Visual Objects, Pattern recognition, Computer science, Euclidean distance, Robustness (computer science), Artificial intelligence, Image database, Contextual image classification, Artificial neural network, Classifier (linguistics), Semantics, Machine learning
DocType
Journal
Volume
67
ISSN
0925-2312
Citations
6
PageRank
0.48
References
4
Authors
4
Name            Order  Citations  PageRank
Dianhui Wang    1      1547       93.41
Joon Shik Lim   2      51         6.39
Myung-Mook Han  3      13         4.64
Byungwook Lee   4      100        15.07