Title
Deep Feature Matching For Dense Correspondence
Abstract
Image matching is challenging because different views often undergo significant appearance changes caused by deformation, abrupt motion, and occlusion. In this paper, we explore features extracted from convolutional neural networks to guide image matching so that dense pixel correspondence can be built. Because deep features describe image structures, a matching method based on them can match across different scenes and/or object appearances. We analyze the deep features and compare them with other robust features, e.g., SIFT. Extensive experiments on five datasets demonstrate that the proposed algorithm performs favorably against state-of-the-art methods in terms of visual matching quality and accuracy.
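The abstract does not spell out the matching procedure, so the following is only a minimal sketch of the general idea it describes: computing dense correspondence by nearest-neighbor matching of per-pixel descriptors (whether CNN activations or handcrafted features such as SIFT). The cosine-similarity criterion and the toy NumPy feature maps are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def dense_match(feat_a, feat_b):
    """Nearest-neighbor dense correspondence between two feature maps.

    feat_a, feat_b: (H, W, C) arrays of per-pixel descriptors (e.g. CNN
    activations or SIFT).  Returns an (H, W, 2) array mapping each pixel
    of A to the (row, col) of its most similar pixel in B, where
    similarity is cosine similarity of the descriptors (an assumption
    made here for illustration).
    """
    H, W, C = feat_a.shape
    a = feat_a.reshape(-1, C)
    b = feat_b.reshape(-1, C)
    # L2-normalize so that a dot product equals cosine similarity.
    a = a / (np.linalg.norm(a, axis=1, keepdims=True) + 1e-8)
    b = b / (np.linalg.norm(b, axis=1, keepdims=True) + 1e-8)
    sim = a @ b.T                    # (H*W, H*W) similarity matrix
    idx = sim.argmax(axis=1)         # best match in B for each pixel of A
    return np.stack(np.unravel_index(idx, (H, W)), axis=1).reshape(H, W, 2)

# Sanity check: identical feature maps should match each pixel to itself.
rng = np.random.default_rng(0)
f = rng.standard_normal((4, 4, 8))
corr = dense_match(f, f)
rows, cols = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
assert (corr[..., 0] == rows).all() and (corr[..., 1] == cols).all()
```

A real implementation would replace the random arrays with feature maps from a pretrained CNN and would typically add spatial regularization, since raw nearest-neighbor fields are noisy.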
Year
2017
Venue
2017 24TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP)
Keywords
Deep feature, dense correspondence, scene matching, optical flow, handcrafted feature
Field
Computer vision, Scale-invariant feature transform, Pattern recognition, Image matching, Computer science, Convolutional neural network, Feature extraction, Robustness (computer science), Feature matching, Artificial intelligence, Pixel, Optical imaging
DocType
Conference
ISSN
1522-4880
Citations
0
PageRank
0.34
References
0
Authors (3)
1. Yang Liu — Citations: 1, PageRank: 0.70
2. Jin-shan Pan — Citations: 567, PageRank: 30.84
3. Zhixun Su — Citations: 41, PageRank: 6.61