Title
Person re-identification based on multi-scale feature learning
Abstract
Extracting discriminative pedestrian features is an effective approach to person re-identification. Most person re-identification methods focus on extracting abstract features from the high layers of the network but ignore the middle-layer features, which reduces identification accuracy. To address this problem, we construct a Smooth Aggregation Module (SAM) that extracts, aligns, and fuses the middle-layer feature maps of the network to compensate for the lack of detailed information in the high-layer features, and we propose an Omni-Scale Feature Aggregation (OSFA) method to jointly learn abstract features and local detail features (source code is available at https://github.com/lyy973/OSFA.git). Considering that the intra-class distance in person re-identification should be smaller than the inter-class distance, we combine multiple losses to constrain the model. We evaluate the performance of our method on three standard benchmark datasets: Market-1501, CUHK03 (both detected and labeled), and DukeMTMC-reID, and experimental results show that our method outperforms state-of-the-art approaches.
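The abstract describes the approach only at a high level: fuse detail-rich middle-layer feature maps with abstract high-layer features, and train with a combination of losses that keeps intra-class distances below inter-class distances. The following is a minimal, illustrative PyTorch sketch of that general idea, not the authors' SAM/OSFA implementation (see the linked repository for that); the names MidHighFusion, fuse_dim, and total_loss are hypothetical, and the fusion here is a simple interpolate-and-concatenate stand-in.

# Illustrative sketch only: mid-/high-layer feature fusion on a ResNet-50
# backbone, trained with a combined ID (cross-entropy) + triplet loss.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50


class MidHighFusion(nn.Module):
    """Hypothetical stand-in for the paper's idea: align a mid-layer feature
    map to the high-layer spatial size and fuse the two by concatenation."""

    def __init__(self, num_classes, fuse_dim=512):
        super().__init__()
        backbone = resnet50()  # ImageNet pretraining is typical in re-ID pipelines
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1, backbone.relu,
                                  backbone.maxpool, backbone.layer1, backbone.layer2)
        self.layer3 = backbone.layer3          # mid-layer features (1024 channels)
        self.layer4 = backbone.layer4          # high-layer features (2048 channels)
        self.reduce = nn.Conv2d(1024 + 2048, fuse_dim, kernel_size=1)
        self.bn = nn.BatchNorm1d(fuse_dim)
        self.classifier = nn.Linear(fuse_dim, num_classes)

    def forward(self, x):
        x = self.stem(x)
        mid = self.layer3(x)                   # detail-rich mid-layer map
        high = self.layer4(mid)                # abstract high-layer map
        # Align the mid-layer map to the high-layer spatial size before fusing.
        mid = F.interpolate(mid, size=high.shape[2:], mode='bilinear',
                            align_corners=False)
        fused = self.reduce(torch.cat([mid, high], dim=1))
        feat = F.adaptive_avg_pool2d(fused, 1).flatten(1)
        feat = self.bn(feat)
        return feat, self.classifier(feat)


# Combined objective: cross-entropy over identities plus a triplet loss that
# pushes intra-class distances below inter-class distances by a margin.
id_loss = nn.CrossEntropyLoss()
triplet_loss = nn.TripletMarginLoss(margin=0.3)

def total_loss(logits, labels, anchor, positive, negative):
    # anchor/positive/negative are embeddings mined from the current batch
    return id_loss(logits, labels) + triplet_loss(anchor, positive, negative)


# Example forward pass on a batch of 256x128 pedestrian crops
# (751 is the number of training identities in Market-1501).
imgs = torch.randn(8, 3, 256, 128)
model = MidHighFusion(num_classes=751)
feats, logits = model(imgs)

This sketch uses bilinear interpolation for the alignment step and concatenation for the fusion step; the paper's SAM may differ in both respects, so treat the block as a reading aid rather than a reproduction of the method.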
Year: 2021
DOI: 10.1016/j.knosys.2021.107281
Venue: Periodicals
Keywords: Person re-identification, Multi-scale, Representation learning, Feature fusion
DocType: Journal
Volume: 228
Issue: C
ISSN: 0950-7051
Citations: 1
PageRank: 0.35
References: 0
Authors: 4
Name              Order  Citations  PageRank
Yueying Li        1      1          0.35
Li Liu            2      169        50.09
Lei Zhu           3      854        51.69
Huaxiang Zhang    4      436        56.32