Title
Person Re-Identification Via Recurrent Feature Aggregation
Abstract
We address the person re-identification problem by learning a globally discriminative feature representation from a sequence of tracked human regions/patches. This contrasts with previous person re-id work, which relies on either single-frame person-to-person patch matching or graph-based sequence-to-sequence matching. We show that a progressive/sequential fusion framework based on a long short-term memory (LSTM) network aggregates the frame-wise human region representation at each time step and yields a sequence-level human feature representation. Because LSTM nodes can remember and propagate previously accumulated good features and forget newly input inferior ones, the proposed recurrent feature aggregation network (RFA-Net) generates highly discriminative sequence-level human representations even from simple hand-crafted features. Extensive experimental results on two person re-identification benchmarks demonstrate that the proposed method performs favorably against state-of-the-art person re-identification methods.
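The aggregation idea in the abstract can be sketched with a minimal LSTM cell: frame-wise features are fed in one time step at a time, and the gating lets the cell retain good evidence and discard inferior frames. This is an illustrative sketch only, not the authors' released RFA-Net: the dimensions, random initialization, and the choice of the final hidden state as the sequence-level descriptor are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal NumPy LSTM cell (illustrative; shapes and init are assumptions)."""
    def __init__(self, in_dim, hid_dim, seed=0):
        rng = np.random.default_rng(seed)
        # One stacked weight matrix for the input, forget, cell, and output gates.
        self.W = rng.standard_normal((4 * hid_dim, in_dim + hid_dim)) * 0.1
        self.b = np.zeros(4 * hid_dim)
        self.hid_dim = hid_dim

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        g = np.tanh(g)
        c = f * c + i * g       # forget old evidence, admit new evidence
        h = o * np.tanh(c)      # hidden state after seeing this frame
        return h, c

def aggregate_sequence(frames, cell):
    """Fuse per-frame feature vectors into one sequence-level descriptor."""
    h = np.zeros(cell.hid_dim)
    c = np.zeros(cell.hid_dim)
    for x in frames:            # progressive/sequential fusion, frame by frame
        h, c = cell.step(x, h, c)
    return h                    # final hidden state as the sequence representation

# Toy usage: 10 frames of 16-dim features standing in for the paper's
# hand-crafted per-frame descriptors (real dims would be much larger).
cell = LSTMCell(in_dim=16, hid_dim=8)
frames = np.random.default_rng(1).standard_normal((10, 16))
feat = aggregate_sequence(frames, cell)
```

In the paper's setting the per-frame inputs are simple hand-crafted features, and the LSTM's gating is what makes the fused sequence-level descriptor discriminative; matching then reduces to comparing these fixed-length descriptors across camera views.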
Year
2017
DOI
10.1007/978-3-319-46466-4_42
Venue
Computer Vision - ECCV 2016, Part VI
Keywords
Person re-identification, Feature fusion, Long short-term memory networks
DocType
Journal
Volume
9910
ISSN
0302-9743
Citations
42
PageRank
1.03
References
24
Authors
6
Name           Order  Citations  PageRank
Yichao Yan     1      90         6.70
Bingbing Ni    2      14218      2.90
Zhichao Song   3      45         1.40
Chao Ma        4      6372       5.28
Yan Yan        5      7843       8.14
Xiaokang Yang  6      358123     8.09