Title
Fewer-Shots and Lower-Resolutions: Towards Ultrafast Face Recognition in the Wild
Abstract
Is it possible to train an effective face recognition model with only a few shots that works efficiently on low-resolution faces in the wild? To answer this question, this paper proposes a few-shot knowledge distillation approach that learns an ultrafast face recognizer in two steps. In the first step, we initialize a simple yet effective face recognition model on synthetic low-resolution faces by distilling knowledge from an existing complex model. By removing the redundancies in both the face images and the model structure, the initial model provides ultrafast speed with impressive recognition accuracy. To further adapt this model to in-the-wild scenarios with few faces per person, the second step refines it via few-shot learning, incorporating a relation module that compares low-resolution query faces against faces in the support set. In this manner, the model's performance can be further enhanced with only a few low-resolution faces in the wild. Experimental results show that the proposed approach performs favorably against state-of-the-art methods in recognizing low-resolution faces, with an extremely small memory footprint of 30 KB, and runs at an ultrafast speed of 1,460 faces per second on CPU or 21,598 faces per second on GPU.
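The two steps described in the abstract can be illustrated with a minimal sketch: (1) a temperature-softened distillation loss that trains a small student to mimic a complex teacher's outputs on low-resolution faces, and (2) a relation score that compares a query face embedding with support-set embeddings. This is a toy illustration under assumptions, not the paper's actual model: the function names are invented, and cosine similarity stands in for the paper's learned relation module.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Numerically stable softmax with temperature T."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Step 1 (sketch): cross-entropy between teacher's softened
    distribution and the student's, averaged over the batch."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return float(-(p_teacher * np.log(p_student + 1e-12)).sum(axis=-1).mean())

def relation_scores(query_emb, support_embs):
    """Step 2 (sketch): score a low-resolution query face against each
    support face; cosine similarity replaces the learned relation module."""
    q = query_emb / np.linalg.norm(query_emb)
    s = support_embs / np.linalg.norm(support_embs, axis=1, keepdims=True)
    return s @ q
```

In a few-shot setting, the query would be assigned the identity of the support face with the highest relation score.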
Year
2019
DOI
10.1145/3343031.3351082
Venue
Proceedings of the 27th ACM International Conference on Multimedia
Keywords
face recognition, few-shot learning, knowledge distillation
Field
Computer vision, Facial recognition system, Computer science, Artificial intelligence, Ultrashort pulse
DocType
Conference
ISBN
978-1-4503-6889-6
Citations
1
PageRank
0.35
References
0
Authors
4
Name           Order  Citations  PageRank
Shiming Ge     1      1062       4.60
Shengwei Zhao  2      31         4.67
Xindi Gao      3      3          1.39
Jia Li         4      5244       2.09