Title
Memory-Efficient Models for Scene Text Recognition via Neural Architecture Search
Abstract
Meta-learning techniques based on neural architecture search (NAS) show excellent performance in designing deep neural network models. In particular, when NAS is applied to design a convolutional neural network (CNN) for image recognition, the resulting networks outperform hand-designed models on public benchmark datasets such as CIFAR10 and ImageNet. Nevertheless, NAS has rarely been applied to real-world problems, i.e., recognition problems with limited datasets. To extend NAS to a new image recognition domain, we propose a method that applies NAS to the scene text recognition (STR) framework without requiring a proxy task. Specifically, we define an architecture space for the CNN-based modules of the STR framework and apply the ProxylessNAS method, which enables end-to-end training while the meta-learner designs a new model on a single commonly used GPU (approximately 100 GPU hours). We evaluate the STR model obtained by the proposed NAS method on seven STR benchmark datasets. The obtained model achieves performance similar to that of the ideal model in terms of accuracy and number of parameters, confirming that NAS-based model design can be effectively applied to STR scenarios.
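The abstract describes a ProxylessNAS-style search over CNN modules, in which a single candidate path is sampled per forward pass so that the search fits on one GPU. The paper's actual search space and training code are not given here; the sketch below is a minimal, hypothetical PyTorch illustration of the core idea only (a mixed operation with learnable architecture parameters, single-path sampling, and a straight-through approximation of the ProxylessNAS gradient estimator). The candidate operations and all names are assumptions for illustration.

```python
# Minimal sketch of a ProxylessNAS-style searchable block (illustrative
# only; the paper's actual STR search space and estimator may differ).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """A block whose operation is chosen by learnable architecture
    parameters. During search, one path is sampled per forward pass,
    so GPU memory stays close to that of a single compact model."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        # Hypothetical candidate ops for a CNN feature extractor.
        self.ops = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.Conv2d(in_ch, out_ch, 5, padding=2),
            nn.Sequential(  # depthwise-separable conv
                nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch),
                nn.Conv2d(in_ch, out_ch, 1),
            ),
        ])
        # One architecture parameter (alpha) per candidate op.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        probs = F.softmax(self.alpha, dim=0)
        if not self.training:
            # After search, keep only the most probable op.
            return self.ops[int(probs.argmax())](x)
        # Sample a single path (binary gate) per forward pass.
        idx = int(torch.multinomial(probs.detach(), 1))
        # Straight-through trick so gradients reach alpha; the actual
        # ProxylessNAS work uses a more refined two-path estimator.
        one_hot = F.one_hot(torch.tensor(idx), len(self.ops)).float()
        gate = one_hot - probs.detach() + probs
        return gate[idx] * self.ops[idx](x)

# Usage: alternate weight updates (training split) with alpha updates
# (validation split), then derive the final model from argmax(alpha).
block = MixedOp(32, 32)
out = block(torch.randn(1, 32, 16, 64))  # e.g. a text-line feature map
print(out.shape)  # torch.Size([1, 32, 16, 64])
```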
Year
2020
DOI
10.1109/WACVW50321.2020.9096928
Venue
2020 IEEE Winter Applications of Computer Vision Workshops (WACVW)
Keywords
memory-efficient models, neural architecture search, meta-learning techniques, learning models, deep neural networks, convolutional neural network, public benchmark datasets, hand-designed models, NAS technique, scene text recognition framework, image recognition field, architecture space, CNN-based modules, ProxylessNAS method, end-to-end training, meta learners design, ideal model, model design, STR benchmark datasets
DocType
Conference
ISSN
2572-4398
ISBN
978-1-7281-7163-0
Citations
0
PageRank
0.34
References
7
Authors
3
Name            Order  Citations  PageRank
SeulGi Hong     1      0          0.34
Donghyun Kim    2      0          0.34
Min-Kook Choi   3      0          0.34