Title: Differentiable neural architecture learning for efficient neural networks
Abstract
• We build a new standalone control module based on the scaled sigmoid function, enriching the neural network module family and enabling neural architecture optimization (see the sketch after this list).
• Our DNAL method produces a single candidate neural architecture rather than many, drastically improving learning efficiency: it costs only 20 epochs on CIFAR-10 and 10 epochs on ImageNet.
• It is applicable to conventional CNNs, lightweight CNNs, and stochastic supernets.
• Extensive experiments confirm that our DNAL method achieves excellent performance with various CNN architectures, including VGG16, ResNet50, MobileNetV2, and ProxylessNAS, on the CIFAR-10 and ImageNet-1K classification tasks.
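To make the first highlight concrete, below is a minimal PyTorch sketch of a scaled-sigmoid control module of the kind described: a learnable per-channel parameter is passed through sigmoid(beta * alpha), and beta is annealed upward so the gates approach hard 0/1 decisions while staying differentiable. The module name, the exact form sigmoid(beta * alpha), and the annealing schedule are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class ScaledSigmoidGate(nn.Module):
    """Channel-wise architecture gate based on a scaled sigmoid (sketch).

    Each channel gets a learnable parameter alpha; the gate value is
    sigmoid(beta * alpha). As beta grows during training, the gate tends
    toward a hard 0/1 step, so the architecture decision becomes (nearly)
    discrete yet remains differentiable end to end.
    """

    def __init__(self, num_channels: int, beta: float = 1.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.zeros(num_channels))
        self.register_buffer("beta", torch.tensor(beta))

    def set_beta(self, beta: float) -> None:
        # Anneal beta upward over training so sigmoid(beta * alpha) -> {0, 1}.
        self.beta.fill_(beta)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate = torch.sigmoid(self.beta * self.alpha)  # shape (C,)
        return x * gate.view(1, -1, 1, 1)             # scale each channel of NCHW input

# Usage sketch: gate a conv block's output and harden the gates over epochs.
if __name__ == "__main__":
    conv = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16))
    gate = ScaledSigmoidGate(num_channels=16)
    x = torch.randn(2, 3, 32, 32)
    for epoch in range(5):
        gate.set_beta(1.0 + 2.0 * epoch)  # illustrative linear annealing schedule
        y = gate(conv(x))                 # normally followed by a loss + backward pass
    # Channels whose gate collapses toward 0 can be pruned from the final network.
    kept = (torch.sigmoid(gate.beta * gate.alpha) > 0.5).sum().item()
    print(kept, "channels kept")
```

Because the gate stays differentiable at every beta, the architecture parameters can be trained jointly with the weights by standard backpropagation, which is consistent with the single-candidate, few-epoch efficiency claimed in the highlights.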
Year: 2022
DOI: 10.1016/j.patcog.2021.108448
Venue: Pattern Recognition
Keywords: Deep neural network, Convolutional neural network, Neural architecture search, Automated machine learning
DocType: Journal
Volume: 126
ISSN: 0031-3203
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name          Order  Citations  PageRank
Qingbei Guo   1      1          1.45
Xiaojun Wu    2      356        52.89
J. Kittler    3      14346      1465.03
Zhiquan Feng  4      36         13.73