Title: Weight-Sharing Neural Architecture Search: A Battle to Shrink the Optimization Gap
Abstract: Neural architecture search (NAS) has attracted increasing attention. In recent years, individual search methods have been replaced by weight-sharing search methods for higher search efficiency, but the latter often suffer from lower stability. This article provides a literature review of these methods and attributes this issue to the optimization gap. From this perspective, we summarize existing approaches into several categories according to their efforts in bridging the gap, and we analyze both the advantages and disadvantages of these methodologies. Finally, we share our opinions on the future directions of NAS and AutoML. Owing to the expertise of the authors, this article mainly focuses on the application of NAS to computer vision problems.
Year: 2022
DOI: 10.1145/3473330
Venue: ACM Computing Surveys
Keywords: AutoML, neural architecture search, weight-sharing, super-network, optimization gap, computer vision
DocType: Journal
Volume: 54
Issue: 9
ISSN: 0360-0300
Citations: 0
PageRank: 0.34
References: 107
Authors: 11
Name            Order  Citations  PageRank
Ling-Xi Xie     1      4293       7.79
Xin Chen        2      0          0.34
Kaifeng Bi      3      0          0.34
Longhui Wei     4      140        7.98
Yuhui Xu        5      12         5.00
Zhengsu Chen    6      0          0.34
Lanfei Wang     7      0          0.34
An Xiao         8      0          0.34
Jianlong Chang  9      0          0.34
Xiaopeng Zhang  10     2933       3.34
Qi Tian         11     64433      31.75