Abstract
---
Neural architecture search (NAS) has attracted increasing attention. In recent years, individual search methods have been replaced by weight-sharing search methods for higher search efficiency, but the latter often suffer from instability. This article provides a literature review of these methods and attributes this issue to the optimization gap. From this perspective, we summarize existing approaches into several categories according to their efforts in bridging the gap, and analyze the advantages and disadvantages of each methodology. Finally, we share our opinions on future directions of NAS and AutoML. Given the expertise of the authors, this article mainly focuses on the application of NAS to computer vision problems.
Year | DOI | Venue
---|---|---
2022 | 10.1145/3473330 | ACM Computing Surveys

Keywords | DocType | Volume
---|---|---
AutoML, neural architecture search, weight-sharing, super-network, optimization gap, computer vision | Journal | 54

Issue | ISSN | Citations
---|---|---
9 | 0360-0300 | 0

PageRank | References | Authors
---|---|---
0.34 | 107 | 11

Name | Order | Citations | PageRank
---|---|---|---
Ling-Xi Xie | 1 | 429 | 37.79 |
Xin Chen | 2 | 0 | 0.34 |
Kaifeng Bi | 3 | 0 | 0.34 |
Longhui Wei | 4 | 140 | 7.98 |
Yuhui Xu | 5 | 12 | 5.00 |
Zhengsu Chen | 6 | 0 | 0.34 |
Lanfei Wang | 7 | 0 | 0.34 |
An Xiao | 8 | 0 | 0.34 |
Jianlong Chang | 9 | 0 | 0.34 |
Xiaopeng Zhang | 10 | 293 | 33.34 |
Qi Tian | 11 | 6443 | 331.75 |