Title
Searching Toward Pareto-Optimal Device-Aware Neural Architectures
Abstract
Recent breakthroughs in Neural Architecture Search (NAS) have achieved state-of-the-art performance on many tasks such as image classification and language understanding. However, most existing works optimize only for model accuracy and largely ignore other important constraints imposed by the underlying hardware and devices, such as inference latency and energy consumption. In this paper, we first introduce the problem of NAS and survey recent works. We then take a deep dive into two recent advances that extend NAS into a multi-objective framework: MONAS [19] and DPP-Net [10]. Both MONAS and DPP-Net can jointly optimize accuracy and device-imposed objectives, searching for neural architectures that are best suited for deployment on a wide spectrum of devices, from embedded systems and mobile devices to workstations. Experimental results show that the architectures found by MONAS and DPP-Net achieve Pareto optimality w.r.t. the given objectives on various devices.
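The abstract's central notion is Pareto optimality over device-aware objectives, e.g., maximizing accuracy while minimizing latency. The sketch below is a minimal, illustrative Pareto-front filter over a hypothetical pool of candidate architectures; the architecture names and numbers are made-up placeholders, and this is not the actual search procedure of MONAS or DPP-Net.

```python
from typing import List, Tuple

# Hypothetical candidate pool: (name, top-1 accuracy, latency in ms).
# Values are illustrative placeholders, not results from the paper.
CANDIDATES: List[Tuple[str, float, float]] = [
    ("arch_a", 0.92, 80.0),
    ("arch_b", 0.90, 35.0),
    ("arch_c", 0.88, 60.0),  # dominated by arch_b: lower accuracy, higher latency
    ("arch_d", 0.85, 20.0),
]


def dominates(p: Tuple[str, float, float], q: Tuple[str, float, float]) -> bool:
    """True if p is at least as good as q on both objectives and strictly
    better on at least one (accuracy maximized, latency minimized)."""
    better_or_equal = p[1] >= q[1] and p[2] <= q[2]
    strictly_better = p[1] > q[1] or p[2] < q[2]
    return better_or_equal and strictly_better


def pareto_front(candidates: List[Tuple[str, float, float]]) -> List[Tuple[str, float, float]]:
    """Keep only the candidates that no other candidate dominates."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]


if __name__ == "__main__":
    for name, acc, lat in pareto_front(CANDIDATES):
        print(f"{name}: accuracy={acc:.2f}, latency={lat:.1f} ms")
```

On this toy pool, arch_c is filtered out because arch_b is both more accurate and faster; the remaining candidates form the accuracy/latency trade-off front in the sense used above.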
Year
2018
DOI
10.1145/3240765.3243494
Venue
ICCAD
DocType
Conference
Volume
abs/1808.09830
ISSN
1933-7760
ISBN
978-1-4503-5950-4
Citations
3
PageRank
0.37
References
17
Authors
10
Name                Order    Citations    PageRank
An-Chieh Cheng      1        13           2.93
Jin-Dong Dong       2        6            1.74
Chi-Hung Hsu        3        3            0.37
Shu-Huan Chang      4        3            0.37
Min Sun             5        1083         59.15
Shih-Chieh Chang    6        641          52.31
Jia-yu Pan          7        1158         57.82
Yu-Ting Chen        8        10           3.23
Wei Wei             9        201          16.60
Da-Cheng Juan       10       195          20.47