Title
A Unified DNN Weight Pruning Framework Using Reweighted Optimization Methods
Abstract
To address the large model size and intensive computation requirements of deep neural networks (DNNs), weight pruning techniques have been proposed; they generally fall into two categories: static regularization-based pruning and dynamic regularization-based pruning. However, the former currently suffers from either complex workloads or accuracy degradation, while the latter takes a long ...
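For context, "reweighted optimization" in this setting generally refers to iteratively re-solving a regularized training problem in which each weight's penalty coefficient is updated from its current magnitude, so that already-small weights are penalized more strongly and driven toward zero. The sketch below illustrates that generic reweighted L1 idea on a toy least-squares "layer"; it is an assumption-laden illustration, not the paper's implementation, and every name and hyperparameter in it (lambda_reg, eps, alpha, the toy data, the thresholds) is invented for the example.

# Hypothetical sketch: generic reweighted L1 regularization for pruning
# a single weight tensor. Not the paper's method; illustrative only.
import torch

torch.manual_seed(0)

# Toy "layer" weights and a toy least-squares objective.
W = torch.randn(64, 32, requires_grad=True)
X = torch.randn(256, 32)
Y = X @ torch.randn(32, 64)

lambda_reg = 1e-2           # regularization strength (illustrative value)
eps = 1e-3                  # small constant that keeps the reweighting stable
alpha = torch.ones_like(W)  # per-weight penalty coefficients, updated each outer loop

opt = torch.optim.SGD([W], lr=1e-2)

for outer in range(5):                 # reweighting iterations
    for _ in range(200):               # inner optimization steps
        opt.zero_grad()
        loss = torch.mean((X @ W.T - Y) ** 2)          # task loss
        reg = lambda_reg * torch.sum(alpha * W.abs())  # reweighted L1 penalty
        (loss + reg).backward()
        opt.step()
    # Reweighting step: weights that are already small receive a larger
    # penalty coefficient, pushing them further toward zero.
    with torch.no_grad():
        alpha = 1.0 / (W.abs() + eps)

# Count near-zero weights as a rough measure of achieved sparsity.
with torch.no_grad():
    sparsity = (W.abs() < 1e-2).float().mean().item()
print(f"fraction of near-zero weights: {sparsity:.2f}")

Compared with a fixed L1 penalty, the reweighting step penalizes the remaining nonzero weights more uniformly and typically yields higher sparsity at a given accuracy; the paper's own formulation and solver may differ from this sketch.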
Year
2021
DOI
10.1109/DAC18074.2021.9586152
Venue
2021 58th ACM/IEEE Design Automation Conference (DAC)
Keywords
Degradation, Training, Deep learning, Design automation, Computational modeling, Optimization methods
DocType
Conference
ISSN
0738-100X
ISBN
978-1-6654-3274-0
Citations
1
PageRank
0.37
References
0
Authors
7
Name            Order   Citations/PageRank
Tianyun Zhang   1       316.42
Xiaolong Ma     2       225.90
Zhan Zheng      3       54.59
Shanglin Zhou   4       10.37
Caiwen Ding     5       14226.52
Makan Fardad    6       111.70
Yanzhi Wang     7       1082136.11