Abstract |
---|
Recently, edge-device training has become an urgent necessity, since it can enhance model adaptability without incurring high transmission costs or privacy issues. Because a wide data range and high data precision are needed to improve accuracy, DNN training requires much wider floating-point (FP) data for convolution and complicated arithmetic for batch normalization. These lead to massive comp... |
Year | DOI | Venue |
---|---|---|
2021 | 10.1109/AICAS51828.2021.9458421 | 2021 IEEE 3rd International Conference on Artificial Intelligence Circuits and Systems (AICAS) |
Keywords | DocType | ISBN
---|---|---
Training, Technological innovation, Privacy, Power demand, Convolution, Random access memory, Transforms | Conference | 978-1-6654-1913-0
Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors |
---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Yang Wang | 1 | 1 | 1.03 |
Dazheng Deng | 2 | 0 | 1.69 |
Leibo Liu | 3 | 816 | 116.95
Shaojun Wei | 4 | 3 | 1.40 |
Shouyi Yin | 5 | 579 | 99.95