Title
Melon: breaking the memory wall for resource-efficient on-device machine learning
Abstract
On-device learning is a promising technique for emerging privacy-preserving machine learning paradigms. However, through quantitative experiments, we find that commodity mobile devices cannot adequately support state-of-the-art DNN training with a sufficiently large batch size, due to their limited memory capacity. To fill the gap, we propose Melon, a memory-friendly on-device learning framework that enables training tasks with batch sizes beyond the physical memory capacity. Melon judiciously retrofits existing memory-saving techniques, namely recomputation and micro-batching, to fit resource-constrained mobile devices. Melon further incorporates novel techniques to address high memory fragmentation and to adapt to changing memory budgets. We implement and evaluate Melon with various typical DNN models on commodity mobile devices. The results show that Melon achieves up to 4.33× larger batch size under the same memory budget. Given the same batch size, Melon achieves 1.89× higher training throughput on average (up to 4.01×) and saves up to 49.43% energy compared to competitive alternatives. Furthermore, Melon reduces the computation required for memory budget adaptation by 78.59% on average.
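To make the micro-batching technique named in the abstract concrete, below is a minimal, hypothetical sketch of gradient accumulation on a toy 1-D linear model (y = w·x with squared loss). It is not Melon's implementation; all function names are illustrative. The point is that splitting a batch into micro-batches and accumulating size-weighted gradients yields exactly the full-batch gradient while only one micro-batch's activations need to reside in memory at a time.

```python
def grad_squared_loss(w, xs, ys):
    """Average gradient of (w*x - y)^2 over a batch: d/dw = 2*(w*x - y)*x."""
    n = len(xs)
    return sum(2.0 * (w * x - y) * x for x, y in zip(xs, ys)) / n

def accumulated_grad(w, xs, ys, micro_batch_size):
    """Process the batch in micro-batches, accumulating gradients.

    Only one micro-batch's activations would be live at a time, which is
    why the technique lowers peak memory on resource-constrained devices.
    """
    n = len(xs)
    total = 0.0
    for i in range(0, n, micro_batch_size):
        mb_x = xs[i:i + micro_batch_size]
        mb_y = ys[i:i + micro_batch_size]
        # Weight each micro-batch gradient by its size so the accumulated
        # result equals the full-batch average gradient exactly.
        total += grad_squared_loss(w, mb_x, mb_y) * len(mb_x)
    return total / n

xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0]  # ground truth: w = 2
w = 0.5
full = grad_squared_loss(w, xs, ys)
micro = accumulated_grad(w, xs, ys, micro_batch_size=2)
assert abs(full - micro) < 1e-12  # identical update, smaller peak memory
```

Because the accumulated gradient is mathematically identical to the full-batch one, micro-batching trades extra sequential passes for a smaller peak memory footprint, without changing the training trajectory.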
Year: 2022
DOI: 10.1145/3498361.3538928
Venue: Mobile Systems, Applications, and Services
DocType: Conference
Citations: 1
PageRank: 0.35
References: 0
Authors: 9
Name           Order  Citations  PageRank
Qipeng Wang    1      1          0.69
Mengwei Xu     2      66         8.32
Chao Jin       3      1          0.35
Xinran Dong    4      1          0.69
Jinliang Yuan  5      1          0.35
Xin Jin        6      3336       2.83
Gang Huang     7      12231      10.80
Yunxin Liu     8      6945       4.18
Xuanzhe Liu    9      6895       7.53