Title
Challenges and Opportunities of DNN Model Execution Caching
Abstract
We explore the opportunities and challenges of model execution caching, a nascent research area that promises to improve the performance of cloud-based deep inference serving. Broadly, model execution caching relies on servers that are geographically close to the end-device to service inference requests, resembling a traditional content delivery network (CDN). However, unlike a CDN, such schemes cache execution rather than static objects. We identify the key challenges inherent to this problem domain and describe the similarities and differences with existing caching techniques. We further introduce several emergent concepts unique to this domain, such as memory-adaptive models and multi-model hosting, which allow us to make dynamic adjustments to the memory requirements of model execution.
Year
2019
DOI
10.1145/3366622.3368147
Venue
Proceedings of the Workshop on Distributed Infrastructures for Deep Learning
Keywords
caching algorithms, deep learning, edge server
DocType
Conference
ISBN
978-1-4503-7037-0
Citations
1
PageRank
0.37
References
0
Authors
4
Name | Order | Citations | PageRank
Guin R. Gilman | 1 | 1 | 0.37
Samuel S. Ogden | 2 | 2 | 2.08
Robert J. Walls | 3 | 85 | 10.19
Tian Guo | 4 | 1 | 0.70