Title
Privacy-Preserving Deep Inference for Rich User Data on The Cloud.
Abstract
Deep neural networks are increasingly being used in a variety of machine learning applications applied to rich user data on the cloud. However, this approach introduces a number of privacy and efficiency challenges, as the cloud operator can perform secondary inferences on the available data. Recently, advances in edge processing have paved the way for more efficient and private data processing at the source for simple tasks and lighter models, though larger and more complex models remain a challenge. In this paper, we present a hybrid approach for breaking down large, complex deep models for cooperative, privacy-preserving analytics. We do this by breaking down popular deep architectures and fine-tuning them in a particular way. We then evaluate the privacy benefits of this approach based on the information exposed to the cloud service. We also assess the local inference cost of different layers on a modern handset for mobile applications. Our evaluations show that, by using certain fine-tuning and embedding techniques and at a small processing cost, we can greatly reduce the level of information available to unintended tasks applied to the data features on the cloud, and hence achieve the desired tradeoff between privacy and performance.
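The layer-separation idea described in the abstract, running the early layers of a deep network on the handset and sending only the resulting intermediate embedding to the cloud, where the remaining layers complete the primary task, can be sketched as follows. This is a minimal illustrative sketch, not the authors' released code: the use of PyTorch/torchvision, the choice of VGG-16, the split index, and the helper name build_split_model are all assumptions made for the example.

```python
# Minimal sketch of splitting a deep model between an edge device and the cloud.
# Assumptions (not from the paper): PyTorch/torchvision, VGG-16, split index 10.

import torch
import torch.nn as nn
from torchvision import models

SPLIT_INDEX = 10  # assumed split point inside VGG-16's convolutional stack


def build_split_model(split_index: int = SPLIT_INDEX):
    vgg = models.vgg16(weights=None)  # load pretrained weights in practice
    features = list(vgg.features.children())

    # On-device part: early layers that turn the raw image into an intermediate feature map.
    edge_model = nn.Sequential(*features[:split_index])

    # Cloud part: the remaining convolutional layers plus the original classifier head.
    cloud_model = nn.Sequential(
        *features[split_index:],
        vgg.avgpool,
        nn.Flatten(),
        vgg.classifier,
    )
    return edge_model, cloud_model


if __name__ == "__main__":
    edge_model, cloud_model = build_split_model()
    image = torch.randn(1, 3, 224, 224)       # dummy input standing in for a user photo
    with torch.no_grad():
        embedding = edge_model(image)         # computed locally on the handset
        logits = cloud_model(embedding)       # only the embedding leaves the device
    print(embedding.shape, logits.shape)
```

In this arrangement the cloud service only ever receives the intermediate embedding, which is the quantity whose exposure to unintended secondary inferences the paper evaluates.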
Year
2017
Venue
arXiv: Computer Vision and Pattern Recognition
Field
Deep inference, Data mining, Data processing, Embedding, Computer science, Inference, Operator (computer programming), Artificial intelligence, Handset, Analytics, Machine learning, Cloud computing
DocType
Volume
abs/1710.01727
Citations
0
Journal
PageRank
0.34
References
27
Authors
7
Name | Order | Citations | PageRank
Seyed Ali Ossia | 1 | 23 | 2.09
Ali Shahin Shamsabadi | 2 | 21 | 5.12
Ali Taheri | 3 | 30 | 2.18
Kleomenis Katevas | 4 | 39 | 5.89
Hamid R. Rabiee | 5 | 336 | 41.77
Nicholas D. Lane | 6 | 4247 | 248.15
Hamed Haddadi | 7 | 223 | 22.94