Title
Low Latency Privacy Preserving Inference.
Abstract
When applying machine learning to sensitive data, one has to balance accuracy, information leakage, and computational complexity. Recent studies have shown that Homomorphic Encryption (HE) can be used to protect against information leakage when applying neural networks. However, this comes at the cost of limiting the width and depth of the networks that can be used (and hence their accuracy) and of latency on the order of several minutes, even for relatively simple networks. In this study we provide two solutions that address these limitations. In the first solution, we present more than $10\times$ improvement in latency and enable inference on wider networks compared to prior attempts with the same level of security. The improved performance is achieved via a collection of methods to better represent the data during the computation. In the second solution, we apply transfer learning to provide private inference services using deep networks with latency of less than 0.2 seconds. We demonstrate the efficacy of our methods on several computer vision tasks.
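The second solution described in the abstract relies on a client extracting features with a public deep network, encrypting only those features, and having the server apply a small classifier under encryption. The sketch below illustrates that general pattern only; it is not the paper's implementation, and it assumes the TenSEAL CKKS library together with hypothetical feature dimensions, weights, and bias.

```python
# Minimal sketch of private inference over deep features (transfer-learning style).
# Assumptions: TenSEAL CKKS scheme; the 512-dim features, weights, and bias below
# are hypothetical placeholders, not the paper's trained model.
import numpy as np
import tenseal as ts

# --- Client side: set up a CKKS context and encrypt the extracted features ---
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()

features = np.random.randn(512)               # placeholder for deep-network features
enc_features = ts.ckks_vector(context, features.tolist())

# --- Server side: evaluate a small plaintext linear classifier on the ciphertext ---
num_classes = 10
weights = np.random.randn(512, num_classes)   # hypothetical trained linear layer
bias = np.random.randn(num_classes)

enc_scores = enc_features.mm(weights.tolist()) + bias.tolist()

# --- Client side: decrypt the class scores and read off the prediction ---
scores = enc_scores.decrypt()
print("predicted class:", int(np.argmax(scores)))
```

Because only the final linear layer is evaluated homomorphically, the ciphertext computation stays shallow, which is what makes sub-second latency plausible in this setting.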
Year
2018
Venue
arXiv: Learning
Field
Computer science, Inference, Artificial intelligence, Latency (engineering), Machine learning
DocType
Journal
Volume
abs/1812.10659
Citations
1
PageRank
0.35
References
0
Authors
3
Name                 Order  Citations  PageRank
Alon Brutzkus        1      1          1.70
Oren Elisha          2      3          1.45
Ran Gilad-Bachrach   3      10         1.89