Title
CryptoGRU: Low Latency Privacy-Preserving Text Analysis With GRU
Abstract
Billions of text analysis requests containing private emails, personal text messages, and sensitive online reviews are processed by recurrent neural networks (RNNs) deployed on public clouds every day. Although prior secure networks combine homomorphic encryption (HE) and garbled circuits (GC) to preserve users' privacy, naively adopting the HE and GC hybrid technique to implement RNNs suffers from long inference latency due to slow activation functions. In this paper, we present an HE and GC hybrid gated recurrent unit (GRU) network, CryptoGRU, for low-latency secure inference. CryptoGRU replaces the computationally expensive GC-based $tanh$ with a fast GC-based $ReLU$, and then quantizes $sigmoid$ and $ReLU$ to a smaller bit length to accelerate activations in a GRU. We evaluate CryptoGRU with multiple GRU models trained on 4 public datasets. Experimental results show that CryptoGRU achieves top-notch accuracy and improves secure inference latency by up to $138\times$ over one of the state-of-the-art secure networks on the Penn Treebank dataset.
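The abstract's core optimization can be illustrated with a short plaintext sketch: a GRU cell whose candidate activation uses ReLU in place of tanh (ReLU is much cheaper to evaluate as a garbled circuit), with gate and candidate outputs quantized to a smaller bit length. This is a minimal sketch assuming NumPy and plaintext arithmetic; the actual CryptoGRU evaluates its operations under HE and GC, and the bit width, clipping range, and helper names below are illustrative assumptions, not the paper's implementation.

import numpy as np

def quantize(x, bits=6, clip=4.0):
    # Uniform fixed-point quantizer: clamp to [-clip, clip] and snap to
    # 2**bits levels. Bit width and range are assumed for illustration.
    step = 2.0 * clip / (2 ** bits - 1)
    return np.round(np.clip(x, -clip, clip) / step) * step

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def cryptogru_cell(x_t, h_prev, Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh):
    # One GRU step with CryptoGRU-style activation changes: the update
    # and reset gates keep sigmoid, the candidate state uses ReLU
    # instead of tanh, and every activation output is quantized.
    z = quantize(sigmoid(x_t @ Wz + h_prev @ Uz + bz))   # update gate
    r = quantize(sigmoid(x_t @ Wr + h_prev @ Ur + br))   # reset gate
    h_cand = quantize(np.maximum(0.0, x_t @ Wh + (r * h_prev) @ Uh + bh))
    return (1.0 - z) * h_prev + z * h_cand

# Tiny usage example with random weights (input size 4, hidden size 8).
rng = np.random.default_rng(0)
d_in, d_h = 4, 8
params = [rng.standard_normal(s) * 0.1
          for s in [(d_in, d_h), (d_h, d_h), (d_h,)] * 3]
h = np.zeros(d_h)
for _ in range(3):
    h = cryptogru_cell(rng.standard_normal(d_in), h, *params)
print(h.shape)  # (8,)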
Year
2021
Venue
EMNLP (The 2021 Conference on Empirical Methods in Natural Language Processing)
DocType
Conference
Volume
2021.emnlp-main
ISSN
N/A
Citations
0
PageRank
0.34
References
0
Authors
4
Name          Order  Citations  PageRank
Bo Feng       1      0          0.34
Qian Lou      2      5          4.42
Lei Jiang     3      412        25.59
Geoffrey Fox  4      40705      75.38