Abstract |
---|
This paper presents a computationally efficient machine-learned method for natural language response suggestion. Feed-forward neural networks using n-gram embedding features encode messages into vectors which are optimized to give message-response pairs a high dot-product value. An optimized search finds response suggestions. The method is evaluated in a large-scale commercial e-mail application, Inbox by Gmail. Compared to a sequence-to-sequence approach, the new system achieves the same quality at a small fraction of the computational requirements and latency. |
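The abstract describes encoding messages and responses into vectors with n-gram embedding features, then scoring candidates by dot product. A minimal sketch of that retrieval idea is below; the hashed n-gram buckets, embedding dimension, random embedding table, and toy responses are illustrative assumptions, not the paper's trained model or configuration.

```python
import numpy as np

DIM = 16      # embedding dimension (assumed, illustrative)
VOCAB = 1000  # hashed n-gram buckets (assumed, illustrative)
rng = np.random.default_rng(0)
EMB = rng.standard_normal((VOCAB, DIM))  # stand-in for a learned embedding table

def encode(text: str) -> np.ndarray:
    """Average hashed unigram embeddings into one L2-normalized vector."""
    ids = [hash(g) % VOCAB for g in text.lower().split()]
    v = EMB[ids].mean(axis=0)
    return v / np.linalg.norm(v)

def suggest(message: str, responses: list[str]) -> str:
    """Return the candidate response with the highest dot-product score."""
    m = encode(message)
    scores = [float(m @ encode(r)) for r in responses]
    return responses[int(np.argmax(scores))]
```

In the paper, both towers are trained so that true message-response pairs score highly, and an optimized (approximate) search replaces the exhaustive loop over candidates shown here.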
Year | Venue | Field |
---|---|---|
2017 | arXiv: Computation and Language | ENCODE, Embedding, Computer science, Latency (engineering), Speech recognition, Natural language, Artificial intelligence, Artificial neural network |

DocType | Volume | Citations |
---|---|---|
Journal | abs/1705.00652 | 8 |

PageRank | References | Authors |
---|---|---|
0.53 | 19 | 9 |
Name | Order | Citations | PageRank |
---|---|---|---|
Matthew Henderson | 1 | 158 | 8.90 |
Rami Al-Rfou' | 2 | 1531 | 49.60 |
Brian Strope | 3 | 95 | 10.99 |
Yun-Hsuan Sung | 4 | 70 | 8.20 |
László Lukács | 5 | 49 | 1.92 |
Ruiqi Guo | 6 | 13 | 3.36 |
Sanjiv Kumar | 7 | 2182 | 153.05 |
Balint Miklos | 8 | 130 | 6.16 |
Ray Kurzweil | 9 | 47 | 3.49 |