Title: NASH: Toward End-to-End Neural Architecture for Generative Semantic Hashing
Abstract
Semantic hashing has become a powerful paradigm for fast similarity search in many information retrieval systems. While fairly successful, previous techniques generally require two-stage training, and the binary constraints are handled in an ad hoc manner. In this paper, we present an end-to-end Neural Architecture for Semantic Hashing (NASH), where the binary hashing codes are treated as Bernoulli latent variables. A neural variational inference framework is proposed for training, where gradients are directly back-propagated through the discrete latent variable to optimize the hash function. We also draw connections between the proposed method and rate-distortion theory, which provides a theoretical foundation for the effectiveness of the proposed framework. Experimental results on three public datasets demonstrate that our method significantly outperforms several state-of-the-art models in both unsupervised and supervised scenarios.
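The abstract's key mechanism, back-propagating gradients through discrete Bernoulli latent codes, is commonly realized with a straight-through gradient estimator. A minimal NumPy sketch of that idea follows; it is illustrative only, and the function names and the exact estimator here are assumptions, not the paper's actual implementation:

```python
import numpy as np

def st_bernoulli_forward(logits, rng):
    """Sample binary codes z ~ Bernoulli(sigmoid(logits)).

    Returns the hard {0, 1} codes and the underlying probabilities,
    which are needed for the straight-through backward pass.
    """
    probs = 1.0 / (1.0 + np.exp(-logits))          # sigmoid
    z = (rng.random(logits.shape) < probs).astype(float)
    return z, probs

def st_bernoulli_backward(grad_z, probs):
    """Straight-through estimator (an assumption for illustration):
    treat the non-differentiable sampling step as the identity w.r.t.
    probs, so the incoming gradient flows to the logits scaled only by
    the sigmoid derivative probs * (1 - probs).
    """
    return grad_z * probs * (1.0 - probs)
```

The design point is that the forward pass stays truly binary (so the learned codes are valid hash codes), while the backward pass pretends the sampler was smooth, which lets an ordinary optimizer train the encoder end to end.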
Year: 2018
Venue: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL), Vol 1
Field: Inference, End-to-end principle, Computer science, Latent variable, Hash function, Artificial intelligence, Generative grammar, Machine learning, Nearest neighbor search, Bernoulli's principle, Binary number
Volume: abs/1805.05361
Citations: 2
PageRank: 0.36
References: 31
Authors: 7
Name                 Order  Citations  PageRank
Dinghan Shen         1      108        10.37
Qinliang Su          2      55         10.07
Paidamoyo Chapfuwa   3      2          1.04
Wenlin Wang          4      2          2.05
Guoyin Wang          5      24         7.38
Lawrence Carin       6      2          1.37
Ricardo Henao        7      286        23.85