Title
Improved Gradient-Based Optimization Over Discrete Distributions
Abstract
In many applications we seek to maximize an expectation with respect to a distribution over discrete variables. Estimating gradients of such objectives with respect to the distribution parameters is a challenging problem. We analyze existing solutions, including finite-difference (FD) estimators and continuous relaxation (CR) estimators, in terms of bias and variance. We show that the commonly used Gumbel-Softmax estimator is biased and propose a simple method to reduce this bias. We also derive a simpler piecewise-linear continuous relaxation that likewise has reduced bias. We demonstrate empirically that reduced bias leads to better performance in variational inference and on binary optimization tasks.
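For context, the Gumbel-Softmax estimator analyzed in the abstract relaxes discrete categorical samples into points on the probability simplex. Below is a minimal NumPy sketch of the standard relaxation; the function name and the temperature parameter `tau` are illustrative choices, and the paper's proposed bias-reduction method and piecewise-linear relaxation are not reproduced here.

```python
import numpy as np

def gumbel_softmax_sample(logits, tau=1.0, rng=None):
    """Draw a relaxed one-hot sample from a categorical with given logits.

    Illustrative sketch of the standard Gumbel-Softmax relaxation: as
    tau -> 0 the sample approaches a discrete one-hot vector, but the
    gradient variance grows; larger tau gives smoother gradients at the
    cost of more relaxation bias (the bias the paper seeks to reduce).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise via the inverse-CDF trick; keep u away from 0.
    u = rng.uniform(low=1e-10, high=1.0, size=np.shape(logits))
    g = -np.log(-np.log(u))
    # Temperature-scaled softmax over the perturbed logits,
    # shifted by the max for numerical stability.
    z = (logits + g) / tau
    z = z - z.max(axis=-1, keepdims=True)
    y = np.exp(z)
    return y / y.sum(axis=-1, keepdims=True)

# Example: a relaxed sample from a 3-way categorical distribution.
sample = gumbel_softmax_sample(np.log([0.2, 0.3, 0.5]), tau=0.5)
```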
Year
2018
Venue
arXiv: Machine Learning
Field
Mathematical optimization, Inference, Binary optimization, Mathematics, Estimator
Volume
abs/1810.00116
Citations
1
PageRank
0.35
References
0
Authors
3
Name                 Order  Citations  PageRank
Evgeny Andriyash     1      12         2.80
Arash Vahdat         2      353        18.20
William G. Macready  3      161        39.07