Title
Accelerating Message Passing for MAP with Benders Decomposition.
Abstract
We introduce a novel mechanism to tighten the local polytope relaxation for MAP inference in Markov random fields with low-state-space variables. We consider a surjection of the variables onto a set of hyper-variables and apply the local polytope relaxation over these hyper-variables. The state space of each individual hyper-variable is constructed to be enumerable, while the vector product of the state spaces of pairs of hyper-variables is not easily enumerable, which makes message passing inference intractable. To circumvent the difficulty of enumerating this product, we introduce a novel Benders decomposition approach. It produces an upper envelope describing the message, constructed from affine functions of the individual variables that compose the hyper-variable receiving the message. The envelope is tight at the minimizers, which are shared with the true message. Benders rows are constructed to be Pareto optimal and are generated by an efficient procedure tailored to binary problems.
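To make the construction concrete, the following is a minimal, self-contained sketch of the generic Benders-cut mechanism the abstract alludes to, applied to a small artificial problem with binary master variables rather than to the paper's message-passing setting. The problem data, the helpers dual_subproblem and envelope, and the enumeration-based master are illustrative assumptions, not the authors' implementation; the sketch only shows how affine Benders rows in the binary variables form an upper envelope that is tight at the minimizer.

# Minimal sketch (assumed toy setting, not the paper's code) of Benders cuts on
#   min_x  f^T x + min_{y >= 0} { c^T y : A y >= b - B x },
# where x is a small vector of binary "master" variables (playing the role of the
# variables composing the hyper-variable that receives the message) and the inner
# minimization stands in for the part that is intractable to enumerate jointly.
import itertools
import numpy as np
from scipy.optimize import linprog


def dual_subproblem(d, A, c):
    """Solve max_u d^T u  s.t.  A^T u <= c, u >= 0 (dual of the inner LP).

    Any dual-feasible u yields the affine lower bound u^T (b - B x) on the
    inner minimum, valid for every x by weak duality -- a Benders row.
    """
    res = linprog(-d, A_ub=A.T, b_ub=c,
                  bounds=[(0, None)] * A.shape[0], method="highs")
    assert res.success
    return res.x


# Toy data (illustrative only): 2 binary master variables, 2 inner variables.
f = np.array([1.0, 2.0])                 # master costs
c = np.array([1.0, 1.0])                 # inner costs
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])               # inner constraint matrix
b = np.array([3.0, 3.0])
B = np.array([[1.0, 0.0],
              [0.0, 2.0]])               # coupling between x and the constraints

cuts = []                                # (alpha, beta) pairs: cut(x) = alpha + beta^T x
best, x_star = np.inf, None
for _ in range(10):                      # Benders loop
    # Master step: the binary state space is enumerable, so minimize the upper
    # envelope of the cuts gathered so far by enumeration (with no cuts yet the
    # envelope is -inf and any x serves as the first query point).
    def envelope(x):
        return max((a + be @ x for a, be in cuts), default=-np.inf)
    x_hat = min((np.array(x) for x in itertools.product([0, 1], repeat=2)),
                key=lambda x: f @ x + envelope(x))
    lower = f @ x_hat + envelope(x_hat)

    # Subproblem at x_hat: exact inner value and a new Benders row.
    u = dual_subproblem(b - B @ x_hat, A, c)
    exact = u @ (b - B @ x_hat)          # equals the inner minimum by strong duality
    cuts.append((u @ b, -(B.T @ u)))     # affine in the binary master variables x

    upper = f @ x_hat + exact
    if upper < best:
        best, x_star = upper, x_hat
    if best - lower < 1e-9:              # envelope minimum matches the best exact value
        break

print("minimizer:", x_star, "value:", best)

Enumerating the master's binary states mirrors the abstract's premise that the state space of each hyper-variable is itself enumerable; only the coupled minimization is delegated to the subproblem, whose dual solutions supply the affine rows of the envelope.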
Year
2018
Venue
arXiv: Learning
Field
Affine transformation, Mathematical optimization, Random field, Inference, Markov chain, Algorithm, Polytope, State space, Surjective function, Message passing, Mathematics
DocType
Journal
Volume
abs/1805.04958
Citations
1
PageRank
0.35
References
0
Authors
2
Name            Order  Citations  PageRank
Julian Yarkony  1      76         9.20
Shaofei Wang    2      4          5.15