Title
Bucket renormalization for approximate inference
Abstract
Probabilistic graphical models are a key tool in machine learning applications. Computing the partition function, i.e. the normalizing constant, is a fundamental task of statistical inference, but it is generally computationally intractable, leading to extensive study of approximation methods. Iterative variational methods are a popular and successful family of approaches. However, even state-of-the-art variational methods can return poor results or fail to converge on difficult instances. In this paper, we instead consider computing the partition function via sequential summation over variables. We develop robust approximate algorithms by combining ideas from mini-bucket elimination with tensor network and renormalization group methods from statistical physics. The resulting 'convergence-free' methods show good empirical performance on both synthetic and real-world benchmark models, even for difficult instances.
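The abstract refers to computing the partition function via sequential summation over variables. Below is a minimal, hypothetical Python sketch of that exact elimination step on a toy pairwise binary model; it is not the paper's bucket-renormalization algorithm, and the toy model, function names, and parameters are assumptions made purely for illustration.

```python
import itertools
import numpy as np

def brute_force_Z(factors, n, k=2):
    """Ground truth: enumerate all k**n states and sum the factor products."""
    Z = 0.0
    for x in itertools.product(range(k), repeat=n):
        p = 1.0
        for scope, table in factors:
            p *= table[tuple(x[v] for v in scope)]
        Z += p
    return Z

def _sum_out(touching, scope, v, k=2):
    """Multiply all factors in `touching` over the merged `scope`, then sum out v."""
    new_scope = [u for u in scope if u != v]
    new_table = np.zeros([k] * len(new_scope))
    for assign in itertools.product(range(k), repeat=len(scope)):
        state = dict(zip(scope, assign))
        val = 1.0
        for s, t in touching:
            val *= t[tuple(state[u] for u in s)]
        new_table[tuple(state[u] for u in new_scope)] += val
    return tuple(new_scope), new_table

def eliminate_Z(factors, n, k=2):
    """Exactly sum variables out one at a time (bucket/variable elimination)."""
    factors = [(tuple(s), np.asarray(t, dtype=float)) for s, t in factors]
    for v in range(n):
        touching = [f for f in factors if v in f[0]]
        factors = [f for f in factors if v not in f[0]]
        if touching:
            scope = sorted(set().union(*(set(s) for s, _ in touching)))
            factors.append(_sum_out(touching, scope, v, k))
    Z = 1.0
    for _, t in factors:  # all remaining factors are scalars
        Z *= float(t)
    return Z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 6
    # Ring of pairwise factors exp(J_ij * s_i * s_j), spins s in {-1, +1}.
    factors = []
    for i in range(n):
        j = (i + 1) % n
        J = rng.normal()
        table = np.array([[np.exp(J * a * b) for b in (-1, 1)] for a in (-1, 1)])
        factors.append(((i, j), table))
    print(brute_force_Z(factors, n), eliminate_Z(factors, n))  # the two agree
```

Exact elimination like this becomes intractable when the merged scopes grow; mini-bucket and the paper's bucket-renormalization methods approximate precisely this step.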
Year
2018
DOI
10.1088/1742-5468/ab3218
Venue
JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT
Keywords
machine learning
Field
Renormalization, Applied mathematics, Mathematical optimization, Tensor, Partition function (statistical mechanics), Approximate inference, Statistical inference, Normalizing constant, Graphical model, Mathematics, Renormalization group
DocType
Conference
Volume
2019
Issue
12
ISSN
1742-5468
Citations
1
PageRank
0.35
References
2
Authors
4
Name             | Order | Citations | PageRank
Ahn, Sung-Soo    | 1     | 4         | 4.53
Michael Chertkov | 2     | 4655      | 9.33
Adrian Weller    | 3     | 1412      | 7.59
Jinwoo Shin      | 4     | 5135      | 6.35