Title: Learning Markov Structure by Maximum Entropy Relaxation
Abstract:
We propose a new approach for learning a sparse graphical model approximation to a specified multivariate probability distribution (such as the empirical distribution of sample data). The selection of sparse graph structure arises naturally in our approach through solution of a convex optimization problem, which differentiates our method from standard combinatorial approaches. We seek the maximum entropy relaxation (MER) within an exponential family, which maximizes entropy subject to constraints that marginal distributions on small subsets of variables are close to the prescribed marginals in relative entropy. To solve MER, we present a modified primal-dual interior point method that exploits sparsity of the Fisher information matrix in models defined on chordal graphs. This leads to a tractable, scalable approach provided the level of relaxation in MER is sufficient to obtain a thin graph. The merits of our approach are investigated by recovering the structure of some simple graphical models from sample data.
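The optimization problem described in the abstract can be summarized in the following sketch, using notation assumed here rather than taken from the paper itself: $h(p)$ for the entropy, $D(\cdot\,\|\,\cdot)$ for relative entropy (KL divergence), $p_E$ for the marginal of $p$ on a small variable subset $E$, $p^{*}_{E}$ for the prescribed marginal, and $\delta_E \ge 0$ for the relaxation level.

```latex
\begin{aligned}
\max_{p \in \mathcal{F}} \quad & h(p) \\
\text{s.t.} \quad & D\!\left(p_E \,\middle\|\, p^{*}_{E}\right) \le \delta_E
  \quad \text{for each small subset } E \subset V,
\end{aligned}
```

Here $\mathcal{F}$ denotes the chosen exponential family; setting every $\delta_E = 0$ would force exact marginal matching, while positive $\delta_E$ relaxes the constraints and, per the abstract, induces sparse graph structure in the solution.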
Year: 2007
Venue: AISTATS
Keywords: chordal graph, maximum entropy, graphical model, Fisher information matrix, relative entropy, empirical distribution
Field: Entropy rate, Mathematical optimization, Maximum-entropy Markov model, Maximum entropy thermodynamics, Joint entropy, Artificial intelligence, Graphical model, Principle of maximum entropy, Machine learning, Kullback–Leibler divergence, Mathematics, Maximum entropy probability distribution
DocType: Journal
Citations: 5
PageRank: 0.63
References: 10
Authors: 3
Name                    Order  Citations  PageRank
Jason K. Johnson        1      201        14.07
Venkat Chandrasekaran   2      716        37.92
Alan S. Willsky         3      74668      47.01