Title: Learning Mixtures of DAG Models
Abstract: We describe computationally efficient methods for learning mixtures in which each component is a directed acyclic graphical model (mixtures of DAGs, or MDAGs). We argue that simple search-and-score algorithms are infeasible for a variety of problems, and introduce a feasible approach in which parameter and structure search is interleaved and expected data is treated as real data. Our approach can be viewed as a combination of (1) the Cheeseman–Stutz asymptotic approximation for model posterior probability and (2) the Expectation–Maximization algorithm. We evaluate our procedure for selecting among MDAGs on synthetic and real examples.
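The abstract's key idea of "treating expected data as real data" can be illustrated with a minimal EM loop for a simple mixture model. This is an illustrative sketch only, not the paper's MDAG algorithm: it uses a two-component mixture of independent Bernoulli variables (no DAG structure search, no Cheeseman–Stutz scoring), and all function and variable names are hypothetical. The E-step produces fractional component assignments, and the M-step re-estimates parameters from those fractional (expected) counts exactly as it would from complete data.

```python
import numpy as np

def em_bernoulli_mixture(X, n_components=2, n_iters=50, seed=0):
    """EM for a mixture of independent Bernoulli variables (illustrative sketch).

    X is an (n, d) array of 0/1 observations. Returns mixing weights pi
    and per-component success probabilities theta with shape (k, d).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(n_components, 1.0 / n_components)            # mixing weights
    theta = rng.uniform(0.25, 0.75, size=(n_components, d))   # P(x_j = 1 | component)
    for _ in range(n_iters):
        # E-step: responsibilities r[i, k] = P(component k | row i),
        # computed in log space for numerical stability.
        log_p = (X @ np.log(theta).T
                 + (1 - X) @ np.log(1 - theta).T
                 + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: treat the expected (fractional) counts as if they were
        # real complete data and re-estimate by weighted maximum likelihood.
        nk = r.sum(axis=0)
        pi = nk / n
        theta = (r.T @ X + 1e-3) / (nk[:, None] + 2e-3)       # light smoothing
    return pi, theta
```

In the paper's setting the same interleaving idea applies, but the M-step would also search over DAG structures scored on the expected sufficient statistics rather than just re-fitting fixed-structure parameters.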
Year: 2013
Venue: UAI'98 Proceedings of the Fourteenth conference on Uncertainty in artificial intelligence
Keywords: cheeseman-stutz asymptotic approximation, simple search-and-score algorithm, acyclic graphical model, computationally efficient method, real example, dag model, expected data, feasible approach, expectation-maximization algorithm, model posterior probability, expectation maximization, graphical model, random variable, missing data, posterior probability, model selection, expectation maximization algorithm
Field: Computer science, Posterior probability, Artificial intelligence, Graphical model, Machine learning
DocType: Journal
Volume: abs/1301.7415
ISBN: 1-55860-555-X
Citations: 29
PageRank: 32.34
References: 12
Authors: 4
Name                     | Order | Citations | PageRank
Bo Thiesson              | 1     | 2337      | 9.40
Christopher Meek         | 2     | 17702     | 48.06
David Maxwell Chickering | 3     | 24625     | 29.52
David Heckerman          | 4     | 69511     | 419.21