Abstract |
---|
Much of the current research in learning Bayesian Networks fails to effectively deal with missing data. Most of the methods assume that the data is complete, or make the data complete using fairly ad-hoc methods; other methods do deal with missing data but learn only the conditional probabilities, assuming that the structure is known. We present a principled approach to learn both the Bayesian network structure as well as the conditional probabilities from incomplete data. The proposed algorithm is an iterative method that uses a combination of Expectation-Maximization (EM) and Imputation techniques. Results are presented on synthetic data sets which show that the performance of the new algorithm is much better than ad-hoc methods for handling missing data. |
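The abstract's core idea, iterating an E-step that fills in expected counts for missing values and an M-step that re-estimates the conditional probabilities, can be illustrated with a minimal sketch. This is not the paper's algorithm, just a toy EM loop for a single conditional probability table P(Y=1 | X) in a two-node network X → Y where some values of Y are missing; the function name and data layout are illustrative assumptions.

```python
# Toy EM sketch (illustrative, not the paper's method): estimate
# P(Y=1 | X=x) for binary X -> Y from records where y may be None.
def em_estimate(data, iters=50):
    """data: list of (x, y) pairs, x in {0, 1}, y in {0, 1} or None."""
    p = {0: 0.5, 1: 0.5}  # initial guess for P(Y=1 | X=x)
    for _ in range(iters):
        # counts[x] = [expected count of y=0, expected count of y=1]
        counts = {0: [0.0, 0.0], 1: [0.0, 0.0]}
        # E-step: observed y adds a hard count; missing y adds
        # fractional counts weighted by the current estimate.
        for x, y in data:
            if y is None:
                counts[x][1] += p[x]
                counts[x][0] += 1.0 - p[x]
            else:
                counts[x][y] += 1.0
        # M-step: re-estimate each conditional probability from
        # the expected counts.
        for x in (0, 1):
            total = counts[x][0] + counts[x][1]
            if total > 0:
                p[x] = counts[x][1] / total
    return p
```

In this toy setting the fixed point is the maximum-likelihood estimate from the observed cases; the paper's contribution goes further, combining EM with imputation to search over network structures as well as parameters.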
Year | Venue | Keywords |
---|---|---|
1997 | AAAI/IAAI | bayesian network |
Field | DocType | Citations |
---|---|---|
Data mining, Conditional probability, Computer science, Iterative method, Bayesian average, Bayesian network, Artificial intelligence, Imputation (statistics), Bayesian statistics, Missing data, Synthetic data sets, Machine learning | Conference | 22 |

PageRank | References | Authors |
---|---|---|
2.96 | 6 | 1 |
Name | Order | Citations | PageRank |
---|---|---|---|
Moninder Singh | 1 | 381 | 105.12 |