Title: Learning Structure of Bayesian Networks by Using Possibilistic Upper Entropy
Abstract: The most common way to learn the structure of a Bayesian network is to use a score function together with an optimization process. When no prior knowledge about the structure is available, score functions based on information theory are used to balance the entropy of the conditional probability tables against the complexity of the network. This complexity has a high impact on the uncertainty about the estimation of the conditional distributions. However, the complexity is estimated independently of the computation of the entropy and thus does not faithfully reflect the estimation uncertainty. In this paper we propose a new entropy function, the "possibilistic upper entropy", which relies on the entropy of a possibility distribution that encodes an upper bound of the estimated frequencies. Since the network structure directly affects the number of pieces of data available for estimating each probability, the possibilistic upper entropy is of direct interest for learning the structure of the network. We also show that the possibilistic upper entropy can be used to obtain an incremental algorithm for the online learning of Bayesian networks.
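The core idea in the abstract can be illustrated with a small sketch. This is not the construction from the paper itself: it assumes the standard "optimal" probability-to-possibility transform, uses the Higashi-Klir U-uncertainty as a possibilistic entropy measure, and models sampling uncertainty with a Hoeffding-style margin that shrinks as the sample size grows. The function names (`prob_to_poss`, `u_uncertainty`, `upper_entropy`) and the widening step are illustrative assumptions, chosen to show how fewer data points lead to a wider (less informative) possibility distribution and hence a larger upper entropy.

```python
import math

def prob_to_poss(probs):
    # Optimal probability-to-possibility transform:
    # pi_i = sum of all p_j such that p_j <= p_i (ties included).
    return [sum(q for q in probs if q <= p) for p in probs]

def u_uncertainty(poss):
    # Higashi-Klir U-uncertainty of a normalized possibility distribution:
    # sort degrees in decreasing order, append 0, and sum (pi_i - pi_{i+1}) * log2(i).
    pi = sorted(poss, reverse=True) + [0.0]
    return sum((pi[i] - pi[i + 1]) * math.log2(i + 1) for i in range(len(poss)))

def upper_entropy(counts, delta=0.05):
    # Illustrative "upper entropy" (assumption, not the paper's construction):
    # widen each possibility degree by a Hoeffding-style sampling margin
    # before measuring the uncertainty. The margin shrinks as n grows.
    n = sum(counts)
    probs = [c / n for c in counts]
    eps = math.sqrt(math.log(2 / delta) / (2 * n))
    poss = [min(1.0, x + eps) for x in prob_to_poss(probs)]
    return u_uncertainty(poss)

# Same empirical frequencies, but fewer samples give wider possibility
# bounds and therefore a larger upper entropy.
print(upper_entropy([8, 1, 1]))     # n = 10
print(upper_entropy([80, 10, 10]))  # n = 100
```

In a structure-learning score, such a sample-size-sensitive entropy would penalize a candidate parent set automatically: adding parents fragments the data into smaller conditional samples, which inflates the upper entropy instead of requiring a separate complexity term.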
Year: 2014
DOI: 10.1007/978-3-319-10765-3_11
Venue: STRENGTHENING LINKS BETWEEN DATA ANALYSIS AND SOFT COMPUTING
Field: Information theory, Discrete mathematics, Mathematical optimization, Network complexity, Conditional probability distribution, Conditional probability, Computer science, Upper and lower bounds, Algorithm, Binary entropy function, Bayesian network, Score
DocType: Conference
Volume: 315
ISSN: 2194-5357
Citations: 0
PageRank: 0.34
References: 5
Authors: 2
Name               Order  Citations  PageRank
Mathieu Serrurier  1      267        26.94
Henri Prade        2      10549      1445.02