Abstract |
---|
Bayesian decision trees rest on the formal assumption that unconnected nodes are conditionally independent given the states of their parent nodes. This assumption does not necessarily hold in practice and may lead to a loss of accuracy. We propose a methodology in which naïve Bayesian networks are adapted by the addition of hidden nodes to model data dependencies more accurately. We evaluated the methodology in a computer vision application that classifies and counts neural cells automatically. Our results show that a modified network with two hidden nodes achieved significantly better performance, with an average prediction accuracy of 83.9% compared to 59.31% for the original network. |
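The core idea summarized in the abstract can be sketched as follows: a naïve Bayesian network factorizes the joint as a product of per-feature conditionals given the class, while a hidden node interposed between the class and the features can capture a dependency between them. This is a minimal illustrative sketch with made-up binary variables and toy probability tables; it is not the paper's actual network or data.

```python
# Toy conditional probability tables (illustrative values only, not from the paper).
p_class = {0: 0.5, 1: 0.5}
p_f1_given_class = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}
p_f2_given_class = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.1, 1: 0.9}}

def naive_bayes_joint(c, f1, f2):
    # Naive assumption: P(c, f1, f2) = P(c) * P(f1|c) * P(f2|c),
    # i.e. the features are conditionally independent given the class.
    return p_class[c] * p_f1_given_class[c][f1] * p_f2_given_class[c][f2]

# A hidden node h between the class and the features: the features are now
# independent only given h, so marginalizing over h lets the model represent
# a correlation between f1 and f2 that naive Bayes cannot.
p_h_given_class = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}
p_f1_given_h = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}
p_f2_given_h = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.1, 1: 0.9}}

def hidden_node_joint(c, f1, f2):
    # P(c, f1, f2) = P(c) * sum_h P(h|c) * P(f1|h) * P(f2|h)
    return p_class[c] * sum(
        p_h_given_class[c][h] * p_f1_given_h[h][f1] * p_f2_given_h[h][f2]
        for h in (0, 1)
    )

def classify(f1, f2, joint):
    # MAP classification under whichever joint model is supplied.
    return max((0, 1), key=lambda c: joint(c, f1, f2))
```

In the paper's setting the hidden nodes' parameters would be learned from data (the Field entry below mentions gradient descent); here they are fixed by hand purely to show the factorization.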
Year | DOI | Venue |
---|---|---|
2002 | 10.1007/3-540-45683-X_42 | PRICAI |
Keywords | Field | DocType
---|---|---|
recognise neural cell morphology, hidden nodes, bayesian network, bayesian networks, hidden node, computer vision application, data dependency, formal assumption, average prediction accuracy, better performance, bayesian decision tree, original network, modified network, computer vision, decision tree, conditional independence, cell morphology | Cell morphology, Data modeling, Decision tree, Gradient descent, Naive Bayes classifier, Pattern recognition, Conditional independence, Computer science, Bayesian network, Artificial intelligence, Machine learning, Bayesian probability | Conference |
ISBN | Citations | PageRank
---|---|---|
3-540-44038-0 | 0 | 0.34 |
References | Authors
---|---|
5 | 2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Jung-wook Bang | 1 | 0 | 0.68 |
Duncan Fyfe Gillies | 2 | 97 | 17.86 |